LANL is building computer systems to take on the “unsolved problems,” according to the leader of the High Performance Computing division.
Experts in the High Performance Computing division at Los Alamos National Laboratory are working at the edge of some of the biggest “big data” challenges in the world.
The problems LANL works on are not run-of-the-mill data analysis: Climate change and nuclear reactions are just some of the issues that scientists at the lab analyze using complex simulations and massive supercomputers. Gary Grider, the leader of the HPC division at Los Alamos, said the lab is designing computing systems for “unsolved problems.”
“We’re working in a fair amount of areas — memory bandwidth and smart memory control mechanisms, things like that — to be able to build machines in the future that enable us to solve problems that we can’t contemplate solving today,” Grider said in an interview.
“We have people where that’s what they do,” he continued. “They contemplate, how can this technology and that technology in this timeframe come together to be able to finally solve this problem we’ve been waiting for 20 years to solve?”
The data management requirements at Los Alamos are driven by those enormous computational challenges, meaning the lab’s primary datasets stay on its purpose-built supercomputers.
“We build networks that are parallel enough to move data many terabytes a second, but moving it to and from the cloud or within the cloud would be very expensive,” Grider said. “So for that major workload, we don’t really consider cloud to be a thing.”
“However, we steal from the cloud technology base all the time,” he added.
For instance, Los Alamos’ “Charliecloud” project lets laboratory researchers build Linux software containers on their own, so they can securely build and run their own scientific applications without intervention from HPC staff.
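A minimal sketch of what that unprivileged build-and-run cycle can look like is below, driven from Python. It assumes Charliecloud’s command-line tools are installed; the image tag, Dockerfile context and application path are hypothetical placeholders, not the lab’s actual configuration.

```python
# Hypothetical sketch of an unprivileged Charliecloud build-and-run cycle.
# Assumes the Charliecloud tools (ch-image, ch-convert, ch-run) are on PATH;
# the image tag, context directory, and paths below are placeholders.
import subprocess

IMAGE = "my-sim"               # hypothetical image tag
CONTEXT = "./my-sim"           # directory containing a Dockerfile
UNPACKED = "/var/tmp/my-sim"   # where the flattened image directory will live

def run(cmd):
    """Echo and execute a command, failing loudly on errors."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Build the container image from a Dockerfile, with no root privileges.
run(["ch-image", "build", "-t", IMAGE, CONTEXT])

# Flatten the built image into a plain directory that ch-run can execute from.
run(["ch-convert", IMAGE, UNPACKED])

# Run the researcher's own application inside the container.
run(["ch-run", UNPACKED, "--", "python3", "/opt/app/simulate.py"])
```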
Los Alamos also makes use of Kubernetes, an open-source system for automating deployment, scaling and management of containerized applications. And it has developed modified cloud-style erasure-coding technology to help manage its own complex datasets, Grider said.
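Erasure coding is the idea behind that kind of resilient storage: data is split into blocks plus redundant parity blocks, so a lost block can be rebuilt from the survivors. The toy sketch below uses a single XOR parity block to show the concept; production systems, including whatever the lab actually runs, use far stronger codes (such as Reed-Solomon) that tolerate multiple failures.

```python
# Toy illustration of erasure coding: k data blocks plus one XOR parity
# block, so any single lost block can be reconstructed from the rest.
from functools import reduce

def xor_blocks(blocks):
    """XOR a list of equal-length byte blocks together."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

def encode(data, k):
    """Split data into k equal blocks and append one XOR parity block."""
    size = -(-len(data) // k)                 # ceiling division
    padded = data.ljust(k * size, b"\0")      # pad so blocks align
    blocks = [padded[i * size:(i + 1) * size] for i in range(k)]
    return blocks + [xor_blocks(blocks)]      # k data blocks + 1 parity

def recover(blocks, lost_index):
    """Rebuild the block at lost_index by XOR-ing all surviving blocks."""
    survivors = [b for i, b in enumerate(blocks) if i != lost_index]
    return xor_blocks(survivors)

original = b"simulation checkpoint bytes"
stored = encode(original, k=4)
rebuilt = recover(stored, lost_index=2)       # pretend block 2's device died
assert rebuilt == stored[2]
print("recovered block:", rebuilt)
```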
The lab also uses the cloud to access outside technologies, such as the burgeoning world of quantum computing systems.
“Most everybody that has quantum access today gets it in the cloud,” Grider said.
Los Alamos additionally works with customer agencies ranging from the Defense Department to the National Institutes of Health. Many of those agencies want Los Alamos to work with data that’s already in the cloud.
“So we use cloud in that space,” Grider said. “And that runs the gamut from completely open cloud to secure clouds for DoD and other places.”
Moving forward, Los Alamos is exploring computational storage technologies, in which more data-crunching is done on the storage device itself, saving the time and energy it takes to move huge volumes of data from the device to a supercomputer.
“With our data, the gravity is so huge, that we don’t even want to move away from the storage devices themselves,” Grider said. “If you have an exabyte out there, it would take you a month just to move that data. And if you can move three orders of magnitude less, we’re talking about an hour. That’s a big difference when you’re a scientist, and you’re looking for an answer.”
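The arithmetic behind that “month versus an hour” comparison is straightforward: transfer time scales linearly with data volume, so reducing the data by three orders of magnitude before it leaves the storage devices cuts the wait by the same factor. The sketch below assumes a sustained aggregate bandwidth of roughly 0.4 terabytes per second, chosen only so that an exabyte works out to about a month; it is not a published lab figure.

```python
# Back-of-the-envelope: time to move data at a fixed aggregate bandwidth.
# The ~0.4 TB/s rate is an assumption chosen so 1 EB takes roughly a month;
# it is not an actual LANL number.
BANDWIDTH_BYTES_PER_S = 0.4e12      # assumed sustained rate, ~0.4 TB/s

def transfer_hours(num_bytes):
    """Hours needed to move num_bytes at the assumed bandwidth."""
    return num_bytes / BANDWIDTH_BYTES_PER_S / 3600

exabyte = 1e18
print(f"1 EB: {transfer_hours(exabyte) / 24:.0f} days")      # ~29 days
print(f"1 PB: {transfer_hours(exabyte / 1000):.1f} hours")   # ~0.7 hours
```

At that scale, doing the reduction on the storage devices and moving only the result is the difference between waiting weeks for an answer and getting one in under an hour.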