Exascale computing unlocks new avenues for science

Researchers have big plans for Argonne National Laboratory’s future exascale supercomputer, Aurora.

Breakthroughs in research and the availability of new technologies to empower them often walk hand in hand. When the Aurora exascale computing system comes online at the Department of Energy’s Argonne National Laboratory in 2021, scientists will gain the capability to take on projects of unprecedented scale, scope and complexity. Leaders in the scientific community eagerly await their time on Aurora, which will offer them performance exceeding a billion-billion (10^18) calculations per second. Compute performance at this scale not only shortens the time to insight for scientists grappling with exceedingly difficult problems, but also makes it possible to combine workloads in ways that were not feasible before.
Rick Stevens, the associate laboratory director for computing, environment and life sciences at Argonne, expressed his enthusiasm for exascale’s potential. In his view, Aurora will change the nature of the scientific process by letting different computational approaches converge in a single environment.
“What excites me most about exascale systems like Aurora is the fact that we now have, in one platform and one environment, the ability to mix simulation and artificial intelligence,” said Stevens. “This idea of mixing simulation and data-intensive science will give us an unprecedented capability and open doors in research which were inaccessible before.”
Preparing for exascale
A system capable of exascale performance requires the most advanced hardware on the planet. Integrated by Cray, the system will incorporate a future generation of Intel Xeon Scalable processors, next-generation Intel Optane DC Persistent Memory, and future Intel Xe technologies. Trish Damkroger, vice president and general manager for the Technical Computing Initiative (TCI) at Intel, previewed the innovative science exascale computing will enable:
“With an exascale system like Aurora, we will have the compute capacity to realistically simulate precision medicine, weather, materials science, and so much more,” she said.
The Early Science Program
Researchers from a wide range of scientific disciplines, including engineering, biology, chemistry and physics, are queuing up for pre-production time on the exascale system through the Argonne Leadership Computing Facility’s Aurora Early Science Program (ESP), which is designed to prepare key applications for the architecture and scale of the supercomputer. The following summaries provide a snapshot of how some of the ESP projects will use Aurora’s exascale capabilities to advance the forefront of science.
Targeting new cancer treatments
While Stevens oversees several aspects of Aurora’s deployment, he also plays a very different role as the lead of the CANcer Distributed Learning Environment (CANDLE) project. CANDLE seeks new and targeted approaches to cancer diagnosis and treatment by tapping exascale computing power.
“We need the capability to predict what a complex cancer cell is going to do when exposed to a drug,” he said. “To do that, we must acquire more high-quality data to gain a greater understanding of the biology behind the process. Machine learning methods must integrate many, many sources of data to overcome that hurdle.”
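Stevens’ point about integrating many data sources can be made concrete with a toy example. The sketch below is not CANDLE code; it uses synthetic stand-ins for tumor gene-expression profiles and drug descriptors, but it shows the basic pattern of joining two data sources into one input so a supervised model can learn to predict a drug response.

```python
# Illustrative sketch only -- not CANDLE code. Two synthetic data sources
# (tumor gene-expression features and drug descriptors) are combined to
# predict a drug-response value.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_samples = 2000

gene_expression = rng.normal(size=(n_samples, 200))  # per-tumor expression profile
drug_descriptors = rng.normal(size=(n_samples, 50))  # per-drug chemical descriptors

# Concatenate the two sources into one feature matrix per (tumor, drug) pair.
X = np.hstack([gene_expression, drug_descriptors])
# Synthetic response: a few genes and descriptors drive the outcome, plus noise.
y = (gene_expression[:, :5].sum(axis=1)
     - drug_descriptors[:, :3].sum(axis=1)
     + rng.normal(scale=0.5, size=n_samples))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print("held-out R^2:", round(r2_score(y_test, model.predict(X_test)), 3))
```

The real project works with far larger and messier datasets and many more data sources, which is where exascale-class training capacity comes in.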
Unlocking the mysteries of the brain
Nicola Ferrier, a senior computer scientist at Argonne, is partnering with researchers from the University of Chicago, Harvard University, Princeton University, and Google to better understand how the human brain develops, learns and declines due to disease. With Aurora, Ferrier plans to image the structure of the brain down to the minute details of brain cell (neuron) connections. Through that process, she and her team hope to find new approaches to understanding both normal and abnormal brains.
Achieving that goal involves a much deeper understanding of the “normal” brain state. Ferrier said a major obstacle in her research is the sheer magnitude of the datasets involved.
“The big challenge we face is not just obtaining data but managing the sheer volume of it. For example, one cubic centimeter of brain tissue may sound tiny, but analysis of the imagery from that small sample can generate petabytes of data. A teeny sample like that, though, does not give us the big-picture understanding we want,” she added. “If we try to compare two entire brains or multiple brains, that’s a monumental challenge involving exabytes of data.”
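Ferrier’s numbers are easy to sanity-check with back-of-envelope arithmetic. The sketch below uses assumed imaging parameters; the voxel dimensions and bytes per voxel are illustrative guesses, not the project’s actual acquisition settings, but they show how quickly volumetric microscopy data grows with sample size.

```python
# Back-of-envelope estimate of raw image volume for volumetric brain imaging.
# Voxel dimensions and bytes per voxel are illustrative assumptions, not the
# actual acquisition parameters of Ferrier's project.

def raw_image_bytes(sample_cm3, voxel_nm=(10, 10, 30), bytes_per_voxel=1):
    """Raw bytes needed to store an imaged sample of `sample_cm3` cubic centimeters."""
    nm_per_cm = 1e7  # 1 cm = 10^7 nm
    voxel_volume_nm3 = voxel_nm[0] * voxel_nm[1] * voxel_nm[2]
    sample_volume_nm3 = sample_cm3 * nm_per_cm ** 3
    return sample_volume_nm3 / voxel_volume_nm3 * bytes_per_voxel

for label, cm3 in [("1 mm^3", 1e-3), ("1 cm^3", 1.0), ("whole brain (~1,200 cm^3)", 1200.0)]:
    petabytes = raw_image_bytes(cm3) / 1e15
    print(f"{label}: ~{petabytes:,.1f} PB raw")
```

At these assumed settings the estimates land in the same petabyte-per-cubic-centimeter to exabyte-per-brain range Ferrier describes, which is why the analysis pipeline, not just the microscope, needs a machine like Aurora.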
Building safer and more efficient aircraft
Modern aircraft design is a multifaceted process. Even modeling “normal” airflow around an airplane is an extremely challenging simulation and modeling endeavor. That complexity compounds when considering factors like turbulence, air vortices or emergencies that can affect an aircraft in flight.
Kenneth Jansen, professor of aerospace engineering at the University of Colorado, Boulder, seeks Aurora’s help in developing improved predictive models.
“Exascale computing power can resolve more complex turbulent scales, so we can provide engineers a better predictive capacity for unique flow conditions like when a rudder must compensate for a failed engine. Exascale computing also empowers us to do many lower-fidelity calculations quickly,” he said. “This process is especially important when we consider things like wing thickness, where to place flow control devices, and more. By doing thousands of these smaller-scale simulations, we can more efficiently impact an aircraft design in positive ways.”
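The “many lower-fidelity calculations” Jansen describes amount to a parallel sweep over a design space. A minimal sketch of that pattern follows, with a cheap stand-in objective in place of a real flow solver; the parameters (wing thickness and a flow-control setting) and the objective are purely illustrative.

```python
# Minimal sketch of a design-space sweep: many cheap evaluations run in parallel.
# "evaluate_design" is a stand-in for a real low-fidelity flow solver; the
# parameters and objective are illustrative, not an actual aerodynamic model.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def evaluate_design(params):
    thickness, flap_setting = params
    # Placeholder objective: a real sweep would launch a solver run here.
    drag_proxy = (thickness - 0.12) ** 2 + 0.3 * (flap_setting - 0.5) ** 2
    return params, drag_proxy

if __name__ == "__main__":
    thicknesses = [0.08 + 0.005 * i for i in range(20)]
    flap_settings = [0.1 * i for i in range(11)]
    designs = list(product(thicknesses, flap_settings))

    with ProcessPoolExecutor() as pool:
        results = list(pool.map(evaluate_design, designs))

    best_params, best_value = min(results, key=lambda r: r[1])
    print(f"best of {len(designs)} designs: {best_params} -> {best_value:.4f}")
```

On a machine like Aurora, each evaluation would itself be a substantial simulation, and the sweep would cover thousands of candidate designs rather than a few hundred.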
Developing clean and plentiful energy
William Tang, principal research physicist at the Princeton Plasma Physics Laboratory, will use Aurora’s exascale capabilities to seek new ways to confine fusion reactions to generate electricity. Fusion, the reaction that fuels our sun and most stars, promises clean energy on Earth on an enormous scale. However, capturing and controlling that reaction requires innovative approaches. The incredibly high temperatures involved in fusion reactions would destroy physical barriers made of conventional materials, so Tang and his team pursue magnetic confinement in fusion devices called tokamaks.
“We have invested a lot in the effort to deliver clean fusion energy through magnetic confinement methods,” said Tang. “However, there are many barriers to overcome. One major challenge is making quick and accurate predictions regarding so-called disruptive events, which allow the superhot plasma that fuels fusion reactions to escape quickly. Supervised machine learning can help us predict when that will happen and plan to control it.”
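Supervised learning for disruption prediction can be framed as a classification problem over windows of diagnostic signals. The sketch below is not the project’s code; it uses synthetic time-series windows labeled by whether a disruption follows and trains an off-the-shelf classifier, simply to illustrate that framing.

```python
# Illustrative sketch of supervised disruption prediction -- not the project's
# actual code. Synthetic windows of "diagnostic" signals are labeled by whether
# a disruption follows, and a classifier learns to flag them in advance.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(1)
n_windows, window_len = 4000, 64

# Synthetic stand-in signals; windows that precede a disruption get a rising trend.
signals = rng.normal(size=(n_windows, window_len))
labels = rng.integers(0, 2, size=n_windows)  # 1 = disruption follows this window
signals[labels == 1] += np.linspace(0.0, 1.5, window_len)

# Simple per-window summary features: mean, spread, and end-to-end slope.
slope = signals[:, -8:].mean(axis=1) - signals[:, :8].mean(axis=1)
features = np.column_stack([signals.mean(axis=1), signals.std(axis=1), slope])

X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

The real problem involves many correlated diagnostics and a hard requirement to warn far enough in advance to act, which is where large-scale training on a machine like Aurora comes in.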
Carbon-free fusion will offer advantages not found in today’s nuclear power plants, which split rather than fuse atoms. First, fusion reactors can operate using a common hydrogen isotope, deuterium, obtained from ordinary seawater, along with its sister isotope tritium. Second, because the reactors hold less than a minute’s worth of fuel during operation, the systems cannot experience an explosion or “meltdown” – the fusion reaction simply fizzles out. Finally, the fusion process in these reactors produces no long-term radioactive waste that must be sealed away for millennia to avoid environmental contamination.
Software tools for catalytic chemistry
David Bross, an assistant computational chemist at Argonne, is preparing to use Aurora’s enhanced capabilities to advance catalytic chemistry. Virtually all chemical processes carried out in industry today involve catalysts, which come in many forms: nanoparticles, species dissolved in a solvent, metals like platinum, and proteins known as enzymes. Well-designed catalysts accelerate chemical reactions to produce desired products or even slow down undesired processes.
Bross is partnering with researchers across the U.S. to develop software tools that will enable the design of new catalysts to improve industrial processes, reduce waste, and decrease energy usage.
“Developing new industrially valuable catalysts involves a molecular-level understanding of how they work,” he added. “Once we have a complete description of the thermochemistry and reaction kinetics underlying all those mechanisms, we can use that knowledge to tailor and engineer catalysts which can advance many technologies that we use.”
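One small piece of the “thermochemistry and reaction kinetics” Bross mentions is the Arrhenius relation, which converts an activation energy into a temperature-dependent rate constant. The sketch below uses illustrative values for the pre-exponential factor and activation energy rather than data from the project; it simply shows why lowering a reaction barrier, which is what a good catalyst does, changes reaction rates so dramatically.

```python
# Arrhenius rate constant: k = A * exp(-Ea / (R * T)).
# The A and Ea values below are illustrative, not results from Bross's project.
import math

R = 8.314  # gas constant, J/(mol*K)

def rate_constant(A, Ea_kj_per_mol, T_kelvin):
    """Rate constant from pre-exponential factor A and activation energy Ea."""
    return A * math.exp(-Ea_kj_per_mol * 1e3 / (R * T_kelvin))

A = 1e13   # 1/s, a typical order of magnitude for a unimolecular step
Ea = 80.0  # kJ/mol, hypothetical barrier

for T in (300, 400, 500, 600):
    print(f"T = {T} K: k = {rate_constant(A, Ea, T):.3e} 1/s")

# Lowering the barrier by 10 kJ/mol speeds the reaction by roughly 50x near 300 K.
print(rate_constant(A, 70.0, 300) / rate_constant(A, 80.0, 300))
```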
Simulating millions of catalyst-driven reactions to identify the most capable handful of catalysts necessitates computing power at the exascale level.
“Exascale computing systems like Aurora will give the next generation of researchers new tools to solve the truly grand challenges facing the world today. We live in an inspiring time in computing history,” said Damkroger.
Rob Johnson spent much of his professional career consulting for a Fortune 25 technology company. Currently, Rob owns Fine Tuning, LLC, a strategic marketing and communications consulting company based in Portland, Oregon. As a technology, audio, and gadget enthusiast his entire life, Rob also writes for TONEAudio Magazine, reviewing high-end home audio equipment.
This article was produced as part of Intel’s HPC editorial program, with the goal of highlighting cutting-edge science, research and innovation driven by the HPC community through advanced technology.