Sponsored by Red Hat

Edge computing enables NOAA to push workloads closer to public consumers, not just field researchers

The National Oceanic and Atmospheric Administration operates in mission spaces that extend from the bottom of the ocean to beyond Earth’s atmosphere. But it is, first and foremost, a data organization. It deploys sensors in all those varied environments to gather data that can then be analyzed and packaged into products that help protect citizens and industries from the volatility of the weather. Now, edge computing is making it possible not only to conduct more analysis of that data in the field, yielding more immediate insights, but also to better engage the community of stakeholders across academia, industry and the general public who are the primary consumers of NOAA’s data.

Most agencies whose primary mission revolves around the collection of data, such as the Intelligence Community or certain healthcare agencies, reserve that data for internal government consumption due to privacy or national security considerations, publishing only analysis and guidance. But NOAA is the opposite. All of the data it collects is intended for public products, and just as much analysis happens outside the agency as within. In essence, NOAA has two centers of data gravity: one at the sensors, where the data is collected, and one in the public sphere, where it is consumed.

And it’s looking to edge computing to push workloads in both directions.

That’s why the agency is moving toward a community-based modeling approach. Frank Indiviglio, NOAA’s deputy director for High Performance Computing & Communications (HPCC), said conversations are underway about how NOAA can containerize its climate models and push the models themselves out so the public can understand, build upon and tweak them.

“Those tool sets and software environments, that’s going to drive innovation,” he said. “Hardware enables innovation, but software, I think, is really the layer that we’re going to need to focus on and push, not only to modernize, but to get those tools out there in the public sphere so that we can all get better together, for lack of a better term.”
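
As a rough illustration of what that community approach could look like in practice, the sketch below shows how an outside researcher might run a published, containerized model image with their own configuration. The image name, mount paths and config file here are hypothetical, and Podman is used purely as an example container runtime; this is not a description of NOAA’s actual distribution workflow.

```python
import subprocess

# Hypothetical image name -- illustrative only, not an actual NOAA registry.
IMAGE = "registry.example.gov/noaa/climate-model:latest"

def run_model(config_path: str, output_dir: str) -> None:
    """Run a containerized model with a user-supplied configuration.

    The idea: the model's full software environment ships in the image,
    while the scientific "tweaks" live in a config file the user controls.
    """
    subprocess.run(
        [
            "podman", "run", "--rm",
            "-v", f"{config_path}:/model/config.yaml:ro",  # user-modified settings
            "-v", f"{output_dir}:/model/output",           # results land on the host
            IMAGE,
        ],
        check=True,
    )

if __name__ == "__main__":
    run_model("my_experiment.yaml", "./results")
```

The separation is the point: the community doesn’t have to rebuild NOAA’s software stack to experiment with a model, only to change the inputs and rerun it.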

For example, Indiviglio said that as hardware platforms get smaller, edge computing is enabling the use of what are essentially supercomputers in places he couldn’t even have conceived of ten years ago. Artificial intelligence is enabling NOAA Fisheries to do genomic sequencing in the field and identify fish species in the water without a diver. AI is also improving the accuracy of forecasts, leading to a better product and freeing up people to do more science. Autonomous platforms are increasing the amount of data NOAA can gather from inside hurricanes while reducing the need for people to put themselves in harm’s way by flying into them.

And that requires a heavy focus on the data itself, and how it’s transmitted and disseminated.

“It’s integrity,” Indiviglio said. “We want to make sure that the data that we disseminate to the public sphere is genuine data that was produced by our science and that people can trust. What you don’t want is either data that gets ingested or put out into the public that has some kind of a question mark on its integrity.”

One challenge is the number of system boundaries that NOAA’s data has to cross. Indiviglio said it goes from the observation system, where it’s collected by the sensor, to the HPCC system, where it’s processed, analyzed and becomes part of a forecast. Then it moves into data distribution. Moving data across all those systems while maintaining integrity is a challenge that he said requires the modernization of platforms. But he also said NOAA benefits from a long history of prioritizing data fundamentals and interoperability, which has left the agency well positioned not only to streamline those processes, but to begin layering AI on top of them.
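
One common way to preserve integrity across boundaries like these is to record cryptographic checksums where the data originates and re-verify them at every subsequent hop. The sketch below illustrates the idea with SHA-256 digests; the manifest format and file layout are hypothetical, not a description of NOAA’s actual pipeline.

```python
import hashlib
import json
from pathlib import Path

def publish_manifest(data_files: list[Path], manifest_path: Path) -> None:
    """Record a SHA-256 digest for each file at the point of origin.

    Downstream systems (HPC processing, public distribution) can recompute
    the digests after every transfer and compare them against this manifest.
    """
    manifest = {
        f.name: hashlib.sha256(f.read_bytes()).hexdigest() for f in data_files
    }
    manifest_path.write_text(json.dumps(manifest, indent=2))

def verify_manifest(data_dir: Path, manifest_path: Path) -> bool:
    """Return True only if every file still matches its recorded digest."""
    manifest = json.loads(manifest_path.read_text())
    return all(
        hashlib.sha256((data_dir / name).read_bytes()).hexdigest() == digest
        for name, digest in manifest.items()
    )
```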

That’s why the cloud was such a good fit for NOAA. The agency has always had a mobile workforce, with end users both on ships in the Arctic and at stations in the Antarctic, and ensuring the ability to securely transfer data from those locations was paramount. Now the agency is leveraging the cloud to provide more flexibility around workloads. NOAA’s supercomputer programs have continued to grow, far beyond what the agency can supply with its own hardware alone. The cloud allows the agency to provide that flexible capacity to those remote workstations and cut down the weeks and months of latency that used to be commonplace. The workloads are able to gravitate toward the mass of the data.

“You can get compute to people who are remote and it would be kind of limited in the past. We can build environments that go to them,” Indiviglio said. “And we can, now with technology, we can get a lot more data and take a lot of people out of harm’s way to get that data. So it’s kind of a win-win on both sides, right? So you can do more analysis in the field, you can certainly process more in the field, but you can also get a lot more [data], which is a good thing.”

 
