There is no computing environment more edge than the International Space Station. It currently takes weeks or even months to send the massive amounts of data produced by research on the ISS down to Earth for analysis and processing. NASA is currently gearing up for a return to the moon, to eventually be followed by a manned mission to Mars, and the latency of data transmission on those missions will be – pardon the pun – astronomical. So it’s strategically important to begin experimenting with edge computing in space now.
Toward that end, several vendors came together to create an edge computing solution that helps astronauts currently on the ISS conduct genetic research. Astronauts identify and study microbes in the air on the ISS to help prepare for future missions, but until now, all they could really do was collect data and send it back to Earth.
The analytical code itself sits in containers that can be pushed to the ISS as needed. Astronauts can then run the analysis themselves, getting real-time results while also sharing them with experts on the ground.
And that’s just one use case among many for edge computing. Smart city efforts are working to control traffic lights based on real-time patterns, easing congestion and commutes. Doctors are experimenting with remote diagnoses and even remote surgery, which have applications in environments ranging from the current pandemic to battlefields. The U.S. Geological Survey has soil sensors that detect chemicals in the ground and transmit that data for analysis.
One thing all those use cases have in common is that the faster people in the field can get that data analyzed, the better decisions they can make, and the easier everyone’s lives become.
“That’s what the Internet of Things’ purpose was in the beginning. It’s to make people’s lives easier, to make things that usually take a long time or are inconvenient, seamless and usable,” Anne Dalton, data science and edge computing solutions specialist at Red Hat, said. “And so that’s exactly what edge computing is doing. It’s taking the lessons learned from IoT and learning how to integrate technology itself into the IoT device, which is kind of a novel flip on that story.”
The problem is that everyone is used to developing in an enterprise data center, that traditional cloud environment. But the closer you get to the edge, the less infrastructure you have to build on. The Defense Department typifies that problem with one common question: “What can you fit on the back of a Humvee?” There are many good answers to that question, but an entire data center is not one of them.
“So what we’re actively doing, as you get closer to the device edge, we’re making that footprint smaller and smaller and smaller, so that you can run the same type of information or the same type of analyses locally, but you’re not having to store all of that data at the edge,” Dalton said. “You can run and you can process it, you store what’s necessary. And then imagine like how you plug your phone in at night, and it goes through the update. So then you can kind of take that information, and you can put it back into your cloud environment or your core data center. But you don’t have to have that huge footprint, when the device is often really small.”
In that way, it’s very much like the cloud computing version of RAM versus storage in a desktop computer. There always has to be somewhere to offload the data, because otherwise all you’re doing is building data centers closer to the edge. But by keeping the footprint small, you enable the analysis to be quick and as close as possible to the end users who need it. It’s a new way to interact with and use the cloud.
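The pattern Dalton describes can be sketched in a few lines of code. The sketch below is purely illustrative, not any vendor's implementation: an edge node analyzes readings locally, keeps only the small derived results, and later pushes them upstream over whatever uplink is available, the way a phone syncs overnight. All names here (`EdgeNode`, `ingest`, `sync`) are hypothetical.

```python
from collections import deque

class EdgeNode:
    """Illustrative edge node: process locally, store only what's
    necessary, sync summaries upstream when a link is available."""

    def __init__(self, capacity=100):
        # A small bounded buffer stands in for the limited
        # footprint available at the device edge.
        self.buffer = deque(maxlen=capacity)

    def ingest(self, reading):
        # Analyze raw data locally and keep only the derived
        # result, not the full reading.
        result = {"mean": sum(reading) / len(reading), "peak": max(reading)}
        self.buffer.append(result)
        return result

    def sync(self, uplink):
        # Push buffered results back to the core data center,
        # freeing local storage as each item is sent.
        while self.buffer:
            uplink(self.buffer.popleft())

node = EdgeNode()
node.ingest([1.0, 2.0, 3.0])  # returns {"mean": 2.0, "peak": 3.0}
received = []
node.sync(received.append)    # "uplink" here is just a local list
print(received)
```

The point of the sketch is the division of labor: real-time answers come from the local `ingest` step, while the heavyweight storage and long-term analysis happen wherever `sync` delivers the data.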
The first thing agencies should do, Dalton said, is to examine the problems they’re trying to solve. Many have edge computing needs, but they don’t always call them that. They think of these problems in terms like “remote office” or “autonomous vehicle.”
“If they can answer that question and say, ‘Yes, we definitely think that this is something where we need to do this closer to where the information is,’ I think the first thing to do is start having those conversations and start engaging with the teams that they’re working with,” Dalton said. “And the integrators or the vendors like Red Hat, where they can say, ‘We think we’re having this issue, can you help?’ And that’s when we can kind of come in and take a look at their environment and take a look at what modernization looks like for them and what moving to something like the edge would be like and how it might help them.”