As difficult as it’s been for the Defense Department to make widespread use of commercial cloud services for its business and enterprise IT needs over the past decade, delivering cloud services to warfighters at the tactical edge is even more complex. But the Army may be on the verge of doing so, and it’s determined to do it with a multi-cloud approach.
The service has already begun a series of tactical cloud pilots to help demonstrate how to build a tactical cloud architecture and begin migrating the applications soldiers use at command posts into a distributed cloud environment.
By the end of this fiscal year, officials hope to have refactored their mission command applications to work in a hybrid cloud environment, and to have at least demonstrated the capability to continuously update them via a DevSecOps pipeline. By a year later, the hope is to orchestrate multiple commercial clouds and private tactical clouds together to start to deliver battlefield intelligence and data fusion in ways that have never been done before.
If the experiments prove successful, they could significantly change the way the Army thinks about collecting and using data in tactical environments, said Ken Lorentzen, the product lead for the Mission Command Support Center within the Army’s Program Executive Office for Command, Control and Communications-Tactical (PEO C3T).
“Right now, a lot of those command posts aren’t persistent. If they’re a maneuver unit and they’re going to operate for a day or a few hours, they’ll shut down and move someplace else and come back up,” he said during a keynote session during Federal News Network’s DoD Cloud Exchange. “But now we can start to look at the way our command post computing environment can lean into persistent computing in the cloud … now we can go to the cloud and put a master node up there and do all of our data fusion, and then push it down, along with analytics that we never were able to do, because we were starved of information locally, or a unit that’s trying to drive around in the middle of the woods or the desert just didn’t have the time or storage and computing to do it locally. Now we can start looking at these things with a bigger appetite, and start tackling some of these problems that we wouldn’t have considered otherwise.”
But for now, how big that appetite should be is far from clear. Even though cloud hosting facilities based in the U.S. could offer essentially limitless computing and storage power, the Army will still need to rely, to varying degrees depending on the situation, on smaller tactical edge nodes and regional hubs. And soldiers will still need mission command and intelligence capabilities when they have little or no connectivity to the cloud, and limited space and power within their formations.
For those reasons, the Army needs to make some key management decisions about what data belongs in its tactical cloud infrastructure, said Alexander Miller, a senior advisor for science and technology in the office of the Army’s deputy chief of staff for intelligence.
As of now, the DoD intelligence and tactical data the Army relies on is spread across multiple systems that lack a common data fabric, and simply dumping all of it into a cloud environment would likely be counterproductive.
“We used to say, ‘Hey, all the data matters.’ And I would augment that with, ‘All of the data matters at some time,’” Miller said. “Data has a lifecycle and a lifespan, and most of it is very perishable. What we don’t want to do is try to move all of the data all of the time, or store all the data all of the time, especially from the edge back. That would generate a requirement for massive storage, or a ton of bandwidth, and the ability to move bits and bytes over the network all the time isn’t there, especially if we’re going to be in an anti-access/area denial type of environment.”
One way the Army thinks it can help solve the problem is by implementing “smarter” sensors and processors on the battlefield, using technology that can help filter signal from noise, sending only the most relevant data back to the tactical cloud.
“We don’t want to take all the data and try to mash it onto a dashboard. That’s interesting, but it’s not compelling,” Miller said. “We want to use sensors that have on-board processing, which is a provisioned service from the cloud, we want edge nodes that are provisioned off of the cloud with cloud-enabled analytics so that we can reduce those data. And then we want to use a lot of the cloud-native architecture that’s been pushed forward to enable that.”
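The filtering Miller describes can be sketched in a few lines. In this hypothetical example (the relevance scores and field names are assumptions for illustration), an on-board model has already scored each sensor reading, and only readings above a threshold are queued for uplink to the tactical cloud; the rest stay local as noise.

```python
def filter_for_uplink(readings: list[dict], threshold: float = 0.8) -> list[dict]:
    """Keep only high-relevance readings for transmission to the tactical cloud.

    `relevance` is a hypothetical per-reading score that on-board
    processing would produce; everything below the threshold stays local.
    """
    return [r for r in readings if r["relevance"] >= threshold]

readings = [
    {"sensor": "emitter-01", "relevance": 0.95, "data": "new signal"},
    {"sensor": "emitter-01", "relevance": 0.10, "data": "background noise"},
    {"sensor": "optic-03", "relevance": 0.85, "data": "vehicle movement"},
]
uplink = filter_for_uplink(readings)
print([r["sensor"] for r in uplink])  # ['emitter-01', 'optic-03']
```

In a real deployment the scoring model itself would be the provisioned, cloud-updated component Miller mentions, pushed down to the edge node rather than hand-written.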
To complicate the picture further, in any real-world battlespace, the Army won't be relying only on its own data, but also on intelligence and targeting information that comes from other military services' sensors. The Pentagon has already begun to wrestle with those departmentwide challenges via its Joint All-Domain Command and Control (JADC2) initiative.
The Army is helping to solve that bigger data management challenge via its Project Convergence experiments, the service's main contribution toward JADC2, said Dr. Portia Crowe, the chief data officer for the Army's Network Cross-Functional Team.
"What we've done is taken the mission threads — joint fires, for example — and broken them down into use cases. So it is a visual flow of how data gets across the battlespace, from air to ground," she said. "It looks at what data's exchanged, how it's being exchanged. If we're going to, for example, exchange data with an Air Force F-35, where does that need to come into the Army, and what is the best way to do that? So we've been using cloud as our way to actually exchange that data, and we do it fast, and we do it real-time and we do it securely. We're learning as we go, and it's opened up a lot of opportunities in the space. It's a new avenue to exchange data in these big learning campaigns between the services."
But even as the Army works its way through data management challenges and pilots tactical clouds that could soon appear on the battlefield, it’s already discovered an extremely compelling use case for cloud in the context of training soldiers to use its mission command systems.
If those command post applications can be refactored to work in cloud environments — and many of them already have — soldiers can train on the latest and greatest applications at their home stations or reserve component drilling sites immediately, long before their units receive the physical hardware that will connect to those applications in a command post environment.
Typically, it takes the Army five years to fully outfit every unit with a refreshed set of networking equipment. But if units that aren't actually deployed can train on cloud-hosted versions of the same systems, the Army doesn't need to outfit every unit with physical gear. In theory, that could accelerate the refresh cycle for the physical hardware soldiers deploy with to once every two years.
Lorentzen said the Army has already started to adopt that model for National Guard and Reserve units. They’ll receive new equipment if their unit is preparing to deploy. Otherwise, they’ll train largely in the cloud.
“That way, I don’t have five- or six-year-old servers sitting in closets out there, and at the same time, we can have the latest and greatest available to everybody — they all train on the capability without having to wait for a refresh cycle. That’s huge,” he said.
As a proof of concept, the Army made an early version of its cloud-hosted command post computing environment (CPCE) available to the Command and General Staff College and other schoolhouses overseen by Training and Doctrine Command. About 500 students are training on the cloud version of the CPCE as of now, including some who are distance learners.
"It's just great to watch, because I just have to take notes and say, 'Okay, this works, this doesn't,' and then put the things that work on contract and make it a capability that everybody can get," Lorentzen said. "We can work at pretty large scale, we can work on networks that are pretty reliable, and we can take what we have and really evolve it. We're talking to operational units, and they want what we have as well. It's a great time to be here working on this — I can only emphasize that where we're going is going to be a fantastic place in a few years."