...
When it comes to software development, the Army is going to stop worrying about the color of money.
That’s because as part of its new approach to software modernization, the Army is rethinking what sustainment means.
Margaret Boatner, the deputy assistant secretary of the Army for strategy and acquisition reform, said one of the main tenets of the policy signed by Army Secretary Christine Wormuth in March is to reform several legacy processes that are keeping the service from adopting modern software development approaches.
“We are targeting a couple of really key processes like our test and evaluation processes, and importantly, our cybersecurity processes. We really are trying to modernize and streamline those as well as changing the way we think about sustainment because software is really never done. We really have to retrain ourselves to think about and to acknowledge the fact that software really needs to stay in development all the time,” Boatner said in an exclusive interview with Federal News Network. “Right now, our systems and our acquisition programs, once they’re done being developed, they go through a process that we call transition to sustainment, meaning they’ve been fully developed and are now going to live in our inventory for 10, 20, 30 years. We’re going to sustain them for a long period of time. When a system makes that transition, the financial management regulations dictate that they use a certain color of money, operations and maintenance dollars. With that color of money, we can really only do minor patches, fixes and bug updates. So that’s an example of a legacy process that, when you’re talking about a software system, really tied our arms behind our back. It really prevented us from doing true development over the long term with the software solutions.”
Boatner said under the new policy, software will no longer make the transition to sustainment. Instead, the program office will keep operating under research, development, test and evaluation (RDT&E) funding.
“It’s recognizing that in a continuous integration/continuous delivery (CI/CD) model, software is never done. That way, our program managers can plan to use the appropriate color of money, which in many cases might be RDT&E, which is the color of money you need to do true development,” she said. “So, that will give our program managers a lot more flexibility to determine the appropriate color of money based on what they want to do, such that our software systems can really continue to be developed over time.”
The Army has been on this path to software modernization for several years, culminating in the March memo.
From the lessons of the 11 software pathways, to testing out a new approach to a continuous authority to operate, to the broad adoption of the Adaptive Acquisition Framework, Boatner and Leo Garciga, the Army’s chief information officer, are clearing obstacles, modernizing policies and attempting to change the culture of how the Army buys, builds and manages software.
Garciga said by keeping programs under the RDT&E bucket, the Army is recognizing the other changes it needs to complete to make these efforts more successful.
“We need to relook at processes like interoperability. Historically, that was not a parallel process, but definitely a series process. How do we change the way we look at that to bring it into this model where we’re developing at speed and scale all the time?” he said. “I think we’re starting to see the beginnings of the second- and third-order effects of some of these decisions. The software directive really encapsulated some big rocks that need to move. We’re finding things in our processes that we’re going to have to quickly change to get to the end state we’re looking for.”
Since taking over the CIO role in July, Garciga has been on a mission to modernize IT policies that are standing in the way. The latest one is around a continuous ATO (C-ATO).
He said the new policy could be out later this summer.
“We’ve told folks to do DevSecOps and to bring agile into how they deliver software, so how do we accredit that? How do we certify that? What does that model look like? We’re hyper-focused on building out a framework that we can push out to the entire Army,” Garciga said. “Whether you’re at a program of record, or you’re sitting at an Army command, who has an enterprise capability, we will give some guidelines on how we do that, or at least an initial operational framework that says these are the basic steps you need to be certified to do DevSecOps, which really gets to the end state that we’re shooting for.”
He added the current approach to obtaining an ATO is too compliance-focused and not risk-based.
Garciga highlighted a recent example of the barriers to getting a C-ATO.
“We started looking at some initial programs with a smart team and we found some interesting things. There were some things that were holding us back, like a program that was ready to do CI/CD and actually could do releases every day, but because of interoperability testing and the nature of how we were implementing that in the Army, it was causing them to only release two times a year, which is insane,” he said. “We very quickly got together and rewickered the entire approach for how we were going to do interoperability testing inside the Army. We’re hoping that leads to the department also taking a look at that as we look at the joint force and joint interoperability and maybe they follow our lead, so we can break down some of those barriers.”
Additionally, the Army undertook a pilot to test out this new C-ATO approach.
Garciga said the test case proved a program could receive at least an initial C-ATO in less than 90 days by bringing in red and purple teams to review the code.
“I’d say about three months ago, we actually slimmed down the administrative portion and focused on what were the things that would allow us to protect our data, protect access to a system and make a system survivable. We really condensed down the entire risk management framework (RMF) process to six critical controls,” he said. “On top of that, we added a red team and a purple team to actually do penetration testing in real time against that system as it was deployed in production. What that did is it took our entire time from no ATO to having at least an ATO with conditions down to less than 90 days. That was really our first pilot to see if we can actually do this, and what are our challenges in doing that.”
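To make that kind of pipeline gate more concrete, here is a minimal sketch, in Python, of how a release check against a condensed set of controls plus open penetration-test findings might be structured. The control names, the three-tier verdicts and the release_gate function are illustrative assumptions for this article; the Army has not published its six critical controls or its gate logic.

```python
# Illustrative sketch only: the control names and gate logic below are
# hypothetical stand-ins, not the Army's actual condensed RMF control set.
from dataclasses import dataclass
from enum import Enum


class Verdict(Enum):
    AUTHORIZED = "ATO"
    CONDITIONAL = "ATO with conditions"
    DENIED = "no ATO"


@dataclass
class ControlResult:
    name: str       # e.g. "data-at-rest encryption" (hypothetical control)
    passed: bool
    critical: bool  # a failed critical control blocks any authorization


def release_gate(results: list[ControlResult],
                 pentest_findings_open: int) -> Verdict:
    """Decide whether a build may be released, given control checks and
    the number of unresolved red/purple-team findings."""
    if any(r.critical and not r.passed for r in results):
        return Verdict.DENIED
    if pentest_findings_open > 0 or any(not r.passed for r in results):
        return Verdict.CONDITIONAL
    return Verdict.AUTHORIZED


if __name__ == "__main__":
    checks = [
        ControlResult("data-at-rest encryption", passed=True, critical=True),
        ControlResult("access control / least privilege", passed=True, critical=True),
        ControlResult("audit logging enabled", passed=False, critical=False),
    ]
    # One open finding from real-time penetration testing in production
    print(release_gate(checks, pentest_findings_open=1).value)
```

Run as written, the example prints “ATO with conditions,” mirroring the interim state Garciga described on the way to a full authorization.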
Garciga said one of the big challenges that emerged was the need to train employees to take a more threat-based approach to ATOs. Another was that the Army had applied its on-premises ATO approach to the cloud, which Garciga said didn’t make a lot of sense.
“We put some new policy out to really focus on what it means to accredit cloud services and to make that process a lot easier. In one of our pilots, as we looked at how to speed up the process and get someone to a viable CI/CD pipeline, we found things that were really in the way, like interoperability testing, and how do we get that out of the way and streamline that process,” he said. “In our pilots, the one part that we did find very interesting was this transition of our security control assessors from folks that have historically looked at some very specific paperwork to actually now getting on a system and looking at code, looking at triggers that have happened inside some of our CI/CD tools and making very difficult threshold decisions based on risk and risk that an authorizing official would take to make those decisions. We’re still very much working on what our training plan would be around that piece. That’ll be a big portion of how we’re going to certify CI/CD work and DevSecOps pipelines in the Army moving forward.”
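As a rough illustration of that shift from paperwork review to pipeline evidence, the sketch below shows how an assessor-facing script might scan exported CI/CD events and surface those that exceed a risk threshold set by an authorizing official. The event schema, the risk_score field and the threshold value are assumptions made for the example; the article does not describe the Army’s actual toolchain.

```python
# Hypothetical sketch: the event schema and risk scores below are invented
# for illustration and do not reflect any specific Army CI/CD toolchain.
import json

# Example export of pipeline events an assessor might review instead of paperwork
PIPELINE_EVENTS = """
[
  {"stage": "build",  "trigger": "dependency_update", "risk_score": 2},
  {"stage": "test",   "trigger": "sast_high_finding", "risk_score": 8},
  {"stage": "deploy", "trigger": "manual_override",   "risk_score": 6}
]
"""

RISK_THRESHOLD = 5  # assumed threshold set by the authorizing official


def events_needing_review(raw_events: str, threshold: int) -> list[dict]:
    """Return pipeline events whose risk score meets or exceeds the threshold."""
    events = json.loads(raw_events)
    return [e for e in events if e["risk_score"] >= threshold]


if __name__ == "__main__":
    for event in events_needing_review(PIPELINE_EVENTS, RISK_THRESHOLD):
        print(f"Review: {event['stage']} / {event['trigger']} "
              f"(risk {event['risk_score']})")
```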
Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.