The 10-plus year journey agencies have been on to move applications and workloads to the cloud has been filled with fits and starts.
Some agencies are moving quickly to the cloud. As Maria Roat, the former chief information officer at the Small Business Administration and federal deputy CIO, famously said, “I’m burning the bridges behind me.”
Other agencies, whose legacy platforms run major mission-critical workloads, are taking a much more measured approach.
The legacy system challenge is one agencies still must confront. A June 2023 report from the Center for Strategic and International Studies found that older systems are more expensive to maintain and easier to hack, yet fear of service disruption or data loss during a transition remains a main reason cloud adoption can be slow.
It’s not easy to retire systems that have served the mission well for decades. When those systems run on mainframes, the decision becomes even more difficult, because these older technologies often generate technical debt that squeezes budgets, squelches modernization initiatives and diverts funds away from what drives innovation.
Adarryl Roberts, the chief information officer of the Defense Logistics Agency, said while DLA has moved most of its systems and applications to the cloud over the last five or six years, two mission-critical ones remain on mainframes.
He said the Distribution Standard System and the Federal Logistics Information System, or catalog system, are in the process of moving to the cloud over the next two years.
“We still have a green screen with our mainframe on our distribution system, but that’s going away. I think with using these new capabilities in the cloud and other formats, what’s most important is the methodology that you use to deploy these systems for that user experience, and doing that in an agile nature,” Roberts said during the discussion Level-up in the cloud. “What I mean by that is not waiting until you’re deployed to get the feedback, but actually having the end user be part of the design, development and testing of that effort. We’re building a new internal portal for our employees. We’ve had over 200 individuals from across the agency, representing each part of the business, initially come in and tell us from a day-to-day work experience perspective, ‘how would you like to log in and work at DLA?’ as we deployed these capabilities, and that really drove our design of that capability that we have in the cloud.”
By setting up structured human-centered design and governance processes, DLA is ensuring employees are not only ready for the change, but excited about it.
The Army, still early in its cloud journey, is taking a measured approach, starting with all new capabilities automatically going into the cloud. For other systems, even those already in the cloud, Leo Garciga, the chief information officer for the Army, said the goal is to figure out how best to optimize the technology to meet mission goals.
“As far as the legacy systems, it continues to be this massive balance between the return on investment on moving to the cloud just because of how a legacy system may be an interruption to operations,” Garciga said. “I think where we’re seeing a lot of success in this space is really within our business systems, where we have a clearly laid out plan over the next five or six years about what the next generation of systems will look like, and a real decomposition of what is starting to replace the legacy systems, which has actually given us a lot of trade space to not spend a lot of time and effort trying to maybe move things to the cloud, where we’re already going to be delivering natively. I think the larger challenge continues to remain everything else that’s out in the environment and how to take those opportunities.”
The cost-benefit discussion starts with understanding an organization’s inventory of systems, including each application’s cyber posture and architecture.
Garciga said the Army has done a lot of work to standardize architectures and deployment templates.
Roberts added that reshaping the organization and understanding which processes are unique versus common will help determine which legacy systems make sense to move to the cloud and which don’t.
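The assessment the panelists describe can be made concrete with a simple scoring pass over the system inventory. Below is a minimal sketch under a hypothetical rubric; the attributes, weights and system names are illustrative assumptions, not any agency’s actual criteria.

```python
# Hypothetical inventory-scoring sketch: rank legacy systems as near-term
# cloud candidates based on cyber posture, coupling and mission criticality.
from dataclasses import dataclass

@dataclass
class SystemRecord:
    name: str
    cyber_findings: int    # open security findings on the legacy platform
    coupling: int          # 1 (standalone) .. 5 (deeply tied to the mainframe)
    mission_critical: bool

def migration_priority(s: SystemRecord) -> float:
    """Higher score = stronger near-term cloud candidate (illustrative weights)."""
    score = s.cyber_findings * 2.0   # risk reduction favors moving sooner
    score -= s.coupling * 1.5        # tight coupling raises transition risk
    if s.mission_critical:
        score -= 3.0                 # mission-critical systems get the measured approach
    return score

inventory = [
    SystemRecord("catalog-system", cyber_findings=4, coupling=3, mission_critical=True),
    SystemRecord("hr-portal", cyber_findings=6, coupling=1, mission_critical=False),
]

for s in sorted(inventory, key=migration_priority, reverse=True):
    print(f"{s.name}: {migration_priority(s):.1f}")
```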
For many agencies that have already moved several legacy systems to the cloud, the question is how to optimize and continue to modernize those systems.
Noel Hara, the public sector chief technology officer for NTT DATA, said mainframe systems can often be lifted into the cloud without a lot of effort, and the agency will see benefits including improved scaling and cybersecurity.
“What we’ve typically been working with our clients around is getting them to the cloud, but not stopping there,” Hara said. “It’s a phase two activity to then take and deconstruct that application. So the first year might put you in the cloud; over the next couple of years, you deconstruct that application, turn it into a cloud-native modern application and take advantage of all of the tools that you get from the cloud, but fund that with the savings that you’re going to get from getting off of your legacy infrastructure and not having to renew those on-premises assets and continue those costs.”
As agencies become more comfortable with understanding cloud spending by implementing FinOps, Hara said the application development team and the cloud team must work more closely together so as not to miss out on cost savings.
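In practice, giving both teams one view of spend usually starts with consistent cost-allocation tagging. Below is a minimal FinOps-style sketch, assuming an AWS environment and a hypothetical “application” tag key; agencies on other clouds would use that provider’s billing API instead.

```python
# Group one month of cloud spend by a cost-allocation tag so the application
# development team and the cloud team see the same per-app numbers.
import boto3

ce = boto3.client("ce")  # AWS Cost Explorer

resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-01-01", "End": "2024-02-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "application"}],  # tag key is an assumption
)

for group in resp["ResultsByTime"][0]["Groups"]:
    app = group["Keys"][0]  # returned as "application$<value>"
    cost = group["Metrics"]["UnblendedCost"]["Amount"]
    print(app, cost)
```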
“We really look at how we can do this in bite-sized chunks and move at the speed that the mission requires. It doesn’t mean that you have to do these five-year massive programs in order to meet your goals. That’s the biggest challenge: we try to get in front of the procurement folks who say we are going to completely refactor this application or build it new,” Hara said. “You really need to look at what pieces of this app the modern tooling and the technology allow you to break up. Maybe I’m just going to take the front end, maybe I’m just going to take the back end, maybe I’m just going to take the analytics engine of it right now and move that to the cloud, and then deconstruct this thing and get it there over time.”
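The piece-by-piece migration Hara describes is often implemented as a routing layer that sends already-modernized paths to cloud services and everything else to the legacy system. This is a minimal sketch of that pattern (sometimes called the strangler fig); the hostnames and path prefixes are hypothetical.

```python
# Route only the pieces that have moved to the cloud; leave the rest on the
# legacy backend. As each piece is deconstructed and redeployed cloud-native,
# it is added to CLOUD_ROUTES until the legacy backend can retire.

CLOUD_ROUTES = {
    "/analytics": "https://analytics.cloud.example.gov",  # already migrated
    "/reports": "https://reports.cloud.example.gov",      # already migrated
}
LEGACY_BACKEND = "https://legacy-mainframe.example.gov"   # everything else

def pick_backend(request_path: str) -> str:
    """Return the backend that should serve this request."""
    for prefix, backend in CLOUD_ROUTES.items():
        if request_path.startswith(prefix):
            return backend
    return LEGACY_BACKEND

assert pick_backend("/analytics/daily") == "https://analytics.cloud.example.gov"
assert pick_backend("/orders/123") == LEGACY_BACKEND
```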
Garciga said changing that legacy mindset, whether among acquisition folks, technology developers or industry, is an important shift as part of cloud adoption.
“It’s really those legacy ones that we have to figure out how we build these bridges that allow us to use the software acquisition pathways and really add that agility to deployment and the acquisition cycle to get stuff done,” he said.
DLA, meanwhile, is taking on this iterative development approach through the use of application programming interfaces (APIs).
Roberts said the agency’s API strategy and library ensures uniformity as modernization occurs.
“Everyone’s not starting over building APIs to include our mainframe systems. Then we also conducted a business transformation study approximately three or four years ago, which actually told us, here’s some industry best products, commercial off the shelf (COTS) solutions that can actually do things that you’re doing on a mainframe that better align to our overall enterprise resource planning (ERP) strategy that we have here at DLA,” he said. “We have a very integrated ERP, where we leverage not just the financial but our logistics, some of our procurement, and so it made sense from a business end-to-end perspective to streamline that as much as possible utilizing COTS products.”
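One way to read Roberts’s point about uniformity: a shared API layer lets callers use the same interface whether the data still lives on the mainframe or in a modernized cloud service. This is a minimal sketch of that idea under assumed names; the classes, fields and stubbed lookups are illustrative, not DLA’s actual API library.

```python
# A uniform catalog interface with interchangeable legacy and cloud backends.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class CatalogItem:
    stock_number: str
    description: str

class CatalogSource(Protocol):
    def lookup(self, stock_number: str) -> CatalogItem: ...

class MainframeCatalog:
    """Adapter over the legacy system (stubbed here for illustration)."""
    def lookup(self, stock_number: str) -> CatalogItem:
        return CatalogItem(stock_number, "served from the mainframe")

class CloudCatalog:
    """Adapter over the modernized cloud service (stubbed here)."""
    def lookup(self, stock_number: str) -> CatalogItem:
        return CatalogItem(stock_number, "served from the cloud API")

def describe(source: CatalogSource, stock_number: str) -> str:
    # Callers never know, or care, which backend answered.
    return source.lookup(stock_number).description

print(describe(MainframeCatalog(), "NSN-0001"))
print(describe(CloudCatalog(), "NSN-0001"))
```

Because both adapters satisfy the same interface, a system can move off the mainframe without touching any of its consumers.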