6 speakers
Oct 4, 2022 2:00 p.m. ET
Duration: 1 hour
Cost: No Fee
Agencies are expected to spend more than $18.6 billion a year on cloud services by fiscal 2024. That’s the latest estimate from market research firm Deltek.
That is up from $14.5 billion in 2022.
One other big trend Deltek found was the move of DevSecOps processes to the cloud.
All of this means agencies are putting more data into these cloud instances, which could pose a series of new challenges.
Among those challenges: how to access that data and share it among different cloud-based systems, and then how to secure that data both at rest and in transit.
The use of application programming interfaces (APIs) is a big part of this puzzle. APIs can help agencies access data from disparate clouds to better drive decisions. APIs can bring together legacy systems that otherwise can’t talk or share their data.
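As a rough illustration of the aggregation pattern described above, the sketch below shows a thin layer that pulls records from several cloud sources through a common fetch interface and merges them into a single view. The source names, fields and stub fetchers are invented for illustration; in practice each callable would wrap an HTTP call to a specific provider's or legacy system's API.

```python
import json
from typing import Callable

# Each cloud (or legacy system wrapped in an API) is represented by a
# callable returning parsed JSON records. Real fetchers would make HTTP
# requests to each provider; here they are stand-ins.
FetchFn = Callable[[], list]

def aggregate(sources: dict) -> list:
    """Pull records from every source and tag each with its origin."""
    merged = []
    for name, fetch in sources.items():
        for record in fetch():
            merged.append({**record, "source": name})
    return merged

if __name__ == "__main__":
    sources = {
        "cloud_a": lambda: [{"id": 1, "status": "in transit"}],
        "cloud_b": lambda: [{"id": 2, "status": "delivered"}],
    }
    print(json.dumps(aggregate(sources), indent=2))
```

Because every source satisfies the same callable contract, a new cloud or legacy wrapper can be added to the dictionary without changing the aggregation logic.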
But at the same time, a recent report from Gartner says APIs are well on their way to becoming the main attack vector in 2022.
Agencies must take steps to ensure their applications are secure and flexible to help unleash their data and workforce’s talent.
Gary Parker, a cloud architect at the U.S. Postal Service, said his organization’s focus on modernization is about redesigning and securing legacy applications.
“We are taking our enterprise legacy services, our big hitters, and we’re decomposing them and we’re re-architecting them into microservices,” Parker said during the discussion Maximizing Security and Flexibility in the Orbit of a Cloud Migration. “We’re not focused on a single cloud. We started small with a conversational AI application for passport inquiries, and then, from there, we developed organizational policies and procedures. Then we moved on to what we call our ‘get it right’ initiatives. These are our big initiatives that we need to focus on for delivering for America. Those would include, for example, our package tracking API work. We’re very close to having that ready. The COVID test kit initiative from the White House, we built that end-to-end in the cloud.”
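The decomposition Parker describes can be sketched in miniature. The functions and fields below are hypothetical, not USPS code; they only show the shape of the change, from one routine that does everything to small services that each own a single responsibility behind a stable contract.

```python
# Illustrative only: names and fields are invented, not USPS's actual services.

# A legacy "do everything" routine: lookup and notification are entangled,
# so neither can be changed or scaled on its own.
def legacy_track_and_notify(package_id: str) -> dict:
    status = {"package_id": package_id, "status": "in transit"}  # lookup
    status["notified"] = True                                    # side effect
    return status

# Decomposed into two microservices, each with one responsibility and a
# small, stable contract, so they can be deployed and scaled independently.
def tracking_service(package_id: str) -> dict:
    return {"package_id": package_id, "status": "in transit"}

def notification_service(status: dict) -> dict:
    return {**status, "notified": True}

if __name__ == "__main__":
    print(notification_service(tracking_service("PKG-1")))
```

The composed pipeline returns the same result as the legacy routine, which is what lets a decomposition like this proceed one service at a time.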
The 19th Air Force is taking a bit of a different approach. Rather than modernizing existing systems, it is creating new approaches to pilot training.
Brian Kirk, the deputy pilot training transformation IT lead and senior software engineer for the 19th Air Force, said instead of providing training through a desktop application, the goal is to make it mobile.
“We have a lot of pilots that move around, so having a static home base network isn’t real helpful for them as they move through the pilot training. We are building all of this strategy on an API-first methodology so that we can bring in applications and remove applications without affecting any of the other partners, programs and applications that we’ve got,” he said. “We’ve got it in its very infantile stages. At this point, we have introduced some students into the program, more of the content management side versus the full learning management aspect of it. But hopefully, this coming January, we will be pretty much running full steam in our first program.”
Kirk said the API-first approach is important because the Air Force is collecting pilot training data from multiple bases across the country. He said getting all of that in sync to verify the progress of pilot trainees couldn’t happen without the cloud and APIs.
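A minimal sketch of the kind of reconciliation this implies, with a hypothetical record shape (trainee, sortie, grade, timestamp, originating base): each base submits grade-sheet entries through a common API, and a central step keeps the most recent entry per trainee and sortie, regardless of which base it came from.

```python
from datetime import datetime

def reconcile(records: list) -> dict:
    """Keep the most recent entry for each (trainee, sortie) pair."""
    latest = {}
    for rec in records:
        key = (rec["trainee"], rec["sortie"])
        ts = datetime.fromisoformat(rec["submitted_at"])
        if key not in latest or ts > datetime.fromisoformat(
            latest[key]["submitted_at"]
        ):
            latest[key] = rec
    return latest

if __name__ == "__main__":
    # Two bases report the same sortie; the later submission wins.
    records = [
        {"trainee": "T1", "sortie": "S1", "grade": "B",
         "submitted_at": "2022-10-01T09:00:00", "base": "Base A"},
        {"trainee": "T1", "sortie": "S1", "grade": "A",
         "submitted_at": "2022-10-01T18:30:00", "base": "Base B"},
    ]
    print(reconcile(records))
```

Keying on (trainee, sortie) rather than on base is what lets instructors enter notes from any location without creating conflicting copies of a trainee's record.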
“As we’re training these pilots, we’re flying to distant locations, not necessarily 100% of the time to another military base, so they will stay overnight at that location and come back the next day. Well, traditionally, the instructors have to write everything down and then when they get back to the home base, they find the computer that’s got the application on it and they insert it there,” he said. “We’re just trying to free up some of that so that you know the best memory is the freshest memory. Rather than waiting an entire day or several hours or even several days potentially to get that information into the system, we’re trying to make it available to them pretty much anywhere they’re at that they have an internet connection, and the cloud gives us that capability.”
Lt. Col. Kim Hoffman, the division chief of innovation and technology for the 19th Air Force, said the improved feedback is leading to better pilots.
“The cloud and the infrastructure that we’re building lets students and instructors go back and review those notes, review those grade sheets, review any videos from the virtual reality devices that we’ve used, our immersive training devices. They can view that in real time on the road. They can even go in and practice the scenario before they’re back out there,” Hoffman said. “Now when they’re preparing for their return flight back home on Sunday, they’ve already gone through those motions and flown it in an actual simulated environment as opposed to just sitting there and running through it step by step. So being able to access this data in real time will help both the students and the instructors. It also feeds back into the system on the data analyst side of the house: see how many times this student practiced while they were at home on their own time, see how much better their grades are, or how much worse. That feeds back into our syllabus and our courseware, the types of videos we’re providing them and what kind of devices we’re leaning into for the next iteration.”
She said all of that data is helpful to improve all aspects of pilot training.
Alexis Bonnell, emerging technology evangelist for government at Google, said whether it’s the Air Force, the Postal Service or any other agency, the goal is not to move to the cloud, but to unleash their mission once they’re in the cloud.
“This move of API-first is really leaning into information flows versus repository mentalities. The way I think about it is that idea of catalyzing information, maybe even beyond controlling it,” she said. “I think we saw that for really the last 10 years on the commercial side, but I think now you’re really seeing public servants come out of the last three years and realize that they are going to have a higher rate of change than ever before. But more importantly, in that role of information steward, they’re going to have more access to more information than ever before, whether that’s inside or whether that’s outside the organization. So really this idea of how you use technology to be able to be curious, to be able to lean into those information flows.”
The use of APIs and other approaches to unlock the data is part of a move toward more creativity across the public sector.
Vint Cerf, the vice president and chief internet evangelist at Google, said agencies are discovering new ways to use data and make decisions because of what the cloud, APIs and other infrastructure technologies can provide.
“We encounter customers who have on-premise investments, including information that they want to keep there, so building systems that will allow interworking, running loads on-premise and then running an expanded load, for example, in the cloud, or other ways of being helpful to our customers moving into a cloud environment,” Cerf said. “The idea that you’re stepping into a computing environment, which is very different from a closed world, is super important. At the same time, because you’re stepping into an environment which could be less closed than you’re accustomed to, we have other concerns like security and authenticity and protection of information and use of cryptography and the like, which need to be drawn into the architecture that is being used by our customers in order to literally weave a service that meets their needs, but also does so in a more expansive way.”
Complimentary registration
Please register using the form on this page or call (202) 895-5023.
Have questions or need help? Visit our Q&A page for answers to common questions or to reach a member of our team.