Strategies for the next generation of cloud applications

Cloud security and scalability are pressing topics in government today. But what lessons have agencies learned that can make their cloud optimization more effective in the future?


Date & Time
May 18, 2022, 2:00 p.m. ET

Webinar
Date: On demand
Duration: 1 hour
Cost: No Fee

In the never-ending work of information technology modernization, federal agencies have been working on three tracks when it comes to cloud computing. They’ve moved at least some legacy applications to the cloud in what’s commonly called lift-and-shift. They’ve adopted commercial software-as-a-service offerings for applications like email and productivity suites. And they’ve developed their own cloud-native applications, typically using the DevSecOps approach.

What comes next? That was the topic of a panel discussion of federal IT practitioners hosted by Federal News Network and Appian. Panelists agreed that even custom-developed applications age and become legacy. That’s the case for Ravyn Manuel, senior application developer, architect and DevOps engineer at the National Museum of African American History and Culture.

The museum opened five years ago. Referring to interactive customer experience applications developed then, Manuel said, “Our stuff is becoming legacy. So we have to figure out how to deal with legacy.” A key concern for the updated versions is cybersecurity, she added, because they’ll be hosted in a commercial cloud rather than on internal servers. And they’ll be usable via visitors’ mobile devices, which brings an additional potential threat.

A modernization trend noted by Ray Wulff, the industry lead for global defense and intel programs at Appian, concerns the integration of applications to create new services. This occurs, Wulff said, using what he called an “agility layer” that lets developers “tap the new systems, the new applications and the legacy systems at the same time.” Such integration extends to the data connected with various applications, and also to the required cybersecurity and compliance controls, he added.

Updating and integrating

Wulff said agencies take a variety of approaches to legacy applications besides simply running them in a cloud-hosted mainframe emulator. They may refactor COBOL code, say, into Java, or they might use a low-code logic extractor such as the one offered by Appian. In all cases, he said, IT staffs must “figure out, okay, what are the storage and security concerns in the cloud with a refactored application?”

Such work offers a chance for agencies to exchange best practices, rather than each covering the same ground separately.

Steven Hernandez, the chief information security officer at the Education Department, said, “Shared services is driving just an incredible opportunity, both from say, a cybersecurity and security services consumption perspective, but also that user experience.” He added, “When we’re thinking about our cloud applications and our workloads in the cloud, a big part of that conversation is, where are those shared service sweet spots that I ought to be consuming? Not just because it’s fast, it’s already stood up, the pricing is good. But also because it’s going to drive a better citizen experience.”

A source for shared services is the cloud.gov program office within the Technology Transformation Service at the General Services Administration. One example, said Bret Mogilefsky, an information technology specialist with cloud.gov, is api.data.gov, “a service if you’re looking to secure and hand out keys for an application programming interface.”
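For teams new to that service, the integration pattern is typically a key issued through api.data.gov and passed with each request to an agency API the service fronts. The Python sketch below is illustrative only: the endpoint URL, query parameters and response fields are hypothetical, and the X-Api-Key header reflects api.data.gov’s common convention rather than any particular agency’s API, so the relevant agency documentation should be checked.

```python
# Minimal sketch of calling an agency API fronted by api.data.gov.
# The endpoint and response shape below are hypothetical; the key is
# typically sent in the X-Api-Key header (or an api_key parameter).
import os

import requests

API_KEY = os.environ["DATA_GOV_API_KEY"]  # key issued via api.data.gov signup
BASE_URL = "https://api.example.gov/v1/records"  # hypothetical agency endpoint

response = requests.get(
    BASE_URL,
    headers={"X-Api-Key": API_KEY},
    params={"q": "cloud modernization", "page_size": 10},
    timeout=30,
)
response.raise_for_status()

# Print whatever the (hypothetical) API returned as result titles.
for record in response.json().get("results", []):
    print(record.get("title"))
```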

API security is a concern at the museum, Manuel said. She cited a project to create an online, searchable exhibit on slavery and freedom that can display items drawn from siloed systems housing images of the collections of three other museums, some hosted on premises by the Smithsonian’s Office of the Chief Information Officer (OCIO).

“I am doing things right now with APIs. Our legacy systems are at OCIO, and I have to work with them. The security piece is very big for them,” Manuel said.
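As a rough illustration of that kind of cross-silo integration, the sketch below queries several collection APIs and merges the results into a single list an online exhibit could search. Every URL, field name and the key-handling scheme is a stand-in; this is not a description of the Smithsonian OCIO’s actual systems.

```python
# Illustrative only: pull exhibit items from several siloed collection
# APIs and normalize them into one list. All URLs and field names are
# hypothetical placeholders.
import requests

COLLECTION_APIS = {
    "museum_a": "https://collections-a.example.gov/api/objects",
    "museum_b": "https://collections-b.example.gov/api/objects",
    "museum_c": "https://collections-c.example.gov/api/objects",
}


def fetch_items(topic: str, api_key: str) -> list[dict]:
    """Query each silo for a topic and merge the normalized results."""
    items = []
    for source, url in COLLECTION_APIS.items():
        resp = requests.get(
            url,
            headers={"X-Api-Key": api_key},  # each silo enforces its own access controls
            params={"q": topic},
            timeout=30,
        )
        resp.raise_for_status()
        for obj in resp.json().get("objects", []):
            items.append(
                {
                    "source": source,
                    "title": obj.get("title"),
                    "image_url": obj.get("image_url"),
                }
            )
    return items
```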

Panelists agreed the lift-and-shift era is over. Mogilefsky said that while a bulk cloud move certainly helped with energy consumption and security, “it doesn’t help us with the agility of really being able to do new things in new ways. And also to collaborate between agency silos.” He advised aiming “high in the stack” with services such as container orchestration to ease what he called the “bespoke nightmares” of earlier systems integrations.

Whether updating applications or combining components into new applications, Wulff said a number of Defense agencies are turning to the low-code approach. Security and speed of deployments are big reasons.

“There’s a reason why you’re seeing such an explosion in low code platforms,” Wulff said, “because the platform itself to develop the applications is getting the ATO (authority to operate). So then you really don’t have to go through the ATO process.”

Learning objectives:

  • Updated List of Cloud Goals
  • What to Ask Your Cloud Provider
  • The Approach to Launching New Platforms in the Cloud

This program is sponsored by Appian.

Speakers
Steven Hernandez
Chief Information Security Officer, Department of Education
Ravyn Manuel
Senior Application Developer, Architect and DevOps Engineer, National Museum of African American History and Culture
Bret Mogilefsky
Information Technology Specialist, cloud.gov, General Services Administration
Ray Wulff
Industry Lead, Global Defense and Intel Programs, Appian
Tom Temin
Host, The Federal Drive, Federal News Network

Please register using the form on this page.
Have questions or need help? Visit our Q&A page for answers to common questions or to reach a member of our team.