Agencies need to understand their data and current legacy system architecture before moving applications to the cloud, a Splunk cloud solutions architect advises.
As the journey to the cloud continues, agencies are learning they must have two key ingredients nailed down: One, a roadmap explaining their current and future architectures. Two, an understanding of their data and where it lives.
Over the course of the last decade, it has become clear that successful cloud migrations grew out of these two concepts, said Jon “JG” Gines, a cloud solutions architect at Splunk.
“You have to understand which systems could go to the cloud and the ones that don’t really need to — and have a strong justification for that. Are these snowflake systems or not? Is there a roadmap for that? And then speaking of roadmaps, if you’re going to the cloud, saying it is one thing, getting there is another challenge. You have to make sure you have a very clear roadmap to go to the cloud,” Gines said during Federal News Network’s Cloud Exchange 2024.
Agencies often miss the alignment outlined in that roadmap, which causes them either to slow their IT transformation efforts or to spend more money than necessary on cloud services, he said.
Gines said agency planning must account for both legacy systems and current data architectures to determine what can and should go to the cloud, and what should stay on premises.
Those so-called snowflake systems that an agency has customized over the years and would require more time and effort than it’s worth to move to the cloud still need to be part of that roadmap, Gines said.
“Many customers know their architectures pretty well. Even the managers that are not very technical, they at least conceptually understand their architectures. It’s when they start going to the cloud that everything becomes somewhat more abstracted,” he said.
“Concepts such as serverless, load balancing, autoscaling and similar are all terms that are more cloud speak as opposed to terms that you find in an on-premise environment. So oftentimes, managers or decision-makers who traditionally understand on-premise architectures have trouble translating them into the cloud because it’s a lot different.”
Add to that the different cloud flavors — like infrastructure, platform and software as a service — and an organization’s architecture becomes even more of an abstraction, Gines said.
This is why breaking down the complexity of an agency’s systems starts with understanding the data, both its sources and its importance to the mission.
Remember, he said, not all data is of the same operational value.
“Some data is very specific to certain domains, like security observability, and then some data is very specific to actually running missions,” Gines said. “We have one customer that I actually had a call with, and they’ve got a lot of data. I asked him specifically about the data sources that are very specific to containers, Kubernetes, and then he also had other systems that they were just monitoring on premise. We learned that they were monitoring different environments in silos.”
Gines said creating a “single pane of glass” to monitor and understand an organization’s data, whether in the cloud or on premises, will improve understanding of the data’s operational value.
“I’m talking about not just taking in the raw data but enriching the data,” he said. “We have a lot of third-party integrations at Splunk. We have third-party integrations from AWS, Microsoft and all these big major vendors. It’s not just getting the data, but it’s also enriching the data so that it actually makes sense to the customer and to make sure it’s accurate.”
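Gines didn’t walk through the mechanics, but the enrichment idea he describes can be sketched in a few lines. In the hypothetical Python below, raw cloud audit events are joined against an agency-maintained lookup table so each record carries mission context. The field names, account IDs and lookup contents are all invented for illustration; this is not Splunk’s actual data model or integration API.

```python
# Hypothetical sketch of event enrichment: raw telemetry records are joined
# against an agency-maintained lookup so each event carries mission context.
# All field names, account IDs and lookup contents are invented for
# illustration, not taken from Splunk's actual integrations.

# Lookup table: cloud account ID -> mission/system context (hypothetical).
ACCOUNT_CONTEXT = {
    "111122223333": {"mission": "benefits-processing", "environment": "aws-prod"},
    "444455556666": {"mission": "case-management", "environment": "aws-dev"},
}

def enrich(event: dict) -> dict:
    """Return a copy of a raw event with mission context attached."""
    context = ACCOUNT_CONTEXT.get(event.get("account_id"), {})
    return {
        **event,
        "mission": context.get("mission", "unknown"),
        "environment": context.get("environment", "unknown"),
    }

raw_events = [
    {"account_id": "111122223333", "source": "aws:cloudtrail", "action": "DeleteBucket"},
    {"account_id": "999900001111", "source": "aws:cloudtrail", "action": "CreateUser"},
]

for event in raw_events:
    print(enrich(event))
```

In practice this kind of join happens inside the analytics platform at ingest or search time; the point of the sketch is that a raw account ID means little to a decision-maker until it is tied to a mission system.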
When agencies have confidence in their data, then the use cases for AI and machine learning become clearer and easier to implement.
“If we’re talking about artificial intelligence and machine learning, I would recommend that government agencies have a very specific set of use cases so they can also have responsible AI that is specifically aligned with the agency’s mission,” he said.
Gines said applying AI and machine learning tools in an automated fashion can shorten an organization’s time to investigate and make rote reviews or examinations quicker.
“Essentially, what’s happening is through one single application, you’re actually seeing what’s inside of your cloud environment, your on-premise environment, and if you have SaaS environments, you are also seeing the data that comes into there. It all can also be monitored,” he said. “I do find only those agencies that are actually using a lot of data and using security information and event management tools to improve their observability, that’s where I see a lot of the sophistication happening as opposed to agencies where they’re just starting to get to the cloud.”
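That “single application” view can be pictured with another short, hypothetical Python sketch: events are pulled from cloud, on-premises and SaaS collectors, tagged with their origin, and merged into one time-ordered stream. The fetch functions and record fields below are stand-ins invented for illustration; a real deployment would rely on forwarders, vendor APIs or a SIEM’s native integrations rather than these placeholders.

```python
# Hypothetical sketch of a "single pane of glass": pull events from several
# environments, tag each with its origin, and review them as one merged,
# time-ordered stream. The fetch functions stand in for real collectors
# (cloud APIs, on-premises forwarders, SaaS audit logs) and are invented
# for illustration.
from datetime import datetime

def fetch_cloud_events() -> list[dict]:
    return [{"time": "2024-06-24T14:03:00+00:00", "msg": "autoscaling event"}]

def fetch_onprem_events() -> list[dict]:
    return [{"time": "2024-06-24T14:01:00+00:00", "msg": "disk threshold exceeded"}]

def fetch_saas_events() -> list[dict]:
    return [{"time": "2024-06-24T14:02:00+00:00", "msg": "admin login"}]

sources = {
    "cloud": fetch_cloud_events,
    "on-prem": fetch_onprem_events,
    "saas": fetch_saas_events,
}

# Tag each event with where it came from, then merge into one timeline.
merged = [
    {**event, "origin": origin}
    for origin, fetch in sources.items()
    for event in fetch()
]
merged.sort(key=lambda e: datetime.fromisoformat(e["time"]))

for event in merged:
    print(event["time"], f"[{event['origin']}]", event["msg"])
```

The design choice the sketch illustrates is the one Gines describes: silos disappear not because the environments change, but because every event lands in one normalized, queryable timeline.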
Discover more articles and videos now on Federal News Network’s Cloud Exchange event page.
Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.