This discussion with Andrew Fairbanks, the general manager for federal at IBM Services, is part of Federal News Network’s Cloud Exchange.
Few agencies, or organizations of any kind, are putting all their proverbial technology eggs in one basket. Most agencies live in a hybrid world, hosting data and applications both in the cloud and on premises.
But operating in a multi-cloud environment doesn’t mean agencies can’t optimize that hybrid architecture and manage it more effectively and efficiently.
Andrew Fairbanks, the general manager for federal at IBM Services, said a hybrid cloud approach requires agencies to break down the silos that tend to exist between data and applications.
“What they’re getting to now is that they really need to be able to run those workloads across the boundaries. They need to be able to have applications that run in a dedicated data center, consuming data from the public cloud, being able to deliver data out to the tactical edge,” Fairbanks said on Federal News Network’s Cloud Exchange. “In order to do that, they need a common management system so that they have visibility into how those workloads are running across that distributed enterprise.”
That kind of common management system, a single pane of glass if you will, has eluded agencies since cloud services and platforms became available more than a decade ago.
But Fairbanks said agency chief information officers are quickly coming around to the importance of a cloud management system. He pointed to how the Defense Department’s Joint Warfighter Cloud Capability (JWCC) isn’t just asking for a multi-cloud approach, but also includes the need for an overall cloud management system to orchestrate workloads within the hybrid environment.
“When it comes to what we mean by orchestration, I think the metaphor that works for me is an orchestra, and the agency’s CIO or the CTO is the conductor. You have a pit out there with people wearing Google sweaters, Amazon sweaters, Microsoft sweaters, IBM sweaters and Salesforce sweaters. The real key is how do you create that score? How do you grab the baton and create that cadence so that all those different pieces can really work effectively and play a symphony that really is in concert, and not out of tune?” Fairbanks said.
“There are a number of tools and techniques that go into doing that. I think one of the ones that’s really emerging is container technology, for example. Open source containers really provide a mechanism for CIOs and CTOs to take applications and run them anywhere. It gives them incredible flexibility to surge to take advantage of what the best technology is for a given context so that they get those workloads done. It also allows them to create an overarching security framework that really is needed in this kind of world where things are crossing boundaries, and you need the right encryption for data in motion and encryption for data at rest. It provides that overall governance of how workloads flow, and it allows them to minimize how data moves.”
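To make that encryption distinction concrete, here is a minimal Python sketch of the two modes Fairbanks names. It assumes the widely used third-party cryptography package for data at rest and the standard library’s ssl module for data in motion; the file name and URL are placeholders, not details from the interview.

```python
# Minimal sketch of the two encryption modes Fairbanks describes.
# Assumes the third-party "cryptography" package (pip install cryptography).
import ssl
import urllib.request
from cryptography.fernet import Fernet

# --- Data at rest: encrypt a payload before writing it to storage.
key = Fernet.generate_key()          # in practice, held in a key management service
f = Fernet(key)
ciphertext = f.encrypt(b"sample case record")
with open("record.bin", "wb") as fh:
    fh.write(ciphertext)

# Reading it back requires the same key.
with open("record.bin", "rb") as fh:
    print(f.decrypt(fh.read()))

# --- Data in motion: require verified TLS for any cross-boundary call.
ctx = ssl.create_default_context()   # verifies server certificates by default
ctx.minimum_version = ssl.TLSVersion.TLSv1_2
with urllib.request.urlopen("https://example.com", context=ctx) as resp:  # placeholder endpoint
    print(resp.status)
```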
Open source containers provide agencies with two major benefits, Fairbanks said. The first is helping agencies avoid vendor lock-in.
“I think the other reason is different environments really are purpose-built for different kinds of workloads. Mainframes are generally regarded as the best environment for running high-volume transactions and really data-intensive workloads, while high performance computing is really important for workloads that require enormous amounts of calculations in a very short period of time. Distributed clouds are the best and most cost-effective way of delivering more citizen-facing kinds of transactions,” he said. “They want the flexibility to be able to put the workload in the environment that’s best tailored for that need. And open source containers allow them to take advantage of that richness in a way that I think helps them do their jobs more effectively.”
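As a hypothetical illustration of that placement logic, the Python sketch below routes workloads by profile. The environment names, thresholds and sample workloads are invented for demonstration and do not come from IBM or any agency.

```python
# Hypothetical workload-placement policy; names and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    transactions_per_sec: int   # sustained transaction volume
    compute_hours: float        # burst calculation demand
    citizen_facing: bool

def place(w: Workload) -> str:
    """Route each workload to the environment best suited to its profile."""
    if w.transactions_per_sec > 50_000:
        return "mainframe"            # high-volume, data-intensive transactions
    if w.compute_hours > 1_000:
        return "hpc-cluster"          # enormous calculations in a short window
    if w.citizen_facing:
        return "distributed-cloud"    # cost-effective citizen-facing services
    return "dedicated-datacenter"     # default: keep it inside the boundary

for w in [Workload("benefits-ledger", 80_000, 5, False),
          Workload("climate-model", 200, 4_000, False),
          Workload("claims-portal", 300, 2, True)]:
    print(w.name, "->", place(w))
```

In practice an orchestrator would also weigh cost, data gravity and security constraints, but the shape of the decision is the same.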
Containers also offer agencies a way to cross the next bridge toward taking fuller advantage of cloud services. Fairbanks said containers, along with other tools like common application programming interfaces (APIs), make the assorted cloud environments more interoperable, which lets users run workloads in one cloud but burst into another when mission requirements call for a rapid increase in capacity.
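A minimal sketch of that bursting decision might look like the following. The fixed capacity figure and the submit() helper are assumptions for illustration; real orchestration would go through a container platform’s scheduler and the providers’ APIs.

```python
# Illustrative cloud-bursting decision: overflow work moves to a second cloud.
PRIMARY_CAPACITY = 100    # container slots in the primary environment (assumed)
running_in_primary = 0

def submit(job: str) -> str:
    global running_in_primary
    if running_in_primary < PRIMARY_CAPACITY:
        running_in_primary += 1
        return f"{job}: scheduled on primary cloud"
    # Primary is saturated: burst the containerized job to the secondary cloud.
    return f"{job}: burst to secondary cloud"

for i in range(102):
    result = submit(f"job-{i}")
    if "burst" in result:
        print(result)   # job-100 and job-101 overflow to the secondary cloud
```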
That distributed, boundary-crossing architecture also raises the cybersecurity stakes.
“I think the biggest challenge I hear them talk about is that their attack surface is so large, and there’s so much data, so how can they make sense of that data to be able to detect and resolve a threat before it occurs?” Fairbanks said. “This is where I think there’s some really interesting and exciting work going on in the application of artificial intelligence and machine learning in terms of the ingestion of all that data, the monitoring of that data in real time to help separate the wheat from the chaff to provide the data to the security operations center in real time.”
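As a toy illustration of separating the wheat from the chaff, the hypothetical Python sketch below scores a batch of security events and forwards only the statistical outlier to the security operations center. The event fields and threshold are invented for demonstration and are far simpler than production AI/ML tooling.

```python
# Toy anomaly filter: forward only unusual events to the SOC.
from statistics import mean, stdev

events = [
    {"src": "10.0.0.5",  "failed_logins": 1},
    {"src": "10.0.0.7",  "failed_logins": 2},
    {"src": "10.0.0.9",  "failed_logins": 1},
    {"src": "10.0.0.11", "failed_logins": 48},  # outlier worth a human look
]

counts = [e["failed_logins"] for e in events]
mu, sigma = mean(counts), stdev(counts)

for e in events:
    z = (e["failed_logins"] - mu) / sigma   # simple z-score anomaly signal
    if z > 1.2:   # ordinary events (z near -0.5 here) stay out of the queue
        print("ALERT for SOC:", e)          # only the 48-failure source fires
```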
Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.