Federal agencies might be under a mandate to bring zero trust architectures to their systems, but they need to have a great deal of trust in their commercial cloud providers too.
The two ideas don’t contradict one another. Here’s how Richard Breakiron, senior director for strategic initiatives for Commvault’s Americas public sector, explained it, using the analogy of a hotel: Guests expect a high level of security and service reliability from a hotel, but that doesn’t extend to cash and valuables left in rooms. That’s on each guest.
That means customers should trust but verify, said Sean Phuphanich, senior solutions architect at Amazon Web Services, which partners with Commvault. “Trust in the cloud is not unlike normal relationships. You have to earn trust. It’s not something that’s just given,” Phuphanich said.
Cloud service providers can show several indicators of trustworthiness, including growing numbers of customers and rising levels of clearance certifications. Plus, a CSP should specifically have a shared security responsibility model with clients, he advised.
“We make it very clear what we take care of for your security of the cloud,” Phuphanich said. “Whereas the customer is responsible for security in the cloud.” That establishes boundaries, and it lets agencies put their attention toward a discrete set of security measures, he said.
“Customers are able to allocate their teams and resources to focus on a narrower set of issues, and be more effective at it,” Phuphanich said.
One “in the cloud” resource agencies are responsible for protecting is data. Breakiron said the challenge centers on “how you manage data from the endpoint all the way back to when it might need to be recovered.”
Multicomponent agencies seek solutions that can help them manage on a granular level since they often must have visibility into multitenancy environments, he said. Realizing greater business or mission value from data should underlie an agency’s choice of specific technologies for managing and securing its data, Breakiron said.
“When you start doing movement from your own technology on premises into a cloud environment, you’ve got to understand, ‘How do I get the data there? How is the data going to be protected? What is the business process associated with it? And who are my partners to help me?’ ” he said.
Cloud Smart reflects need to shift from CapEx to OpEx
CSPs, including AWS, have made significant investments in cybersecurity to get beyond the minimum levels of the Federal Risk and Authorization Management Program (FedRAMP). Now, cloud hosting exists at Impact Level 5 and above.
For AWS, a 2023 initiative aims to support cloud smart adoption by connecting government with private sector best practices for managing cloud services, Phuphanich said.
“Relative to the initial federal Cloud First policy, Cloud Smart is really a maturity of the adoption of cloud for agencies,” he said. Early adopters didn’t always optimize how they used cloud, “so they didn’t necessarily get all the benefits that they had originally anticipated,” he added.
Phuphanich recommended that federal IT staffs shift away from a capital expenditure model, in which networks and data center hardware undergo equipment change-outs at set intervals, to an operational expenditure model.
“Cloud is this kind of a continual process, where you can continue to optimize even potentially on a daily basis,” he said. “And you’ll see those benefits in effectively real time.”
Breakiron cited as an example an application that only needs to run 12 hours a day, five days a week. A CSP can manage spin-ups and shut-downs according to an agency’s prescribed schedule. If the agency knows what services and capabilities the CSP can offer, it can avoid costs by not having an application spooled up 24/7.
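A back-of-the-envelope calculation shows why that kind of scheduling matters. This sketch assumes a purely illustrative flat hourly rate, not an actual AWS price, and compares always-on operation with the 12-hours-a-day, five-days-a-week schedule Breakiron described:

```python
# Illustrative cost comparison: always-on vs. scheduled operation.
# HOURLY_RATE is an assumed placeholder, not a real cloud price.
HOURLY_RATE = 1.00  # dollars per compute-hour (assumption)

always_on_hours = 24 * 7   # 168 hours per week
scheduled_hours = 12 * 5   # 60 hours: 12 hours a day, 5 days a week

always_on_cost = always_on_hours * HOURLY_RATE
scheduled_cost = scheduled_hours * HOURLY_RATE
savings = 1 - scheduled_cost / always_on_cost

print(f"Always-on cost per week: ${always_on_cost:.2f}")
print(f"Scheduled cost per week: ${scheduled_cost:.2f}")
print(f"Weekly savings: {savings:.0%}")  # roughly 64%
```

Whatever the actual rate, the ratio holds: running only during business hours cuts the compute bill for that application by nearly two-thirds.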
“Cloud smart means you’ve done the research,” Breakiron said. “You know your business processes well enough, and you know how the cloud works.”
Check out all the sessions from Federal News Network’s Industry Exchange Data on our event page.