Agencies need to apply much of the common sense learned from securing data on premises to securing it in the cloud. We get tips from a panel of experts from...
Here’s one way to think about your agency’s data in the cloud: like a guest at a hotel. Both the guest and the proprietor share the responsibility for safety and security.
At check-in, “you get a little key. Your room feels secure,” said Richard Breakiron, senior director of strategic initiatives for the Americas public sector at Commvault. “But in reality, if you lose your laptop, your phone, your clothes, that is still your responsibility. That’s part of the agreement.”
Speaking at Federal News Network’s Industry Exchange Cloud 2023, Breakiron said a cloud services provider doesn’t know the value of an agency’s particular data. “It’s very important that you have a really strong service level agreement (SLA) and understand that there’s a shared responsibility,” he said.
That shared responsibility, added David Rubal, head of U.S. federal business development for AWS storage at Amazon Web Services, “is basically where, in the case of AWS, we’re responsible for the infrastructure security itself, the maintenance of the system, storage, all the patching.”
The customer’s responsibility covers the security posture of the data and ensuring sound access management systems are in place for its users, Rubal said.
“Anything that the customer brings into the cloud environment, in terms of managing their own encryption, is a good example,” he said. “They bring that environment with their data.” He added that the AWS infrastructure can’t see the data or what it includes.
Breakiron said that agencies can use some of the same controls in the cloud that they use in their data centers. For example, he noted how some Defense Department agencies require people to use their common access cards to log into the cloud.
“But when you get into the cloud,” Breakiron said, “you also then find that the cloud provider or an OEM like Commvault brings a capability to manage the role and access.”
You can give users access based on certain attributes and, “you can control access with automation at scale that you probably don’t have when you’re on premises,” he said.
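As a rough illustration of the attribute-based access Breakiron describes, the check below grants access only when a user's attributes satisfy a resource's requirements. The attribute names, values and policy are hypothetical, not drawn from any particular agency's or vendor's system:

```python
# Sketch of an attribute-based access check; the attributes
# (agency, clearance, classification) and the policy rule are
# hypothetical examples, not a real DoD or AWS policy.
def allowed(user: dict, resource: dict) -> bool:
    # Grant access only when the user's clearance meets the data's
    # classification level and the user belongs to the owning agency.
    return (user["clearance"] >= resource["classification"]
            and user["agency"] == resource["agency"])

user = {"agency": "DoD", "clearance": 3}
print(allowed(user, {"agency": "DoD", "classification": 2}))  # True
print(allowed(user, {"agency": "DoD", "classification": 4}))  # False
```

Because the policy is plain code rather than a manual approval step, the same check can be applied automatically to every request — the "automation at scale" advantage the panelists note.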
“Amazon does give you some tools that are inherent, cloud-native tools,” said Kevin Cronin, co-founder and president of Kelyn Technologies, a provider of data backup and recovery services. “And then there are some things that you need to bring yourself.”
Determining the correct security responsibilities requires carefully inventorying and mapping all data and tools across the organization, Cronin said. “You merge the two together, and you come up with a best-of-breed solution.”
Cloud data security has a geographical dimension, Rubal noted. AWS has standard commercial regions as well as AWS GovCloud and restricted access regions. Each has security capabilities “that are available to an agency to be able to customize to their needs.”
For each agency, this “creates an environment where not only their data is secure, based on how it’s being stored and used within the AWS environment, but also elevates their security posture in terms of how they’re managing that data in the cloud environment,” Rubal said.
The panelists agreed that it’s important for agencies to ensure their SLAs reflect what they expect from each service they buy from a cloud provider’s catalog of offerings.
Rubal said that when purchasing storage, for example, an agency can choose the level of durability it needs — up to 11 nines, or 99.999999999%, annual durability, he said. That translates to a 0.000000001% chance of loss in a given year.
“Agencies can make their services that ride on and are leveraged in the AWS infrastructure as capable as they need to be,” Rubal said.
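The arithmetic behind "11 nines" is worth spelling out. The sketch below is illustrative only (the object count is a made-up example, and real durability guarantees are statistical, not mechanical):

```python
# Illustrative arithmetic for "11 nines" durability. The 10 million
# object count is a hypothetical example, not an AWS figure.
durability = 0.99999999999              # 11 nines
annual_loss_fraction = 1 - durability   # about 1e-11

objects = 10_000_000                    # hypothetical stored-object count
expected_losses = objects * annual_loss_fraction

print(f"{annual_loss_fraction:.0e}")    # 1e-11
print(f"{expected_losses:.4f}")         # 0.0001 object per year
```

At that rate, an agency storing 10 million objects would expect to lose roughly one object every 10,000 years — which is why the panelists frame durability as a tier an agency selects against its actual requirements rather than a one-size setting.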
In managing data stored in the cloud, Breakiron said classic strategies still apply to balance costs against the need for applications to access data quickly. AWS offers six storage tiers, ranging from live, in-memory access to what he labeled glacial: long-term, offline storage for data that an agency rarely needs or must retain for records management compliance.
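In practice, that hot-to-glacial balancing is often expressed as a lifecycle policy that moves objects to cheaper tiers as they age. The sketch below builds one such rule as a plain dictionary in the shape AWS S3 lifecycle configurations use; the rule name, key prefix and day thresholds are hypothetical:

```python
# Sketch of an S3-style lifecycle rule expressing the hot-to-glacial
# tiering described above. The rule ID, "records/" prefix and the
# 90/365-day thresholds are hypothetical examples; here the rule is
# only built and inspected, not applied to a real bucket.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-cold-data",         # hypothetical rule name
            "Filter": {"Prefix": "records/"},  # hypothetical key prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 90, "StorageClass": "GLACIER"},
                {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
            ],
        }
    ]
}
print(lifecycle["Rules"][0]["Transitions"][0]["StorageClass"])  # GLACIER
```

Once such a rule is attached to a bucket, the tiering happens automatically — the classic cost-versus-access tradeoff Breakiron mentions, encoded as policy rather than manual migration.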
Cronin noted that agencies also must set point-in-time recovery parameters in the event of ransomware attacks or other breaches that could compromise data. Such attacks, he said, often occur as a series of incidents over time.
“And then the sneaky guys get in and, at another point in time, attack some other piece of data,” Cronin said. “What you need is the ability to restore data at different times.”
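The restore logic Cronin describes reduces to picking the newest recovery point that predates the suspected compromise. A minimal sketch, with hypothetical snapshot dates standing in for a real backup catalog:

```python
from datetime import datetime

# Sketch: given a series of recovery points, restore from the newest
# snapshot taken BEFORE the suspected compromise. The dates below are
# hypothetical; a real backup catalog would supply them.
snapshots = [
    datetime(2024, 1, 1),
    datetime(2024, 1, 8),
    datetime(2024, 1, 15),
    datetime(2024, 1, 22),
]
compromise = datetime(2024, 1, 17)   # when the attack is believed to have begun

safe_points = [s for s in snapshots if s < compromise]
restore_point = max(safe_points)     # newest clean copy
print(restore_point.date())          # 2024-01-15
```

Because attacks often unfold as a series of incidents, different datasets may need different restore points — hence Cronin's emphasis on keeping the full series of recovery points rather than a single latest backup.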
Finally, the panelists emphasized the way in which metadata generated by cloud storage can help an organization manage data. Metadata, for example, indicates how often users access a specific dataset. In turn, a cloud services provider can apply machine learning to the metadata to automate moving old or little-used data to a less expensive storage tier.
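As a simple rules-based stand-in for the machine-learning step Rubal describes, the sketch below maps access-frequency metadata to a storage tier. The dataset names, counts and thresholds are hypothetical:

```python
# Sketch: using access-frequency metadata to pick a storage tier — a
# rules-based stand-in for the machine-learning automation described
# above. Dataset names, counts and thresholds are hypothetical.
def pick_tier(accesses_last_90_days: int) -> str:
    if accesses_last_90_days >= 100:
        return "hot"
    if accesses_last_90_days >= 5:
        return "infrequent"
    return "glacial"

# Hypothetical metadata: dataset -> access count over the last 90 days.
metadata = {"payroll": 240, "audit-2019": 2, "case-files": 12}
tiers = {name: pick_tier(count) for name, count in metadata.items()}
print(tiers)  # {'payroll': 'hot', 'audit-2019': 'glacial', 'case-files': 'infrequent'}
```

A provider's actual automation would learn these thresholds from usage patterns rather than hard-coding them, but the effect is the same: little-used data drifts toward cheaper tiers without manual intervention.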
“The key,” Rubal said, “is efficiency and effectiveness, but also understanding how that data is being used so that it can be managed appropriately.”
To discover more cloud tips and insights, visit our Industry Exchange Cloud 2024 event page.
Copyright © 2024 Federal News Network. All rights reserved. This website is not intended for users located within the European Economic Area.
Senior Director of Strategic Initiatives for The Americas Public Sector, Commvault
Head of U.S. Federal Business Development for AWS Storage, Amazon Web Services
Co-Founder and President, Kelyn Technologies
Host, Federal Drive, Federal News Network
Tom Temin has been the host of the Federal Drive since 2006 and has been reporting on technology markets for more than 30 years. Prior to joining Federal News Network, Tom was a long-serving editor-in-chief of Government Computer News and Washington Technology magazines. Tom also contributes a regular column on government information technology.