Agencies are migrating data to the cloud as a fundamental part of their IT modernization journey. But what about preparing for when disaster strikes?
Just as on premises, backup and disaster recovery must be fundamental elements of every cloud migration, said Sam O’Daniel, vice president of civilian sales at TVAR Solutions.
Agencies need backup solutions that ensure their data can be recovered in an emergency and that the backup data is protected from malicious tampering, O’Daniel said during a panel discussion at the Federal News Network Cloud Exchange 2023.
Agencies, he added, keep producing large quantities of data, which strains both their networks and the bandwidth they have available to reach on-premises environments and the cloud.
“When it comes to the new age of digital transformation and workloads, what we’ve seen is a lot of agencies are struggling with the ability to move large quantities of data through network pipes as well as being able to protect those modern workloads,” O’Daniel said.
It’s why it has become so important that, as agencies develop their hybrid cloud strategies, they make decisions about the backup solution that best meets their specific requirements, said Joye Purser, global lead for field cybersecurity for Veritas Technologies.
“When it comes to backing up data, you can do a full backup, which is the complete copy of all of the data, or you can do an incremental backup, in which you back up some of the data, but not all of it,” Purser said.
The full backup process can be time-intensive and require a lot of compute power. “That’s why agencies need to understand what data should be backed up versus archived, and they also should consider retention policies based upon regulations such as patient data, financial data or other sensitive information. Such details will drive backup and retention decisions,” she advised.
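As a rough illustration of the two approaches Purser describes — and not any particular product's implementation — a full pass copies everything, while an incremental pass might use file modification times to copy only what changed since the last run:

```python
import shutil
from pathlib import Path

def full_backup(source: Path, dest: Path) -> list[Path]:
    """Copy every file: the complete copy of all of the data."""
    copied = []
    for f in source.rglob("*"):
        if f.is_file():
            target = dest / f.relative_to(source)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves modification times
            copied.append(f)
    return copied

def incremental_backup(source: Path, dest: Path) -> list[Path]:
    """Copy only files newer than their existing backup copy."""
    copied = []
    for f in source.rglob("*"):
        if not f.is_file():
            continue
        target = dest / f.relative_to(source)
        if target.exists() and target.stat().st_mtime >= f.stat().st_mtime:
            continue  # unchanged since the last backup, so skip it
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(f, target)
        copied.append(f)
    return copied
```

This is why incremental runs finish faster: the copy loop skips everything whose timestamp says it is already backed up.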
“It takes a lot more time to develop a process and effectively back up and restore complex or older datasets, especially for large agencies such as the Department of Defense, with extensive military data,” Purser continued. “Even smaller agencies like the National Science Foundation have extensive research data that must be protected.”
Make use of tools to simplify backup and recovery
Agencies have tools available to simplify and streamline their backup and recovery work.
Deduplication tools, for example, identify which data has and has not changed since the last full or incremental backup, so unchanged data is never copied twice.
“Deduplication enables the backup system to not have to back up things that hadn’t been changed, basically. And so it really decreases the time and allows for a much more rapid backup process,” Purser said.
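The core idea Purser describes can be sketched in a few lines: fingerprint each chunk of data with a cryptographic hash and store only chunks whose digest hasn't been seen before. The fixed chunks and in-memory store here are purely illustrative; real deduplication engines are far more sophisticated:

```python
import hashlib

def deduplicate(chunks: list[bytes], store: dict[str, bytes]) -> list[str]:
    """Store each chunk once, keyed by its SHA-256 digest; return the
    ordered list of digests that reconstructs the original stream."""
    manifest = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:   # only new, unseen data is written
            store[digest] = chunk
        manifest.append(digest)
    return manifest

def restore(manifest: list[str], store: dict[str, bytes]) -> bytes:
    """Reassemble the original data from the digest manifest."""
    return b"".join(store[d] for d in manifest)
```

Because repeated chunks are written only once, the time and storage the backup consumes shrink with every duplicate the system finds.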
Data compression is another way to back up data more quickly.
When paired, compression and deduplication can assist with the transition to the cloud by reducing both the time for backups as well as overall network bandwidth costs that agencies experience when moving to cloud environments, O’Daniel said.
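A toy sketch of that combined effect, deduplicating repeated chunks and then compressing what remains, assuming zlib as the compressor (any compression scheme would do):

```python
import hashlib
import zlib

def compress_and_dedupe(chunks: list[bytes]) -> dict[str, bytes]:
    """Keep one zlib-compressed copy of each unique chunk."""
    store = {}
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(chunk)
    return store

# Highly repetitive data, as backup streams often are
chunks = [b"config " * 100, b"log line " * 100, b"config " * 100]
raw_bytes = sum(len(c) for c in chunks)
sent_bytes = sum(len(c) for c in compress_and_dedupe(chunks).values())
print(f"{raw_bytes} bytes raw, {sent_bytes} bytes after dedupe + compression")
```

The fewer bytes that cross the wire, the lower the backup window and the bandwidth bill, which is the pairing O’Daniel points to for cloud transitions.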
A backup and recovery strategy is also essential for staying ahead of emerging threats, including ransomware, he added. “Bad actors are continuously trying to find ways to compromise datasets, whether it be through phishing attacks or other ways to access government systems.”
Agencies, he added, also need to plan for contingencies as part of full disaster recovery, such as ensuring data is secure and available at a secondary, offsite location.
“A good ransomware strategy is very important and ensuring that you have a platform that enables data immutability and the ability for data not only to be backed up, but to understand whether or not it’s been tampered with — and also ensure that once it is backed up, that it cannot be tampered with,” O’Daniel said.
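Immutability itself is a storage-platform property (write-once media or object locks), but the tamper-detection half of what O’Daniel describes is commonly built on cryptographic digests. A minimal illustration, assuming the recorded digest is kept somewhere the backup data itself cannot reach:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Digest recorded at backup time, stored separately and
    write-protected so an attacker cannot update it."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, recorded_digest: str) -> bool:
    """Any post-backup modification changes the digest, so a
    mismatch signals the backup has been tampered with."""
    return hashlib.sha256(data).hexdigest() == recorded_digest
```

Checking digests before restoring is what lets an agency know whether a backup is still trustworthy, not merely present.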
Develop backup, DR strategy based on agency’s unique data uses
Purser, a former member of the Senior Executive Service at the Cybersecurity and Infrastructure Security Agency, pointed out that agencies must make developing consistent backup plans part of their cybersecurity due diligence.
“I consistently advised critical infrastructure owners that they need to be doing a tabletop exercise to understand what they would do in a crisis, and that should include restoring their data from backups,” she said.
“They need to test that backup system and process, so that when things go down — especially if they’re an entity that is responsible for critical infrastructure, such as banking or health care — they must assess the backup systems to ensure that the operating technology and other critical systems are able to be brought back online,” she added.
Agencies also need to understand the specific datasets they want to back up and any data retention policies they need to follow.
“Whenever we think about data retention, protection, backup and restoration, you have to consider the people, the processes and the technologies that are in place in order to achieve those goals,” Purser said.
What’s more, agencies need to thoroughly categorize their data based on its specific components and restrictions, she said. For example, Purser noted personally identifiable information and classified information as categories of data with unique restrictions that agencies must accommodate in their backup and disaster recovery strategies.
“Those are all questions that software and IT professionals need to consider when backing up data,” Purser said. “There’s no way around the due diligence — and due care — required by the owners and managers of the data to determine what is the data that must be preserved or backed up. Lives may depend on it.”