Insight by Veritas

Effective data protection requires data visibility, agency resiliency and a ransomware playbook approach

Agencies must prioritize data visibility, agency resiliency and a ransomware playbook approach to ensure effective data protection.


Cybersecurity Best Practices in the Age of Ransomware

Agencies need to have a three-to-one backup strategy. Reliable backup, protected backup, continuous backup is so important. The data is the lifeline of the agency.


Ensuring Ransomware Resiliency

Before starting down the road of protecting the data and trying to analyze it, you really need to have data visibility. That is really the first step here. And it's where many of the agencies go wrong.


Best Practices to Combat Threat of Ransomware

I think everybody is at risk of ransomware as one of the cyber attacks. We've started looking at data protection, recovery and resiliency as a key strategy to be able to actually come back.

Dealing with ransomware requires a playbook all of its own

You may have known that ransomware is on the rise, but maybe not the extent of it. According to Prem Jadhwani, the chief technology officer at Government Acquisitions Inc., the phenomenon – of hackers holding an organization’s data hostage in return for money – started in earnest several years ago. Reported incidents, he said, rose more than 150 percent just in the first half of 2021. Some indicators show ransomware attacks occurring every 11 seconds.

Jadhwani said to deal in an organized way with the ransomware threat, agencies need a playbook approach, an organized set of procedures that involve both the IT staff and frontline employees.

At a minimum, Jadhwani said, IT must have what he called a three-to-one data backup strategy, commonly known as 3-2-1. That means three copies of the data: two may be kept online but on physically separate media, while one is kept offsite and unconnected, not in the cloud.
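The 3-2-1 rule Jadhwani describes can be expressed as a simple compliance check. This is an illustrative sketch, not a Veritas tool; the copy locations and media names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    location: str   # e.g., "primary-dc", "offsite-vault" (hypothetical names)
    media: str      # e.g., "disk", "tape"
    online: bool    # True if network-connected; False if air-gapped

def satisfies_3_2_1(copies: list[BackupCopy]) -> bool:
    """True if the copies meet the 3-2-1 rule: at least three copies,
    on at least two distinct media types, with at least one offline copy."""
    enough_copies = len(copies) >= 3
    distinct_media = len({c.media for c in copies}) >= 2
    has_offline = any(not c.online for c in copies)
    return enough_copies and distinct_media and has_offline
```

A real implementation would pull this inventory from the backup catalog rather than hand-built records, but the invariant it enforces is the same.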

Second, Jadhwani said, maintain knowledge of your agency’s data assets and which are most critical, and establish clear recovery time objectives (RTO) and recovery point objectives (RPO) for each asset. These objectives are tied to how the data is used and how critical it is to a given application.
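A per-asset RTO/RPO registry of the kind Jadhwani recommends can be sketched in a few lines. The asset names, criticality tiers and numeric objectives below are hypothetical examples, not prescribed values:

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    name: str
    criticality: str   # hypothetical tiers: "high", "medium", "low"
    rto_minutes: int   # recovery time objective: max tolerable downtime
    rpo_minutes: int   # recovery point objective: max tolerable data loss

def backup_interval(asset: DataAsset) -> int:
    """A backup must run at least as often as the RPO allows, so the
    schedule interval can never exceed the RPO itself."""
    return asset.rpo_minutes

def recovery_order(assets: list[DataAsset]) -> list[str]:
    """After an incident, restore the assets with the tightest RTO first."""
    return [a.name for a in sorted(assets, key=lambda a: a.rto_minutes)]
```

Tying the backup schedule directly to the RPO keeps the objective from drifting into a paper exercise: if the RPO tightens, the backup cadence tightens with it.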

Third, don’t overlook the basic cyber hygiene step of keeping operating systems and applications up to date with the latest updates and security patches as soon as vendors publish them.

Effective data protection starts with complete data visibility

The federal government not only amasses large quantities of data, it also counts data, information and information sharing among its most important products. The constant flow of data products from places such as the Census Bureau or the Agriculture Department attests to this.

That’s one reason why agencies must have total visibility into the range of data they possess. And why, once they establish that visibility, they’ve got to ensure the security of data. Kurt Steege, the chief technology officer of ThunderCat Technology, said the growing ransomware threat coupled with the sensitivity of much federal data makes protection all the more imperative.

In one recent instance, Steege said, a Veritas-conducted structured search for an agency’s data found that 58 percent of its data was “dark” – basically invisible to officials. The data hadn’t been accessed in three years. That nevertheless represented a significant cyber vulnerability. A third of the data in the agency’s Active Directory was redundant, obsolete or trivial. At the least, that represents needless cost. Only 11 percent was in active use.

It all points to the need, Steege said, for comprehensive data management strategies and a refresh of the tiered storage approach. That starts with discovery and full visibility.

After you’ve identified your data, the next step is dividing data sets according to criticality and whether they’re needed for production, kept “warm” nearby, or are candidates for long-term storage in a facility such as Amazon Glacier.
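The tiering step described above often keys on how recently data was touched. A minimal sketch, assuming illustrative age thresholds of 30 days and one year (real cut-offs would come from agency policy):

```python
from datetime import datetime, timedelta

# Hypothetical thresholds; actual policy would set these per data class.
WARM_AFTER = timedelta(days=30)
COLD_AFTER = timedelta(days=365)

def storage_tier(last_accessed: datetime, now: datetime) -> str:
    """Assign a data set to hot production storage, nearby 'warm' storage,
    or long-term archive (e.g., Amazon Glacier) by how recently it was used."""
    age = now - last_accessed
    if age >= COLD_AFTER:
        return "archive"
    if age >= WARM_AFTER:
        return "warm"
    return "hot"
```

Note how this connects back to the dark-data finding: data untouched for three years would land squarely in the archive tier rather than sitting invisibly on production storage.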

Particularly for cyber protection purposes, Steege said, agencies need an encryption program that covers data in storage and in motion on the network. Crucial to encryption is management of the keys themselves in a logical, automated manner. This will become more important as encryption technology moves to its final frontier: the encryption of data in use by an application. A solid key management program, he said, incorporates efficient key generation and centralized management.
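The key generation and centralized management Steege describes can be illustrated with a toy registry. This sketch handles only key material and rotation, not actual encryption; the class and its behavior are assumptions for illustration, not any vendor's API:

```python
import os
import secrets

class KeyRegistry:
    """Minimal sketch of centralized key management: every key gets an ID,
    new data is always protected under the active key, and rotation keeps
    old keys retrievable so existing ciphertext can still be decrypted."""

    def __init__(self):
        self._keys: dict[str, bytes] = {}
        self.active_id = self.rotate()

    def rotate(self) -> str:
        key_id = secrets.token_hex(8)          # random key identifier
        self._keys[key_id] = os.urandom(32)    # 256-bit key material
        self.active_id = key_id
        return key_id

    def key_for(self, key_id: str) -> bytes:
        return self._keys[key_id]              # retired keys stay retrievable
```

The essential design point is that rotation never deletes old keys outright; data encrypted under a retired key must remain recoverable until it is re-encrypted or destroyed.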

Why the ransomware threat calls for agency resiliency

Organizations have basically two ways to deal with ransomware attacks. They can pay to have their data decrypted or otherwise released. At an average of $2.4 million per attack and rising, that’s becoming an ever less palatable option. Or they can act ahead of time to devise a tangible strategy they can test and rehearse. And, according to Mike Malaret, director of sales engineering for Veritas Federal, if that strategy brings real resiliency to IT, organizations including federal agencies won’t merely recover after an uncomfortably long period; they’ll barely miss a beat.

Malaret described the challenge as having the capability of data protection, threat detection, and fast recovery. He said Veritas is able to speed the detection of anomalies in the data backup ingest process by using artificial intelligence. Its process can “see” problems when the images are at rest or during a restore cycle.
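Malaret didn’t detail the detection internals, but one widely used signal for spotting anomalies during backup ingest is a jump in data entropy, since encrypted files look nearly random while ordinary documents do not. A generic sketch of that idea (the 7.5 bits-per-byte threshold is an illustrative assumption, not a Veritas parameter):

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits per byte: plain text typically sits around 4-5, while
    compressed or encrypted data approaches the maximum of 8."""
    if not data:
        return 0.0
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_encrypted(data: bytes, threshold: float = 7.5) -> bool:
    """Flag a file whose contents are suspiciously close to random --
    one possible symptom of ransomware encryption during ingest."""
    return shannon_entropy(data) > threshold
```

A production system would combine this with other signals, such as change rates and file-extension churn, since compressed media files are also legitimately high-entropy.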

Having a known, uncorrupted image on file in the cloud lets the IT staff restore operations on fresh infrastructure, if need be. Malaret said that by using continuous backup devoid of anomalies, an up-to-date and trustworthy image will be available for recovery. He added that Veritas has been investing in how to create immutable system and data images for cloud or the company’s own facilities. The service uses a Linux operating system hardened with SELinux. Malaret said the OS, because of its everything-off-by-default stance, contributes to the zero trust environment in which the immutable images are held. He added that containerizing workloads and data sets furthers cyber protection when the backup and recovery system employs certificates and authentication for the containers.

Malaret reiterated the need for maintaining three copies of data, including one offline. That, coupled with a capable backup and recovery engine and a strategy that IT officials rehearse regularly, forms the basis for assurance against ransomware.


Copyright © 2024 Federal News Network. All rights reserved.


Featured speakers

  • Prem Jadhwani

    Chief Technology Officer, Government Acquisitions Inc.

  • Kurt Steege

    Chief Technology Officer, ThunderCat Technology

  • Mike Malaret

    Director, Sales Engineering, Public Sector, Veritas

  • Tom Temin

    Host, The Federal Drive, Federal News Network