Now that turning back time is no longer a viable option, agencies need to move to a data protection model that’s built around a single source of truth, recommend...
For decades, IT departments have built their data backup strategies around two metrics: recovery point objective and recovery time objective, or RPO and RTO. The idea is that you can roll back a system to just before an interruption and restart normally, with minimal data loss and disruption.
That approach, of course, presupposes an acceptable level of data or transaction loss over a prescribed time period.
But in the age of ransomware and sophisticated phishing used to launch attacks, and with data distributed across hybrid environments, entire databases are at risk, said Aaron Lewis, vice president of U.S. public sector sales and engineering at Rubrik. Therefore, while RPO and RTO remain important specifications for designing backup systems, it’s time to expand on them by taking a more comprehensive view of data protection.
“Traditionally, organizations really looked at two things,” Lewis said during Federal News Network’s Industry Exchange Cyber. “How much data am I willing to lose? That’s RPO. And how long can I wait until I get my data back? The recovery time objective. That just doesn’t encompass the whole picture.”
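To make those two metrics concrete, here is a minimal sketch, in Python, of how RPO and RTO are typically measured against their targets. The timestamps and target values are illustrative assumptions, not figures from the discussion.

```python
from datetime import datetime, timedelta

# Hypothetical illustration of how RPO and RTO are measured in practice.
# All times below are made-up values for the example.

last_good_backup = datetime(2024, 6, 1, 8, 0)    # most recent usable restore point
incident_time    = datetime(2024, 6, 1, 8, 45)   # when the outage or attack hit
service_restored = datetime(2024, 6, 1, 10, 15)  # when the system was back online

actual_data_loss = incident_time - last_good_backup  # data written after the backup is gone
actual_downtime  = service_restored - incident_time  # how long users waited for recovery

rpo_target = timedelta(minutes=15)  # "how much data am I willing to lose?"
rto_target = timedelta(hours=1)     # "how long can I wait until I get my data back?"

print(f"Data loss window: {actual_data_loss} (target {rpo_target}) -> "
      f"{'met' if actual_data_loss <= rpo_target else 'missed'}")
print(f"Downtime:         {actual_downtime} (target {rto_target}) -> "
      f"{'met' if actual_downtime <= rto_target else 'missed'}")
```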
Not in an age when cyber espionage and wholesale data theft are the primary objectives of malicious actors, he said. IT and cyber teams must ask themselves whether they can recover 100% of their organizations’ data and, once they have recovered it, ensure its integrity.
This capability starts with what Lewis called data observability, having a complete picture of the data that needs to be protected. Observability augments the traditional cybersecurity practice of “let’s protect the moat.” Without observability, Lewis said, IT teams often find themselves rolling systems further and further back in time, seeking a safe spot at which to restart. Suddenly, a 10-minute RTO ends up being hours, even days, he said.
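The dynamic Lewis describes, stepping backward snapshot by snapshot until a clean restore point turns up, can be sketched roughly as follows. The snapshot schedule and the is_clean() check are hypothetical stand-ins for whatever integrity signal an observability layer would provide.

```python
from datetime import datetime

# Hypothetical sketch of the "rolling further back" problem: without
# observability, teams step backward through snapshots until one is not
# compromised, and each step widens the effective data loss window.

snapshots = [datetime(2024, 6, 1, h, 0) for h in range(0, 9)]  # hourly snapshots, oldest first
compromised_after = datetime(2024, 6, 1, 5, 30)                # attacker activity began here (assumed)

def is_clean(snapshot_time: datetime) -> bool:
    """Stand-in for an integrity/observability check on a snapshot."""
    return snapshot_time < compromised_after

incident_time = datetime(2024, 6, 1, 8, 45)

# Walk backward from the newest snapshot until one passes the check.
restore_point = next(s for s in reversed(snapshots) if is_clean(s))
effective_data_loss = incident_time - restore_point

print(f"Restoring from {restore_point}; effective data loss window: {effective_data_loss}")
```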
To get to total data protection, Lewis recommends that IT and cyber staffs do two things: consolidate onto a single, shared data protection platform and automate how new workloads come under its policies.
A single, shared platform helps the people interacting with it build proficiency because they no longer have to juggle multiple tools, Lewis said.
“What’s important is a solution built from the beginning, with data security in mind, and an immutable data storage platform with observability built into the system,” Lewis said.
What’s more, that single, data-centric protection platform is best acquired as a cloud-hosted, software-as-a-service product, he added.
SaaS applications “have the ability to reach in not only to physical, on-premises applications and workloads but also manage cloud workloads as well by their very nature of also being in the cloud,” Lewis said.
A single-platform solution must also include automation for adding new workloads and data sets. Lewis recalled an incident from his time as a junior IT backup and recovery administrator, when he overlooked a server that later became corrupted.
“I realized I had not added it manually to the protection policy,” he recalled. Today, administrators need to designate workloads for protection by type so that their security management automatically aligns to the organization’s protection policies. “As new workloads get added, I think that’s absolutely crucial,” Lewis said.
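As a rough illustration of that type-based approach, the sketch below maps newly discovered workloads to protection policies by category rather than by manual enrollment. The policy names, workload types and Workload class are assumptions made for the example, not any particular product’s API.

```python
from dataclasses import dataclass

# Hypothetical sketch of type-based auto-protection: new workloads are
# classified by type, and a matching protection policy is applied
# automatically instead of relying on a manual checklist.

POLICY_BY_TYPE = {
    "database":    "gold-15min-rpo",   # tightest recovery objectives
    "file-server": "silver-4hr-rpo",
    "vm":          "bronze-24hr-rpo",
}

@dataclass
class Workload:
    name: str
    type: str

def auto_assign_policy(workload: Workload) -> str:
    """Map a newly discovered workload to a protection policy by its type."""
    try:
        return POLICY_BY_TYPE[workload.type]
    except KeyError:
        # Unknown types are flagged for review rather than silently left
        # unprotected, avoiding the forgotten-server scenario from the article.
        return "quarantine-review"

for wl in [Workload("hr-db-01", "database"), Workload("legacy-app-07", "appliance")]:
    print(wl.name, "->", auto_assign_policy(wl))
```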
Learn about more security best practices and technologies shared during Federal News Network’s Industry Exchange Cyber.
Vice President of U.S. Public Sector Sales and Engineering, Rubrik
Aaron Lewis is the VP of Sales Engineering for U.S. Public Sector at Rubrik. For most of his 20-plus years in technology, he has focused on innovative data center technologies, most recently data protection solutions. As a former IT director, he brings unique experience to helping customers solve the challenging data security problems of government agencies.
Host, The Federal Drive, Federal News Network
Tom Temin has been the host of the Federal Drive since 2006 and has been reporting on technology markets for more than 30 years. Prior to joining Federal News Network, Tom was a long-serving editor-in-chief of Government Computer News and Washington Technology magazines. Tom also contributes a regular column on government information technology.