How data resiliency, zero trust will underpin the future of federal cyber

Data is the lifeblood of any organization.

Research firm IDC predicts the world’s data will grow to 175 zettabytes by 2025 from 33 zettabytes in 2018, a compounded annual growth rate the firm puts at 61 percent. To put some perspective around that number: a zettabyte is a trillion gigabytes, so 175 zettabytes is 175 trillion gigabytes.
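To make the unit conversion concrete, here is a quick sketch; the growth figures above are IDC’s, and the conversion itself is simple arithmetic:

```python
# A zettabyte is a trillion (10**12) gigabytes, so converting is one multiplication.
GIGABYTES_PER_ZETTABYTE = 10**12

def zettabytes_to_gigabytes(zb: float) -> float:
    """Convert zettabytes to gigabytes."""
    return zb * GIGABYTES_PER_ZETTABYTE

print(f"{zettabytes_to_gigabytes(175):.2e} GB")  # 175 ZB = 1.75e+14 GB
```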

The expectation of data growth across the federal government is no different. There are more than 247,000 data sets on data.gov. That’s just one measure.

Data from nearly every agency, from the Census Bureau to NOAA to the Agriculture Department, is growing every day, especially as the use of Internet of Things devices continues to grow.

Agencies’ ability to use that data to run missions, serve citizens and protect the nation depends on its security and resiliency.

There’s a lot of discussion about security, but resiliency is a topic that probably doesn’t get enough attention.

Many times when agencies talk resiliency, they are thinking continuity of operations (COOP).

For example, the Office of Management and Budget’s June 2019 data center memo talks about the resiliency of data centers and ensuring agencies know about problems with cooling or power.

But data resiliency for this discussion means something much different. Agencies need to consider an enterprise data management strategy that includes a command and control approach of sorts. And as agencies move into a hybrid IT environment where some applications and data are in the cloud and some remain on premises, the concept of always having access to your data becomes more important.

John DeSimone, the vice president for Cybersecurity and Special Missions at Raytheon, said agencies have to ask themselves, can their data—not just their systems—survive an attack?

He said data resiliency incorporates continuity of operations and disaster recovery, but it’s more than that.

“It also is a shift toward monitoring, doing the orchestration, understanding the collection, the creation and the classification of our data and being able to watch the flow through your organization,” DeSimone said on the program Every Side of Cyber: Data Protection with Zero Trust. “Much like today, where security systems tend to watch network traffic and network flow, we need to move to an environment where we are watching the data flow and understand the impacts that different systems and users have on that data.”

DeSimone said there is a growing understanding across the government about why data resiliency is important, but agencies are inconsistent in applying the tools and techniques.

This concept of taking a more active role in understanding, managing and ensuring your data is not just secure but resilient too becomes more important as agencies continue to live in a hybrid IT environment where some systems and data will be in the cloud and others will remain on premises.

DeSimone said the cloud makes data resiliency easier in some respects, but it comes down to the need for orchestration, or in military terms, agencies need a command and control perspective.

“Orchestration is the key. Having the tools and being able to encrypt obviously are going to be critical. Being able to classify is going to be critical. But tying data together in a command and control environment and orchestrating it across the enterprise is really the next step,” he said. “Orchestration is not a new approach, but what you would need to have is an architecture in your environment to provide zero trust so you understand at each step the integrity of the data, whether it’s being run on the appropriate servers or whether it’s being run by the appropriate people so you need to have bindings of users to equipment and equipment to data.”
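DeSimone’s idea of binding users to equipment and equipment to data can be pictured as a pair of lookup structures that are both checked before any operation proceeds. This is an illustrative sketch only, not a description of any agency’s or vendor’s actual design; all names here are hypothetical:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: two explicit bindings, user-to-equipment and
# equipment-to-data, both of which must hold at every step.
@dataclass
class DataAsset:
    name: str
    allowed_servers: set = field(default_factory=set)  # equipment-to-data binding

@dataclass
class Server:
    hostname: str
    allowed_users: set = field(default_factory=set)    # user-to-equipment binding

def may_process(user: str, server: Server, asset: DataAsset) -> bool:
    """Allow the operation only if both bindings check out."""
    return user in server.allowed_users and server.hostname in asset.allowed_servers

payroll = DataAsset("payroll", allowed_servers={"hr-app-01"})
hr_box = Server("hr-app-01", allowed_users={"alice"})
print(may_process("alice", hr_box, payroll))  # True: both bindings hold
print(may_process("bob", hr_box, payroll))    # False: bob is not bound to this server
```

The point of the structure is that trust is never implied: the data itself carries a record of where it may run, and the equipment carries a record of who may use it.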

DeSimone said once agencies have a fuller understanding of their data, then they can put tools in place to visualize the information to make better decisions.

Data resiliency is another piece of the zero trust puzzle that agencies have to figure out and fit into the framework.

DeSimone said agencies should add verification of the data and user at each step as part of creating a resilient and zero trust environment.

“You will use a set of dynamic variables to identify the trust, the user, the time, the work patterns, the type of information they are looking to correlate to say, ‘should this action be taken and is it valid?’ If it is, let it pass, and if not, you can stop it right there,” he said. “You get a higher degree of an ability to protect yourself. That goes back to being able to collect the information from the sensors that are in place today and add an orchestration layer around the data to make use of those, monitor and apply it to how data flows.”
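The dynamic-variable check DeSimone describes might look roughly like the following. Every variable name and threshold here is a hypothetical illustration of correlating several signals before letting an action pass:

```python
from datetime import datetime

# Hypothetical zero trust gate: correlate several dynamic variables
# (time, work pattern, type of action) and pass only if all checks agree.
def action_is_valid(action: str, when: datetime,
                    usual_hours: range, usual_actions: set) -> bool:
    checks = [
        when.hour in usual_hours,   # time fits the user's normal work pattern
        action in usual_actions,    # action fits the data this user normally touches
    ]
    return all(checks)

# Within working hours and a usual action: let it pass.
print(action_is_valid("read:payroll", datetime(2024, 3, 4, 10),
                      usual_hours=range(8, 18),
                      usual_actions={"read:payroll"}))  # True

# Same action at 2 a.m.: stop it right there.
print(action_is_valid("read:payroll", datetime(2024, 3, 4, 2),
                      usual_hours=range(8, 18),
                      usual_actions={"read:payroll"}))  # False
```

In practice the variables would come from the sensors already in place, with the orchestration layer applying the decision to the data flow rather than to a single endpoint.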

DeSimone said the zero trust approach will help agencies stop “chasing the endpoints,” which is an unwinnable battle, especially with the Internet of Things continuing to grow.

“As you’re rolling out a zero trust approach and an architecture that aligns with that, you can see yourself moving more toward the data side or the enterprise side,” he said. “You still will need end-point protection, but not to the degree that you have today because you are comfortable with the risk profile at the back end.”

Copyright © 2024 Federal News Network. All rights reserved.