Next in Zero Trust – Visibility and stopping the spread

How are leaders at the Department of the Interior, NASA and DHS approaching zero trust?


Date & Time

On Demand

Webinar

Cybersecurity

The zero trust journey has proven to be just that: a journey. Federal agencies continue working to reach that cybersecurity state in which no network access event is presumed trustworthy.

Zero trust, though, requires more than simply hardening resources by challenging all calls to networks, application programming interfaces and databases. It must include the means to stop the advance of breaches that do make it through; that is, preventing the “spread” of potential damage.

Zero trust also requires 100% visibility of everything on the organization’s infrastructure. Network maps often lack the necessary level of detail.

Above all, agencies now view zero trust as a state they must continually work to maintain. A crucial benefit they seek is avoidance of downtime or interruption of operations when something does occur.

“My firm belief is that zero trust is not an added-on new thing,” said Hemant Baidwan, the chief information security officer at the Department of Homeland Security. “It’s just how we need to do cybersecurity, period.”

He spoke on the most recent Federal News Network panel on progress in zero trust. Panelists emphasized the challenge posed by hybrid cloud-data center infrastructures that nearly all medium and large agencies have, and by the resource constraints under which they all operate.

Baidwan said the department’s most recent zero trust implementation guide accounts for the reality of hybrid infrastructure. He pointed out that, given the roster of large component agencies in DHS, he’s able to coordinate best practices for zero trust so they can diffuse across the department. All of this occurs under what he called “a unified DHS cybersecurity maturity model that’s not only helping us prioritize our overall capabilities, but also helping us prioritize resources.”

Federated but cooperative

Dr. Mark Stanley, enterprise cybersecurity architect at NASA, said the agency looks at its enterprise information environment as three components: the corporate business and shared services network; mission-specific, purpose-built networks; and the Jet Propulsion Laboratory, a federally funded research and development center.

Zero trust development started with the enterprise network, where risks are lower.

“Eventually we’ll be taking lessons learned from our corporate experience over to the missions,” Stanley said. “Our number one constraint is not to break a mission.”

Plus, some of the mission networks date to the 1970s, with satellites launched long ago still sending back data. That makes them a challenge to integrate into cyber monitoring.

NASA’s main zero trust challenges, Stanley said, concern data and identities.

“We have an extraordinarily complex data environment,” Stanley said. “We have data that runs the gamut from ITAR, or International Traffic in Arms Regulations, data all the way to data that, by statute, we have to share with all of humankind.”

“We have a very complex identity environment,” he added. “Many of the missions require uptime for their identity-providing capabilities of five nines, whereas on the corporate side, it’s only three nines. So we’re working towards developing a federated model for identity in which there will be a corporate identity provider that others will federate to.”

In all cases, he said, NASA strives to be “fully cognizant of who and what is on the network.”

Stanley Lowe, CISO at the Interior Department, described Interior’s infrastructure as “highly diverse and highly federated,” structured similarly to that of DHS.

He inherited a zero trust plan when taking the job a year ago. It came in handy earlier this year, when “we had a significant impact to the department’s operations because a particular vendor’s cybersecurity posture created a material risk for the department.” The result? “What we had to do was, okay, we have to turn all this stuff off. Now, what do we do?”

For one thing, the IT staff moved a test program with a secure access service edge vendor to production.

“We rolled it out to the entire department in two weeks,” Lowe said. “We’re in the throes of updating the zero trust roadmap.”

Interior is also creating a central enterprise security operations group, “which will give us consistent visibility across the department.” That visibility, he said, is complicated by the number of cloud instances in use and the ease with which development groups can spin them up.

Lowe said that earlier this year, Interior tested a product to survey all of the department’s data in its commercial clouds. The IT staff looked at “who was accessing it, what was accessing it, how it was being manipulated.” He added, “It was an amazing, eye-opening experience.” Equally eye-opening, he said, were the costs of moving cloud-hosted data.

Cloud smart = cloud challenging

Gary Barlet, public sector chief technology officer for Illumio, said agencies across the government deal with “this huge need and use of the cloud to meet mission needs, but also this understanding of the complexity of running in a hybrid environment.” He also noted the cost of cloud data movement, and how agencies must take it into account when evaluating their environments.

“The first key is visibility, and understanding how things are interrelated and how they’re touching one another,” Barlet added.

Even in highly federated situations, in some sense all networks interconnect, he said.

“It doesn’t matter if they’re on the other side of the world from each other. Because if they can talk to each other, that can be used to spread something,” Barlet said.

That’s why visibility, not only of individual assets, but also the way in which they interact, becomes crucial to building a full zero trust environment.
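As a rough illustration of that point (not any agency’s actual tooling), a short Python sketch can turn flow records into an asset-interaction map. The asset names, ports and record format below are assumptions made purely for the example:

```python
from collections import defaultdict

# Hypothetical flow records: (source asset, destination asset, destination port).
# In practice these would come from NetFlow, cloud flow logs or a CDM/EDR feed.
flows = [
    ("hr-app-01", "hr-db-01", 5432),
    ("hr-app-01", "payroll-api", 443),
    ("build-server", "hr-db-01", 5432),  # an unexpected path worth reviewing
]

def build_interaction_map(flow_records):
    """Group the destinations (and ports) each source asset has been seen talking to."""
    interactions = defaultdict(set)
    for src, dst, port in flow_records:
        interactions[src].add((dst, port))
    return interactions

if __name__ == "__main__":
    for src, peers in build_interaction_map(flows).items():
        print(f"{src} talks to: {sorted(peers)}")
```

A map like this answers the basic “who talks to whom” question that segmentation and deny-by-default policies depend on.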

Obtaining full visibility “is the million dollar question,” Lowe said. In Interior’s federated environment, each component maintains its own IT asset inventory, so at the department level he’s trying to navigate a fine line: “Hoovering everything up” and conducting incident logging under Executive Order 14028 are expensive, even more so when they require moving lots of data to and from multiple clouds.

“I want to get enough of what we need to be able to make an accurate risk assessment of what’s going on,” Lowe said.

Stanley said the “smorgasbord” of logs called for in Office of Management and Budget Memorandum M-21-31 feeds NASA’s security information and event management tools.

“The biggest challenge we’re running into, though, is our security operations analysts — how to make sense of that sea of information that’s just pouring over them every day,” Stanley said.

Especially as NASA moves towards a comprehensive view of its federated systems, it must couple visibility with artificial intelligence, he said.

“We can turn AI loose on those logs, on all that data, so that our analysts are presented with things they need to pay attention to, as opposed to just being inundated with information,” Stanley said.
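Stanley didn’t describe NASA’s tooling in detail, but the triage idea can be sketched in a few lines of Python that score log events by how rare their characteristics are and surface only the unusual ones. The field names, sample events and threshold here are assumptions for the example:

```python
from collections import Counter

# Hypothetical, simplified log events; real SIEM records carry far more fields.
events = [
    {"user": "svc-backup", "action": "login", "source": "10.0.1.5"},
    {"user": "svc-backup", "action": "login", "source": "10.0.1.5"},
    {"user": "svc-backup", "action": "login", "source": "198.51.100.7"},  # rare source
    {"user": "jdoe", "action": "file_read", "source": "10.0.2.9"},
]

def rarity_scores(log_events):
    """Score each event by how rare its (user, action, source) combination is."""
    counts = Counter((e["user"], e["action"], e["source"]) for e in log_events)
    total = len(log_events)
    return [(e, 1 - counts[(e["user"], e["action"], e["source"])] / total) for e in log_events]

def triage(log_events, threshold=0.7):
    """Return only the events rare enough to deserve an analyst's attention."""
    return [e for e, score in rarity_scores(log_events) if score >= threshold]

if __name__ == "__main__":
    for event in triage(events):
        print("review:", event)
```

Production analytics are far richer, but the goal is the same: present analysts with the handful of events worth their attention rather than the whole sea of information.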

“What worries me the most is what I don’t know about. Visibility, I feel, is really critical,” added Baidwan.

He uses the DHS Continuous Diagnostics and Mitigation (CDM) program to help with visibility at the system, network and enterprise levels.

Baidwan said the Cybersecurity and Infrastructure Security Agency “has been engaged with many partners to look at device types from an internet-of-things standpoint, and how it compares to traditional assets.”

IoT devices, he said, may not be visible to CDM, yet they are potential sources of cybersecurity flaws.

Barlet said that after total visibility comes a mapping exercise, so that interconnections not needed for a process or business function at a given moment can be shut off, in keeping with zero trust. Agencies, he said, should develop “whitelist models to allow things to communicate, and pass information back and forth, that are absolutely necessary. And everything else should be denied by default.”

Whitelisting processes and access means “you start segmenting your enterprise into smaller and smaller pieces,” Barlet said. “When you do that, if there’s a breach in one, it doesn’t necessarily mean it’s a breach in all.” The alternative is that when a breach occurs, “we’ve got to turn everything off.”

“The worst thing you can do is that self-denial-of-service,” he added. “None of us can afford to do that.”
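Barlet was speaking about segmentation policy, not application code, but the deny-by-default behavior he describes can be sketched briefly in Python. The asset names, ports and allowlist structure are illustrative assumptions:

```python
# Explicit allowlist: only these (source, destination, port) paths are permitted.
# Anything not listed is denied by default.
ALLOWED_FLOWS = {
    ("hr-app-01", "hr-db-01", 5432),    # app tier to its own database
    ("hr-app-01", "payroll-api", 443),  # app tier to a required service
}

def is_allowed(source: str, destination: str, port: int) -> bool:
    """Deny by default: permit a connection only if it is explicitly allowlisted."""
    return (source, destination, port) in ALLOWED_FLOWS

if __name__ == "__main__":
    print(is_allowed("hr-app-01", "hr-db-01", 5432))     # True: needed for the business function
    print(is_allowed("build-server", "hr-db-01", 5432))  # False: denied by default
```

In practice the same logic lives in segmentation or firewall policy rather than code, but the default is the point: anything absent from the allowlist is denied.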

Learning objectives:

  • Zero trust progress report 
  • Visibility in the threat landscape
  • The role of software interconnections
Speakers
Stanley Lowe
Chief Information Security Officer
Department of the Interior
Mark Stanley
Enterprise Cybersecurity Architect and Zero Trust Lead
NASA
Hemant Baidwan
Chief Information Security Officer
Department of Homeland Security
Gary Barlet
Public Sector Chief Technology Officer
Illumio
Tom Temin
Host, Federal Drive
Federal News Network
Please register using the form on this page.
Have questions or need help? Visit our Q&A page for answers to common questions or to reach a member of our team.