Don’t start playing a dirge for the 16-year-old cybersecurity program known as EINSTEIN just yet. But with the release of the draft Trusted Internet Connections 3.0 implementation guidance, industry experts agree the end is near for the long-running intrusion detection and intrusion prevention program, whose value has sometimes been questioned.
“Today’s concept of EINSTEIN is going away. It kind of has to happen,” Stephen Kovac, vice president of global government and corporate compliance at Zscaler, said in an interview. “TIC 3.0 isn’t here to kill EINSTEIN, but to decouple it from TIC. I think some form of EINSTEIN will still need to exist. Agencies and DHS still need to collect telemetry data.”
Kovac said what many federal chief information security officers and chief information officers have said over the years: “EINSTEIN today is not providing very useful data.”
Kovac said Zscaler collects 93 data fields through its sensors, while EINSTEIN is focused mainly on netflow data and blocking known threats and signatures.
Susie Adams, the chief technology officer for federal at Microsoft, said the draft guidance from DHS makes it clear that EINSTEIN’s shelf life is limited, even if it doesn’t specifically say that.
“The existing TIC architecture that’s in place is to protect agency networks as they were developed over the last 20 years,” Adams said in an interview. “But for the cloud, it looks like they are trying to go to the right place by storing data in the cloud and using machine learning or advanced analytics to understand what’s going on. This is why the traditional EINSTEIN will only exist for agency traffic coming out of their own network. I think DHS is trying to evolve EINSTEIN as well.”
The need to update TIC, and thus move away from EINSTEIN, became clear as agencies suffered from latency and other delays when integrating cloud services with these security tools and architectures.
Kovac, Adams and other federal cyber experts said DHS’s five draft guidance documents for implementing TIC 3.0 are well thought out and well-constructed, giving agencies a less prescriptive and more flexible approach to securing data and using the cloud. Comments on the draft documents are due Jan. 31.
The guidance follows the updated memo the Office of Management and Budget released in September.
“Through the new guidance, agencies now can understand what risks they are trying to mitigate, what services they are trying to use and then the steps for how they can do it,” said Josh Moses, a former chief of the cyber and national security branch in the office of the Federal CIO. “I do think this makes moving to the cloud easier. The reference architecture shows that there now are many roads that lead to Rome versus the one or two ways under the previous TIC architectures. This new TIC architecture is much more flexible in the way agencies can access the internet as well as from a security and cost perspective. It frees up agencies to make better risk-informed decisions.”
That has been the goal of many of OMB’s updated policies. Experts say the decision by federal leaders to have an “assume breach” mentality instead of a “protect everything” approach is clear in the TIC documents.
Cloud bottleneck should be gone
DHS isn’t so much telling agencies what to do as what outcomes they should aim to achieve.
“Detection is the most important piece of this. If you assume you’ve been breached, then you need to spend time on detection and automating that detection, and these new TIC documents are a step in the right direction. It’s part of the zero trust framework,” Adams said. “The bad thing about not being prescriptive, like TIC 2.0 was, is it leaves a lot of things for agencies to decide, and that could cause things to slow down because there may not be agreement on security control implementation and risk posture for the data and where it’s stored. We are hoping that being more subjective in how you meet TIC will provide more leeway for agencies and not inhibit cloud adoption.”
Adams said the new TIC approach removes the choke point that was EINSTEIN and the managed trusted internet protocol services (MTIPS).
“Agencies now can define their own path to secure their internet connections, and that is huge,” she said. “It gets rid of the bottleneck.”
Kovac said the move away from MTIPS may be difficult for some agencies, and especially the telecommunications providers, because they have used it for so long and are comfortable with the security services. He estimates agencies spend about $1 billion a year on MTIPS.
Ross Nodurft, another former chief of OMB’s cyber branch and now a senior director for cybersecurity services at Venable, praised OMB and DHS’s work on TIC 3.0, but said the one thing that is missing is an incentive to move to the new architectures.
“The documents make assumptions that agencies already are motivated to adopt these new technologies and want to move to a new architecture, but what is motivating them to adopt these new tools? What is the driving force?” he said. “The TIC memo rescinded the other TIC requirements and gives agencies the ability to build out a TIC architecture with more of a risk-based view. But why make the change unless you have a reason to? What are the drivers for agencies who are using MTIPS or another approach and not having any problems? I would like to see a more active solicitation of pilots to show why moving to 3.0 is worthwhile.”
Still need to connect programmatic dots
In the draft documents, DHS highlights two use cases, but also tells agencies how to develop and submit plans for additional proofs of concept.
Nodurft said he’d like to see vendors take a more aggressive role in developing use cases, which could help be a driving force to modernize TIC architectures.
Moses, the other former OMB cyber chief, said he’d like to see DHS and the federal CIO’s office clarify how all of the current cyber programs — like TIC, continuous diagnostics and mitigation (CDM), high-value assets and the Federal Information Security Management Act (FISMA) — fit together and what benefits agencies are receiving from them all.
“How can agencies get to good, reduce their compliance burden and how do all of these controls come together and make a difference to secure agency systems and data?” he said.
Kovac said there is a lot of pent-up demand for a more flexible approach to TIC. He said several agencies have prepared, or are preparing, TIC 3.0 use cases to begin to move away from the current approach.
“I think the vision of this will be a catalog of use cases,” he said. “The remote worker, the traditional worker, the international user and the bring-your-own-device user. There are 5-to-10 solid use cases so people can find what they want to accomplish.”