The “Unsafe at Any Speed” moment has come for cybersecurity.
For those who don’t remember, Ralph Nader’s seminal 1965 book forced Congress and the executive branch to take action (for some, too much action) to ensure standard safety measures for cars.
Time magazine said in a 2015 article marking the book’s 50th anniversary that Nader’s research was the catalyst for President Lyndon Johnson to sign “two auto-safety bills into law and established the National Traffic Safety Agency, despite the auto industry’s outspoken desire to regulate itself. The new laws addressed a wide range of problems from safety codes and vehicle inspection to highway design and driver education.”
The New York Times said in a similar 2015 article that the book’s “sharp-edged theme … was that the auto industry was ignoring ‘moral imperatives’ to make people safer.”
Consider the Cyberspace Solarium Commission’s report the most recent version of “Unsafe at Any [Network] Speed.”
The final report, which the commission released in early March, outlines 75 recommendations to improve the federal government’s response to a major cyber-attack.
Among the recommendations that could finally change the current and future view of cybersecurity, and make government, industry and even consumers truly grasp its importance, is the one found in Section 4.3: “Congress should establish a Bureau of Cyber Statistics charged with collecting and providing statistical data on cybersecurity and the cyber ecosystem to inform policymaking and government programs.”
The bureau, much like the National Transportation Safety Board or the Bureau of Labor Statistics, would arm organizations and citizens with independent data to decide which investments make the most sense based on risk.
“This idea has been talked about for several years now, but what was gnawing at me is if we do what we think is adopting cyber best practices and latest technology, how do we know that we will be much more cyber secure?” said Rep. Jim Langevin, D-R.I., in an interview with Federal News Network. “The reality is we would be better served by having hard data to make informed policy decisions or for CEOs and [chief financial officers] and chief information security officers when making recommendations or decisions about what type of cyber technology to purchase and deploy. It just makes sense.”
While Nader’s book led to more stringent regulations of cars, the commission isn’t calling for more regulation. It’s calling for a better understanding through data of what works and what doesn’t.
The commission said the confusion and lack of independent data “limits the ability of the government to evaluate the effectiveness of its cybersecurity programs and prevents private enterprises and insurance providers from being able to adequately price, model, and understand cyber risk. Existing data sets are incomplete and provide only a superficial or cursory understanding of evolving trends in cybersecurity and cyberspace.”
Creating such a bureau, which Langevin said would most likely take an act of Congress to ensure funding and staffing, would help answer a question many CEOs, CFOs and other non-technologists have asked over the years, one that amounts to proving something didn’t happen: “What did we get from this cybersecurity investment?”
Answering that question has been difficult for many chief information officers and CISOs.
Gus Hunt, a former chief technology officer for the CIA and now the managing director and cyber strategy lead for Accenture Federal Services, said that while the numerous cyber breaches — whether at the Office of Personnel Management, Target or JPMorgan Chase — have made the conversation between technology and non-technology executives easier, hard data would further win over the doubters.
“What this will do for the CIO/CISO communities and the CFO/[chief operations officer] communities is make it easier to convince that this must be a continuing investment. It is essential for security and the well-being of your organization,” Hunt said. “It can help demonstrate this through statistics by having government-backed data, which enables organizations to have a conversation that then provides the basis to make some informed decision. They can see what they are worried about and then make investments to deal with the problems more effectively.”
The commission said the Bureau of Cyber Statistics could live in the Commerce Department or another agency, and would inform national risk assessments, help the insurance industry create more accurate risk models and help federal agencies craft more effective cybersecurity policy and programs.
Wyatt Hoffman, who co-authored a paper on cyber insurance for the Carnegie Endowment for International Peace in 2018 and is now a research fellow at Georgetown’s Center for Security and Emerging Technology, said that as more organizations buy cyber insurance — Adroit Market Research found in a recent report that the market will grow from about $4 billion in 2017 to more than $23 billion by 2025 — the need for better data and metrics will only grow.
“It’s a problem for insurers who are trying to model and quantify cyber risk to understand their exposure and determine premiums, and it’s a problem on a national scale,” Hoffman said. “It’s also a problem for companies trying to figure out which products to choose from to fit their particular risk profile.”
Hoffman said the bureau could help with risk aggregation and creating the macro picture that would help insurers understand specific industry sectors and what steps companies could take to make them more secure.
“In a lot of instances like Equifax or Home Depot or Target, it’s difficult to look at those to figure out what you need to be doing to address your particular situation. We need a better picture of if you were in this industry with this profile, here is exposure you are likely to face,” he said. “There is a lot of data that exists already, but it’s widely dispersed across the private sector. Insurers have data, but there is not a lot of sharing or a lot of aggregation of data at a broad level. There are different information sharing efforts in different sectors, but not a single repository. That’s what you need to develop effective models to create a more robust picture, which is the idea behind this bureau.”
Accenture Federal’s Hunt added there is a definite desire in the cybersecurity market for data. He pointed to what seems to be a never-ending release of reports by cybersecurity and consulting companies analyzing cyber data. But a government-based unit with sufficient legislative backing would bring together a much broader perspective, provide early warning signs and make sharing more secure.
“The biggest issue the bureau would have to overcome is what would it take for government data to become the trusted source around cyber? How do you provide actionable data?” he said. “The Solarium report is asking Congress to establish safeguards against punitive measures for sharing and those sorts of actions would go a long way to mute concerns over sharing and get folks to participate and engage more effectively.”
Langevin said the creation of the bureau is one of the short-to-medium term recommendations from the commission.
“Hopefully we will be past this coronavirus crisis sooner than later and once we do get back we will have hearings and introduce legislation,” he said. “We can’t understand what we can’t measure and there is currently a dearth of robust and consistent data. Through this bureau, the data could be analyzed and made publicly available to see, and from there, we would have a better idea of what we can do to get us to be better secure.”
The question now is whether Congress and the administration will see the report as a “seminal call” to get ahead of cyberattacks, or whether the document will get thrown on the pile with the many that came before it.