An existing solution to the government’s cyber problems


The cyber situation “is not getting demonstrably better over time and will have a debilitating long-term effect on both the economic and national security interests of the United States.”

Ron Ross, NIST

“Every (indeed every) technical approach to the cybersecurity crisis, to date, has come up short due to the failure to understand” that security must be baked in, not bolted on.

Bob Bigman, former CIA CISO


Stunning and depressing words from two cybersecurity leaders in the federal community, both of whom presented Aug. 23 to the Commission on Enhancing National Cybersecurity.

Ron Ross, a fellow at the National Institute of Standards and Technology, and Bob Bigman, a retired CIA chief information security officer and now president of 2BSecure, a consulting firm, dropped these warnings on the 12 individuals expected to deliver recommendations to President Barack Obama, Congress and the next President later this fall.

The President named these experts in April, and the commission is part of the Cybersecurity National Action Plan (CNAP). The White House issued the CNAP along with its request for a $19 billion increase in cyber spending in fiscal 2017.

The fact that Ross and Bigman went before the commission and offered such striking and honest assessments of public and private sector cyber efforts surely is a sign of frustration.

“Our fundamental cybersecurity problems today can be summed up in three words — too much complexity. Put another way, you cannot protect that which you do not understand,” Ross told the commission. “Adversaries view the U.S. critical infrastructure and our thriving businesses and industry as a target of opportunity, each adversary with potentially different capabilities and intentions. Increased complexity translates to increased attack surface. This provides a limitless opportunity for adversaries to exploit vulnerabilities resulting from inherent weaknesses in the software, firmware and hardware components of the underlying systems and networks.”


Ross has been working in the federal cybersecurity community for more than 30 years, and leads NIST’s Federal Information Security Modernization Act (FISMA) implementation project, the Joint Task Force Cybersecurity Project and the Systems Security Engineering Initiative.

Bigman, who worked at the CIA for more than 30 years, brings insights the average cyber expert doesn’t have.

So when both stood before the commission, it made sense for industry and government alike to give credence to their proposed solutions to the ever-growing cyber threats.

Both Bigman and Ross say the answer to better and long-term security is trustworthy computing.

Bigman said as far back as 1983, the Defense Department and the Intelligence Community recognized the only way to stop cyber attacks was by “establishing principles, rules and, eventually, technical and programmatic requirements for building security and trust into contemporary computing systems.”

The end result was something called the “Orange Book” — the Department of Defense Trusted Computer System Evaluation Criteria.

Bigman said DoD followed the Orange Book with an additional series of guides known as the “Rainbow Series,” which focused on creating evaluation criteria for networks, supply chain and application security.

“While the Rainbow series of trusted computer evaluation guides were written (largely) by inside-the-Beltway people, for use by inside the Beltway organizations, many real systems were actually built by industry and at the higher levels of trust were indeed ‘unhackable,’” Bigman told the commission. “While these systems mostly used proprietary firmware and operating systems (thus, why they failed in the commercial market), they succeeded in demonstrating that computing platforms can be built with high levels of security and trust and can dramatically raise the bar for would-be hackers, including sophisticated hackers. Their legacy lives on today in the implementation of many contemporary high-security products like SE/Linux, Trusted Solaris, the Blackberry phone and even the Apple IOS security architecture.”

Bigman said applying these same concepts to the commercial market is very much possible because of the decades of experience and limited examples of trusted code.

Ross took the comparison one step further, telling the commission that the same engineering discipline that assures the country its bridges and airplanes are safe can be applied to technology hardware and software.

“Security, much like safety, reliability, and resilience, is an emergent property of a system that does not happen by accident. The disciplined and structured approach that characterizes engineering-based solutions is driven by mission and business objectives and stakeholder protection needs and security requirements,” he said. “Those highly-assured and trustworthy solutions may not be appropriate in every situation, but they should be available to those entities that are critical to the economic and national security interests of the United States — including, for example, the electric grid, manufacturing facilities, financial institutions, transportation vehicles, water treatment plants and weapons systems.”

Ross offered nine recommendations that the federal government could implement over the next decade.

Many of his suggestions resemble the approach the Office of Management and Budget took with the 2015 cyber sprint, such as conducting an asset valuation to determine the impact on the agency if it were to lose specific data and systems, and putting more data and applications in the cloud as a way to reduce the attack surface and network complexity.

But the one suggestion from Ross around trustworthy computing hasn’t been fully taken up by the White House. OMB Circular A-130 highlights the importance of supply chain risk management, but a pilot to establish a governmentwide set of risk indicators for agencies to use when researching companies has yet to really get off the ground.

According to a 2015 request for quote, the General Services Administration wanted a vendor to provide, “risk research analysis and assessment about selected companies, federal contractors, and/or subcontractors through the analysis of public record, publicly available information, and commercially available data about the contractors. The theory behind due diligence holds that performing this type of investigation contributes significantly to informed decision making by enhancing the amount and quality of information available to decision makers and by ensuring that this information is systematically used to deliberate in a reflexive manner on the decision at hand and all its costs, benefits, and risks. The primary goal of this requirement is to provide risk research on federal contractors and deliverables.”

The discussion about why the pilot hasn’t taken off is for another time.

The larger point here is that Ross and Bigman said a significant answer to the country’s cyber problem is not only achievable, but already proven.

Let’s hope the commission and White House back these concepts with real action and not just another report.


