FedRAMP: Moving beyond basic first aid to surgical interventions

Jason Weiss, the chief operating officer of TestifySec and a former chief software officer for the Defense Department, explains why agencies need to change their approach to cybersecurity.

Teachers at Wilkins Elementary School in Amherst, N.H., recently taught their second-grade students a lesson on password management and what it means to be hacked. As the day progressed, the children sought to apply what they had learned. When one student inadvertently zoomed in too far and couldn’t figure out how to zoom out, they loudly proclaimed that they had been hacked! The speed, scope, impact and frequency of cybersecurity threats are so pervasive that cyber risk is an all-of-nation problem, so much so that cyber hygiene is now part of the educational journey of seven- and eight-year-old children.

The digital data we create daily is immensely valuable, and it is reasonable to want to protect it. The federal government’s epiphany on the value of data can be traced to the 9/11 Commission Report: “We learned of the pervasive problems of managing and sharing information across a large and unwieldy government that had been built in a different era to confront different dangers.” The government believed that an increase in data sharing activities should correspond with an increase in statutory requirements to safeguard data. And so government did what government was designed to do: legislate. The Sarbanes-Oxley Act, the Federal Information Security Management Act (FISMA) and the Health Insurance Portability and Accountability Act (HIPAA) all created statutory obligations around safeguarding data. In the private sector, the American Institute of Certified Public Accountants (AICPA) introduced System and Organization Controls 2 (SOC 2), leading to contractual requirements on service organizations to demonstrate a commitment to safeguarding data. A plethora of regulatory requirements to safeguard data naturally followed, too.

The government’s present approach to cybersecurity, set up by these and other regulatory frameworks, is based on lagging indicators that force agencies to continually play catch-up with cyber threats.

And yet, despite all this governance, regulatory and compliance activity espousing the importance of safeguarding data, the federal government, private sector and individuals continue to bleed data from the femoral arteries of their interconnected systems at a fatal rate. At some point, basic first aid must give way to surgical interventions if the adoption of technology is going to be sustained or advanced. Two surgical interventions are being heavily discussed and debated today: advances in continuous monitoring (ConMon), and the establishment of safe harbor for software manufacturers that affects their software development lifecycle (SDLC) and eliminates their ability to disclaim liability via software licenses.

In the years since 9/11, cybersecurity has evolved into a complex field that rivals economics. Economic indicators are inherently lagging indicators. Each month, the Federal Reserve Board crawls over data sets that span everything from consumer price index data to housing starts, from unemployment levels to gross domestic product. Trends are evaluated, data interpreted and extrapolated, and ultimately adjustments to U.S. monetary policy are made. The astute understand that the levers pulled this month won’t actually be reflected in the data for several months.

The government’s present approach to cybersecurity is almost identical.

ConMon is one of the primary mechanisms for safeguarding information systems. National Institute of Standards and Technology Special Publication 800-53 Rev. 5 establishes that the “terms ‘continuous’ and ‘ongoing’ imply that organizations assess and monitor their controls and risks at a frequency sufficient to support risk-based decisions.” Yet FedRAMP, the government’s most visible cybersecurity program and its most visible way of managing cloud cyber risk, explicitly requires monthly reporting. In an era when a terabyte of data can be uploaded to the internet in as little as three hours, calling monthly reporting “continuous” is irrational. FedRAMP’s position is predicated upon a belief that a relevant and resilient cloud cybersecurity posture is managed and achieved the same way as monetary policy: through monthly report ingestion, discussion and an eventual reaction.

The first surgical intervention, then, is for NIST to formally redefine ConMon. Can anyone present a compelling argument that a networked information management system can safeguard data against modern cyber threats using a report filled with lagging indicators? Federal authorizing officials must demand access to near real-time monitoring with clear thresholds that trigger automatic alerting. Those thresholds must be measured in seconds to minutes at most, not days or weeks. Machines are far better suited to persistent stare than a human eye glazing over a monthly report.
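To make “persistent stare with alerting” concrete, consider the following minimal sketch in Python. It is illustrative only, not a FedRAMP requirement or any vendor’s product: the metric names, thresholds and alert sink are hypothetical stand-ins for a real telemetry pipeline. The point is simply that a machine can evaluate a control every few seconds and raise an alert the instant it drifts out of tolerance, instead of waiting for a monthly report.

import random
import time
from datetime import datetime, timezone

# Hypothetical thresholds: the instant a monitored control drifts past one
# of these limits, an alert fires within seconds of the breach.
THRESHOLDS = {
    "failed_logins_per_min": 25,      # possible credential stuffing
    "unscanned_container_images": 0,  # every deployed image must be scanned
    "expired_certificates": 0,        # no expired TLS certificates in service
}

POLL_INTERVAL_SECONDS = 5


def read_telemetry() -> dict:
    """Stand-in for a real telemetry source (SIEM query, cloud API, agent)."""
    return {
        "failed_logins_per_min": random.randint(0, 40),
        "unscanned_container_images": random.randint(0, 2),
        "expired_certificates": 0,
    }


def raise_alert(metric: str, value: int, limit: int) -> None:
    """Stand-in for paging, ticketing or SOAR integration."""
    stamp = datetime.now(timezone.utc).isoformat()
    print(f"[{stamp}] ALERT: {metric}={value} exceeds threshold of {limit}")


def monitor(cycles: int = 3) -> None:
    """Poll telemetry every few seconds and alert on any threshold breach."""
    for _ in range(cycles):
        snapshot = read_telemetry()
        for metric, limit in THRESHOLDS.items():
            if snapshot[metric] > limit:
                raise_alert(metric, snapshot[metric], limit)
        time.sleep(POLL_INTERVAL_SECONDS)


if __name__ == "__main__":
    monitor()

Nothing about this is exotic; the same pattern scales from a handful of metrics to the full control baseline an authorizing official cares about.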

The genealogy of all digital data traces back to the software application that created it. Artificial intelligence systems are software systems, and AI models are built from digital data. It’s a closed-loop system: software creates data, data provenance must be curated and managed before being fed into AI models, and AI models are deployed as software back into real-world systems.

If the software that creates, manipulates and manages data cannot be trusted, isn’t it reasonable to assume that the digital data it creates cannot be trusted?

This leads to the second surgical intervention, which is predicated upon a controversial idea: that development enclaves and the software development lifecycle must be more secure than the production enclaves where the software product actually runs. Why is this controversial? What engineering leader among us hasn’t at some point proclaimed, “It’s OK, it’s just development”?

Supply chain attacks are accelerating in complexity and impact. A failure to demand provenance and nonrepudiation characteristics across the SDLC is not residual risk; it’s overlooked risk. Returning to the finance analogy, all U.S. public companies must follow U.S. Generally Accepted Accounting Principles (GAAP), and, in fact, most businesses follow it to meet expectations from banks, investors and their accounting professionals. U.S. GAAP contains a set of descriptive principles that offer finance professionals freedom of navigation within a set of well-understood guardrails. Similarly, NIST SP 800-218, the Secure Software Development Framework (SSDF), offers software manufacturers a set of descriptive principles that recognize and embrace the heterogeneity of modern software development.
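What provenance and nonrepudiation look like in an SDLC can be sketched in a few lines. The example below is a simplified illustration, not the SSDF or any particular attestation format: it hashes a build artifact, wraps the digest and some hypothetical build metadata into a statement, and signs that statement with an Ed25519 key (using the third-party Python cryptography package) so a downstream consumer can verify who built what and when.

import hashlib
import json
from datetime import datetime, timezone

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def hash_artifact(path: str) -> str:
    """Compute the SHA-256 digest of a build artifact."""
    digest = hashlib.sha256()
    with open(path, "rb") as artifact:
        for chunk in iter(lambda: artifact.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def attest(path: str, builder_id: str, key: Ed25519PrivateKey) -> tuple[bytes, bytes]:
    """Produce a signed provenance statement for the artifact (fields are hypothetical)."""
    statement = json.dumps(
        {
            "artifact_sha256": hash_artifact(path),
            "builder_id": builder_id,
            "built_at": datetime.now(timezone.utc).isoformat(),
        },
        sort_keys=True,
    ).encode()
    return statement, key.sign(statement)


def verify(statement: bytes, signature: bytes, public_key) -> bool:
    """A consumer checks the statement really came from the named builder."""
    try:
        public_key.verify(signature, statement)
        return True
    except InvalidSignature:
        return False

Because a builder cannot later disown a statement signed with its own key, the consumer gets both provenance (who built the artifact and when) and nonrepudiation, properties no monthly compliance report can supply after the fact.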

Industrial-era techniques predicated upon monthly reporting cycles, interpreting and extrapolating data, and then making marginal adjustments work well for monetary policy. In a software-defined era, applying the same approach to the software systems powering government is the equivalent of placing a small bandage on a bleeding femoral artery. FedRAMP processes must acknowledge that there is risk in the SDLC and that ConMon must be implemented through persistent stare with alerting.

Jason Weiss is the chief operating officer of TestifySec and a former chief software officer for the Defense Department.

