Agencies are ramping up to take on the next major cybersecurity challenge: software assurance.
This is not a new problem, but one that some experts say can no longer play second fiddle to other types of computer and network security.
Federal and private sector experts say there are a couple of reasons why software assurance is getting more attention. First, some estimate that agencies and other organizations spend 90 percent of their cybersecurity budgets on security for the network or infrastructure layer, where only 16 percent of all attacks occur. Meanwhile, agencies and other organizations spend only 10 percent of their cyber budgets to secure the application layer, where 84 percent of attacks occur.
This is part of the growing recognition that more attacks are coming through software programs, and that agencies need to adjust their attention, and therefore their spending, accordingly.
The second reason for this shift is the Internet of Things (IoT), or connected devices. With the rise of embedded software in everything from cars to ovens to medical devices to weapon systems, experts say ensuring the software is safe is more important than ever.
Additionally, the government is promoting a related need for better supply chain risk management, understanding where the hardware and software come from and knowing whether you can trust the sources from the beginning of the supply chain.
This concept was an essential part of the Office of Management and Budget Circular A-130 update earlier this summer. OMB called on agencies to “implement supply chain risk management principles to protect against the insertion of counterfeits, unauthorized production, tampering, theft, insertion of malicious software, as well as poor manufacturing and development practices throughout the system development lifecycle.”
OMB also required agencies to “develop supply chain risk management plans for all organizational tiers based on National Institute of Standards and Technology Special Publication 800-161 to ensure the integrity, security, resilience, and quality of information systems.”
There are several examples of agencies starting to address these challenges, both at the agency level and governmentwide.
Ray Lateer, the Marine Corps' chief of the cybersecurity division, said the Corps is using both its own and the Defense Department's cyber ranges to test software.
Lateer, who spoke at the Consortium for IT Software Quality and the IT Acquisition Advisory Council's Cyber Resilience Summit in Arlington, Virginia, on Oct. 20, said one end goal is to create a library where software code gets reviewed and receives a stamp of approval, so others in the Marine Corps and across DoD can rely on it as safe.
Another example is the continuous diagnostics and mitigation (CDM) program at the Homeland Security Department.
In late September, DHS issued a new requirements definition for CDM vendors to meet.
Emile Monette, a DHS program manager focusing on supply chain risk management under the CDM program, said the goal is to take a more comprehensive approach to protecting agency software.
He said the CDM requirements definition is asking the vendors in the program to propose tools and services to let agencies verify the security of the software on their network as well as the software used by the CDM vendors.
Monette said the whole premise is to buy down the cyber risk from software.
“[T]he primary goal of design-and-build-in security (DBS) is to reduce the attack surface for network and infrastructure components during acquisition, development, and deployment,” DHS said in the CDM document, which Federal News Radio obtained. “To ensure that software acquired, developed, or deployed is functional and secure, DBS has identified a number of secure software and supply chain risk management (SCRM) concepts that can be used to evaluate software risk, weaknesses and vulnerabilities, so that agencies will be able to determine the quality of the software and any residual risk before the software is deployed in the field.”
The concepts include trustworthy systems, threat management and static and dynamic assessments.
A third area where agencies are trying to address software assurance is with weapons systems at the Defense Department.
Michael Gilmore, the director of Operational Test and Evaluation at DoD, said too often weapons systems are not designed with security in mind.
Gilmore said DoD has seen that with the Army's WIN-T program as well as the Air Force's F-35 Joint Strike Fighter program, all of which are incredibly dependent on embedded software.
DoD deployed a cyber red team to review a system under the F-35 program, called the Autonomic Logistics Information System (ALIS).
Gilmore said the red team found the program office had not thought at all about how to restore pieces of ALIS if they were compromised. He said DoD had to work for many months with the program office, as well as with Lockheed Martin, which designed the system, to develop a way to restore ALIS if the red team damaged it. He said there was no real concern of ALIS breaking, even though red teams are very good, but the point is that no one thought about the cyber piece until ALIS was well into the systems engineering effort.
Gilmore said the Joint Chiefs acknowledged the cyber threat to weapons systems two years ago and promised to issue new guidance for acquisition and development, but the guidance remains in the pilot stage, applies only to new acquisitions and will not be applied retroactively to ongoing programs.
Experts say the onus for software assurance falls on vendors and agencies alike. Vendors need to know where their source code is being developed, and the government needs to conduct the proper oversight when using commercial code in federal systems.
One of the biggest challenges is training and finding the right employees who understand technology, engineering, the concepts around software design and risk management as well as program or project management.
NIST Special Publication 800-161, as well as the recently released SP 800-160 on systems security engineering, are helpful. Experts also say courses at the National Defense University help train DoD and civilian agency employees on software assurance. And DHS, for instance, has red and blue teams that help agencies secure their networks, including the application layer.
DHS also has the Software Assurance Marketplace (SWAMP) program that analyzes software code to ensure it’s safe.
Agencies also are looking more at automation to test software for flaws as part of the government's move to agile, or iterative, development.
Experts say agencies need to do a better job of applying a cyber risk framework to the application layer, and of extending network security concepts such as continuous monitoring to applications, so they can find and fix problems as soon as possible.
“The basics really matter. We are finding things like the system accreditation boundary is not well known or understood. We are finding a lot of segmentation issues where agencies have different systems inheriting controls at various levels of trust,” said Martin Stanley, Cybersecurity Assurance Branch Chief for DHS’s Federal Network Resilience office. “The root cause of these problems is a result of basic IT practices and governance around those changes, and different ways the organization has to operate these systems. We’re writing recommendations around enterprise architecture and managing investments. It may sound a little strange from a security person, but that’s really the root cause. Until you have organizational buy-in as supporting this element of EA and managing investments, it’s going to be a challenging time to get people buy into cybersecurity.”