In 2013, the MITRE Corp. began publishing its ATT&CK framework, a living knowledge base of threat actor tactics and techniques. As the first ATT&CKcon conference demonstrated in November 2018, organizations are finding innovative use cases for the framework. One use suggested by MITRE is comparing computer network defense capabilities; another is adversary emulation.
Introduced at the right point in the risk management life cycle, these applications can inform risk identification, assessment and response.
Using ATT&CK, an economist in the Office of the Chief Economist at the Homeland Security Department's Cybersecurity and Infrastructure Security Agency developed a break-even analysis model that balances the costs of preventive controls against the costs of incident response and recovery.
The model offers a “menu” of controls tied to the adversary techniques they address as well as a “menu” of response actions that would be expected depending on the techniques invoked by a threat actor. An organization can model cost trade-offs between prevention and response. While this only provides a lower bound for potential investment benefits, any additional cost-avoidance in terms of mission impact, business operations, reputation or intellectual property loss would further support a given investment.
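The "menu" logic can be illustrated with a minimal sketch. The control names, ATT&CK technique IDs and dollar figures below are hypothetical placeholders, not values from the DHS model:

```python
# Illustrative break-even comparison: the annual cost of a preventive control
# versus the expected annual response/recovery cost of the ATT&CK techniques
# it addresses. All names and figures are hypothetical.

PREVENTION_MENU = {
    "application allowlisting": {"annual_cost": 120_000,
                                 "covers": {"T1204", "T1059"}},
}

# Expected annual response cost per technique if it is left unmitigated.
RESPONSE_MENU = {
    "T1204": 90_000,   # user execution
    "T1059": 75_000,   # command and scripting interpreter
}

def break_even(control):
    """Avoided response cost minus control cost; positive favors prevention."""
    avoided = sum(RESPONSE_MENU[t] for t in control["covers"])
    return avoided - control["annual_cost"]

net = break_even(PREVENTION_MENU["application allowlisting"])
print(net)  # prints 45000: the control pays for itself on response costs alone
```

As the article notes, this is only a lower bound: avoided mission, reputational or intellectual property losses would make the net benefit larger still.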
The Government Accountability Office consistently includes federal cybersecurity as a focus area in its biennial "High Risk Series." Similarly, the Federal Cybersecurity Risk Determination Report and Action Plan indicates that only 26 percent of agencies are effectively managing their cybersecurity risk. It also notes agency spending likely increased by $700 million from fiscal 2016 to 2017. According to the report, this spending occurred absent "a sense of prioritization or actual return on investment in terms of reducing cyber risks."
Yet a variety of risk management frameworks are available for agency adoption, including ISO/IEC 27005 and COBIT 5 for Risk. Agencies, of course, will likely center their efforts on the National Institute of Standards and Technology's Special Publication 800-30 Revision 1, which addresses risk assessments, and NIST Special Publication 800-39, which focuses on information security risk management. No matter the chosen framework, each provides a foundational structure that proceeds through a life cycle of risk identification, assessment, response, and monitoring and reporting. Once selected, a framework can be enhanced with the Factor Analysis of Information Risk (FAIR) approach, which decomposes risk into its components for better quantitative analysis.
Operationalizing risk management
Using FAIR as a departure point, risk can be decomposed into loss event frequency and loss magnitude. Those can be decomposed further as necessary: loss event frequency into threat event frequency and vulnerability, and loss magnitude into primary loss and secondary loss. Additional levels of decomposition are possible. Most agencies are likely applying a risk framework, but their focus now should be on operationalizing risk management within broader strategic management efforts, combining strategy and risk management with budget information and operationally relevant cybersecurity data. Compared to just a few years ago, federal agencies have access to significantly more data for establishing values within FAIR's categories.
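The first level of that decomposition can be sketched in a few lines. The input values here are illustrative, not agency data:

```python
# Minimal FAIR-style decomposition (illustrative values only):
#   risk ~= loss event frequency (LEF) x loss magnitude (LM)
#   LEF  ~= threat event frequency (TEF) x vulnerability (probability an
#           attempted threat event becomes a loss event)
#   LM   ~= primary loss + secondary loss

def loss_event_frequency(tef, vulnerability):
    return tef * vulnerability

def loss_magnitude(primary_loss, secondary_loss):
    return primary_loss + secondary_loss

def annualized_risk(tef, vulnerability, primary_loss, secondary_loss):
    return (loss_event_frequency(tef, vulnerability)
            * loss_magnitude(primary_loss, secondary_loss))

# Example: 12 threat events/year, 25% succeed, $200k primary + $50k secondary.
print(annualized_risk(12, 0.25, 200_000, 50_000))  # prints 750000.0
```

In practice FAIR treats these inputs as ranges or distributions rather than point estimates; CDM data can inform the vulnerability term and high-value asset analysis the loss magnitude terms, as described below.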
For example, as DHS and individual agencies further deploy continuous diagnostics and mitigation (CDM) capabilities across the federal enterprise, and link cybersecurity operations to the data available via CDM dashboards, the vulnerability component of risk increases in fidelity. At the same time, the governmentwide process for high-value asset identification can be used to illuminate probable loss magnitudes.
As part of this operationalization of risk management, an agency can assess itself for exposures to the ATT&CK components. Among other benefits, that assessment will show gaps and overlaps in risk mitigation. This approach is similar to the analysis afforded by the Defense Department’s NIPRNet/SIPRNet Cyber Security Architecture Review, or NSCSAR, and DHS’s emerging .gov Cybersecurity Architecture Review, or .govCAR. However, costs and monetized benefits are a central feature of the model developed by DHS’s economist, such that cost savings can be found through risk-based decisions around whether to maintain overlapping security solutions. One governmentwide activity where this is important is IT modernization.
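A gap-and-overlap assessment of this kind amounts to a coverage matrix of controls against ATT&CK techniques. The following sketch uses hypothetical control names and technique IDs:

```python
# Hypothetical gap/overlap check: map deployed controls to the ATT&CK
# techniques they mitigate, then report techniques no control covers (gaps)
# and techniques multiple controls cover (overlaps, candidate cost savings).
from collections import Counter

CONTROLS = {  # control -> ATT&CK technique IDs it addresses (illustrative)
    "EDR": {"T1059", "T1055"},
    "email gateway": {"T1566"},
    "allowlisting": {"T1059", "T1204"},
}
TECHNIQUES_OF_CONCERN = {"T1059", "T1055", "T1566", "T1204", "T1003"}

coverage = Counter(t for ids in CONTROLS.values() for t in ids)
gaps = TECHNIQUES_OF_CONCERN - coverage.keys()
overlaps = {t for t, n in coverage.items() if n > 1}

print(sorted(gaps))      # prints ['T1003']: uncovered, a mitigation gap
print(sorted(overlaps))  # prints ['T1059']: double-covered, review for savings
```

Attaching costs to each control and expected losses to each technique, as in the DHS model, turns this coverage view into the basis for the trade-off decisions discussed above.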
Cost-benefit analyses are a must
Legacy information technology systems have plagued the government for well over a decade. Despite the positive IT modernization efforts of the Obama and Trump administrations, one challenge agency representatives often voice is modernization funding. Specifically, agencies are concerned about finding sufficient offsets during the five-year window established by the Modernizing Government Technology Act to repay the centralized Technology Modernization Fund. Another way to finance IT modernization is through new flexibilities permitted for working capital funds; no repayment is required, but an agency must find sources from which to fill its working capital fund.
Although IT modernization is premised on increased efficiency and security and decreased operations and maintenance costs, some modernization projects may not actually pay for themselves over time. At the very least, their ability to recoup costs may appear murky during the project planning and proposal stages.
Modernization project planning will benefit from risk-based investment prioritization that considers cost-benefit analyses, such as those offered through the ATT&CK-based model developed within DHS. Savings can be used to repay the modernization fund or they can be contributed to a working capital fund.