A new partnership between Vencore and George Mason University aims to develop a cyber risk assessment methodology and tool to help agencies and other organizations analyze the value of their data and determine how best to mitigate potential cyber threats.
How many times has your agency been told to measure and mitigate cybersecurity risks because there is no way to protect all of your systems and data?
The problem with that advice is the first part: measuring risk, especially cyber risk, is difficult because of the dynamic nature of threats and vulnerabilities. An application may be safe now, but an hour later a hacker discovers a vulnerability and exploits it.
This is where a new partnership between George Mason University and Vencore comes in. Over the next two months, George Mason faculty researchers will work with Vencore to take the initial steps toward developing a cyber risk assessment methodology and tool to help agencies and other organizations analyze the value of the data and determine how best to mitigate potential cyber threats.
“The project itself fits right in with the themes of Defense Advanced Research Projects Agency, Homeland Security Department and other major research initiatives in the federal government, trying to address this question of how do we know if we are secure and what’s the value of the next $100,000 or $1 million of investment for overall security posture and addressing enterprise risk,” said Jean-Pierre Auffret, director of the Research Partnerships and Grants Initiative at George Mason University in Fairfax, Virginia. “These are some of the important questions tied in with some of the challenges with cybersecurity metrics and also some of the challenges with new technologies such as Internet of Things.”
Auffret said the idea of putting a value on data and on protecting that data is something the energy sector and others are starting to take on more aggressively.
Vencore and GMU will develop the framework during a multi-phase project that includes the design of a high-level data valuation architecture, and the development of models that use this architecture to calculate potential monetary damages of a cyber attack.
“An example of the dynamic aspects of this and the types of evaluation models that there are so you can look at it from a cost of data perspective, of acquiring or replacing the data, or the economic value of what someone would pay for the data, that is where some of the dynamics would come in,” Auffret said. “For example, the value of a credit card number has been decreasing in the marketplace, while the value of a health record has been increasing. It’s probably 10-to-15 times higher than the value of a credit card right now. So the risk to systems and risk to an organization changes over time as the market value of their data and information changes.”
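To make Auffret’s two lenses concrete, here is a minimal sketch in Python of how replacement-cost and market-value calculations might sit side by side. The record counts and per-record prices are hypothetical, not figures from GMU or Vencore; only the rough 10-to-15x gap between health records and credit card numbers comes from the article.

```python
# Sketch of two data valuation lenses: replacement cost versus market value,
# with market prices that shift over time. All figures are hypothetical.

from dataclasses import dataclass

@dataclass
class DataAsset:
    name: str
    record_count: int
    replacement_cost_per_record: float  # cost to re-acquire or rebuild the data
    market_price_per_record: float      # what a buyer would pay today

    def replacement_value(self) -> float:
        return self.record_count * self.replacement_cost_per_record

    def market_value(self) -> float:
        return self.record_count * self.market_price_per_record

# Illustrative only: the article notes a health record currently trades at
# roughly 10-15 times a credit card number, and those prices move over time.
credit_cards = DataAsset("credit card numbers", 1_000_000, 0.50, 1.00)
health_records = DataAsset("health records", 1_000_000, 2.00, 12.00)

for asset in (credit_cards, health_records):
    print(f"{asset.name}: replacement ${asset.replacement_value():,.0f}, "
          f"market ${asset.market_value():,.0f}")
```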
John Sutton, Vencore’s vice president and general manager for its Defense Group, said the value of data also changes for the government: a drawing of a weapons system, for example, is more valuable than an email record. But that could change over time as the weapons system is retired and the email record becomes more valuable because it may show what the next major investment will be.
To develop this cyber risk approach, Sutton said GMU will first assess the existing work in this area and then build the right models and algorithms that can be applied to hardware and software. Sutton said the goal is to come up with something similar to how the insurance industry measures and puts a value on risk.
Vencore will take on the second part, the front-end enterprise dashboard.
Sutton said if you are thinking of spending $2 million on cybersecurity technology, the framework can help answer if that investment is the best one for your organization’s users, data and enterprise.
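The insurance comparison suggests a back-of-the-envelope test along the lines of annualized loss expectancy, a standard insurance-style measure rather than anything GMU and Vencore have published. The sketch below, with entirely hypothetical loss figures and probabilities, shows how such a test could weigh a $2 million control against the expected loss it prevents.

```python
# Insurance-style back-of-the-envelope: does a $2M control pay for itself?
# Annualized loss expectancy (ALE) = loss per incident x expected incidents per year.
# All figures are hypothetical; this is not the GMU/Vencore model itself.

def annualized_loss_expectancy(loss_per_incident: float, incidents_per_year: float) -> float:
    return loss_per_incident * incidents_per_year

# Hypothetical baseline: breaches cost $8M each and happen about once every two years.
ale_before = annualized_loss_expectancy(8_000_000, 0.5)

# Hypothetical effect of the $2M investment: incident likelihood drops by 60%.
ale_after = annualized_loss_expectancy(8_000_000, 0.5 * 0.4)

annual_control_cost = 2_000_000 / 3  # spread the spend over an assumed three-year life
net_benefit = (ale_before - ale_after) - annual_control_cost

print(f"ALE before: ${ale_before:,.0f}")
print(f"ALE after:  ${ale_after:,.0f}")
print(f"Net annual benefit of the investment: ${net_benefit:,.0f}")
```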
“The intriguing part for our development team is to build an app to do that easily,” he said.
By September, GMU will finish reviewing the existing models and standards related to system evaluation and also develop a proposal to create a taxonomy for a high-level architecture for data valuation.
Sutton said GMU and Vencore will then determine whether phase 2 is feasible and, if so, begin developing and deploying the algorithms and models.
Sutton said Vencore expects to have something for agencies to begin using by the end of 2016 or early 2017.
The concept of cyber risk isn’t new to the government. The Defense Department in November released a guidebook on measuring and mitigating cyber risk in programs.
The General Services Administration also has been working as part of President Barack Obama’s 2013 cybersecurity executive order to create a cyber framework and indicators for acquisitions.
But there hasn’t been a lot of public discussion or work around giving agencies and private sector companies a more rigorous approach to deciding what investments are most valuable.
That’s why the Vencore-George Mason initiative has so much potential: it could help agencies make their cyber investments more effective and impactful.
Jason Miller is executive editor of Federal News Network and directs news coverage on the people, policy and programs of the federal government.