Moving federal information systems to the cloud could reduce a lot of the federal government’s IT “clutter,” but without the cybersecurity component, agencies won’t feel confident about migrating their data.
More than a year after the massive data breach at the Office of Personnel Management, federal agencies remain on full alert when it comes to shoring up their cyber vulnerabilities.
Ron Ross, a computer security fellow at the National Institute of Standards and Technology, said Tuesday that faster migration to the cloud could make federal data systems more secure against malicious attacks.
“We can really take advantage of cloud computing by moving some of the systems and applications to the public cloud as quickly as we can so we can thin the herd, and the things that remain behind within our federal infrastructure, we can do a lot better job of protecting those systems and those assets that we really value,” Ross said in a keynote address at the NIST Cloud Computing Forum.
In order to manage and minimize the risk of cyber intrusion, Ross said agencies need to reduce the “clutter” of information systems to get a better inventory of what they have, and what needs the highest degree of security.
“One of the things that holds cloud computing back, and especially the federal journey into the cloud, is the notion of security,” Ross said.
In September 2009, Federal Chief Information Officer Vivek Kundra announced the Obama administration’s major cloud computing initiative, which aimed to reduce infrastructure costs. But seven years into that strategy, Ross said agencies aren’t keeping pace with the growth of the cyber threat landscape.
“We really do need effective security to help us take maximum advantage of cloud computing, and you can also now extend that to the internet of things. There’s a massive convergence going on, as many of you have seen, into cyber physical systems. We’re seeing a convergence of two different worlds. In essence, computers are going into everything that we can put them in today and we’re connecting up the world. And that’s going to be the world we’re going to live with for the foreseeable future,” Ross said.
More complexity, more vulnerability
As federal information systems grow in complexity, Ross said they become a bigger target for adversaries.
“As your complexity grows, the attack surface grows,” he said. “Bring me all your known vulnerabilities, fix all of them today or next week, and I’m going to bring you another vulnerability the day after you fix your last known vulnerability. And the day after that I’m going to bring you 10 more. And the day after that I’m going to bring you back 100 more. That’s why the unknowns are increasing at an exponential rate, and that is a no-win situation.”
Not only are the threats more numerous, but they’re also harder to detect. Ross said from a day-to-day perspective, most users operate “above the water line,” meaning they aren’t aware of the cyber intrusions going on behind the scenes.
“How much control do you have on things below the water line? Not a whole lot. This is why, when I talk about … the engineering aspects of this problem, it’s a problem that can only be solved with government, industry and the academic community all working together,” Ross said.
In the case of the OPM breach, Ross said agencies should consider archiving databases that contain sensitive information that doesn’t require frequent and immediate access.
“In the OPM case, the question you have to ask when you’re designing that database and that system, and your accessibility of your people who have to use that data: Do you have to have access to every data item, data element in that database in near-real time or real time? A lot of those records in OPM could have been archived [or] taken offline… is it hard for you to get that information? Yes, it’s also hard for the Chinese to get it as well,” he said.