There always has been a healthy tension between auditors and operators. After what seemed to be a thaw between inspectors general and IT executives over the last few years, a recent event highlighted the continued friction between the two parties over how agencies protect federal data and networks.
During last Thursday’s panel discussion sponsored by AFFIRM in Washington, several CISOs and agency chief information officers talked about the difficulty in moving to a risk-based framework.
Jim Quinn, the lead system engineer for the Department of Homeland Security's continuous diagnostics and mitigation (CDM) program, said IGs too often rely on checklists to determine whether agencies complied with policy and legal requirements.
“They have a standard pro-forma checklist that says ‘Have you done A, B and C?’ with no acknowledgement of whether A, B and C are really things that are important to what you are trying to achieve or whether you have done other things to make those controls less relevant because you’ve put compensating things in that limits your risk on them,” he said. “I think that this is one of the challenges, even looking at things like Federal Information Security Management Act (FISMA) metrics is how do we allow the agencies and departments and the mission groups to really be able to say ‘You have to look at the risk I’m willing to take in the context of what I am doing.’”
Quinn, who spent a majority of his career in the private sector, said these types of risk-based decisions are made often in the commercial world.
"We are not allowing CIOs or risk executives within the government [to make those decisions]," he said. "We nominally say they can, but when push comes to shove and they are going through an audit, that financial audit is going to go through that standard list of all of this stuff and they are going to say, 'We don't care that you had this compensating control, you didn't have the fire extinguishers every 10 feet and you failed.'"
David Bray, the Federal Communications Commission CIO, said he recently had a similar experience with auditors.
The FCC had a review of its cloud-based email system. Bray said the auditors told him, "You have not thought about what you would do if the cloud-based email went down."
Bray responded, "The whole reason why we went to the cloud is because it's a global company. If they go down, we have other issues, but I was dinged. It is sort of like they are teaching to the test, and it's not really the critical thinking that needs to be done."
Quinn and Bray’s experiences are not uncommon across the government.
The inspector general community has recognized the need to change and has been trying to transition to a new way of thinking.
In the 2015 FISMA reports, IGs used a new maturity model for information security and continuous monitoring to analyze how agencies are protecting their networks and data.
The IGs detail levels 1 through 5 across people, processes and technology.
So when the agency reviews begin coming out in late 2015 or early 2016, we will have a better sense of whether this frustration is warranted or just left over from previous audits.
To that end, sources say the Office of Management and Budget is finalizing the FISMA metrics for 2016. The annual FISMA guidance for the fiscal year also usually comes out in early October. Both documents likely will focus on several areas the cyber sprint highlighted, such as identity management and access control, patching critical vulnerabilities and reducing the number of privileged users, in addition to the continued move to information security continuous monitoring.
This post is part of Jason Miller's Inside the Reporter's Notebook feature.