From the Fort Hood and U.S. Navy Yard shootings to the Edward Snowden affair, insider threat risks have evolved. So too must the offices within agencies designed to detect and defend against those threats.
Potential risks have two sides: the traditional physical security side, known as "guns, guards and gates," and the information, or cyber, side. The definition of insider threat is important to note because it encompasses both.
"Insider threats are really looking at the potential for damage to people and information, unwittingly or wittingly, by people to whom you have, by definition, given access," Joseph Kirschbaum, GAO director of defense capabilities, said on the Federal Drive with Tom Temin. "So it's important because we are talking about those people who mean to do harm, and perhaps even more importantly, as a percentage of actual incidents, the people who don't necessarily mean to do harm but nonetheless do cause harm."
In 2015, the Government Accountability Office published two reports examining programs developed by the DoD Insider Threat Management and Analysis Center (DITMAC). GAO measured the department's efforts against the minimum standards for insider threat programs established under a 2011 executive order issued by the Obama administration.
“We found that on the one hand, the Department of Defense had clear structures in place, they have doctrine, they have planning and they have training in place that relates to parts of the program,” Kirschbaum said. “But it kind of doesn’t come together as a whole.”
In other words, GAO found gaps in how DoD approached potential technical and physical vulnerabilities that could later open the door to an attack. Most of those gaps stemmed from mundane risks, such as employees not following correct security procedures.
Kirschbaum said programs are more effective when they are connected. The information and risk assessments produced across DoD need to be pulled together before they can actually be used to prevent a threat.
“They have risk assessments for individual programs out the wazoo [that] tend to be at the technical level, and they are very good in-and-of themselves, but at the heart, we are looking at cutting across so many different things,” he said. “You need to raise [it to] that next level.”
But some questions come up when talking about cybersecurity monitoring, such as: What is normal behavior? And how can these network areas be monitored closely without infringing on an employee's privacy?
"One of the major considerations in all elements of the insider threat program is determining where the balance is between personal security, personal liberties and personal responsibility," Kirschbaum said. "[But] just establishing that baseline of what is 'normal' is a horrendously difficult task, technically and otherwise."
The risk is not found only within closed networks, and agencies have a responsibility to keep their eyes open for any clues. As Kirschbaum pointed out, federal employees are used to the general warning that they are being monitored while on government networks.
An employee's use of social media outside of work is a different story. The government doesn't necessarily monitor employees off the clock, but clues to potential risks can surface on these unclassified networks as well.
“I usually remind them [that] it’s probably a good idea to monitor what you do on social media because that’s a good indication of who you are,” Kirschbaum said. “And this is going back to the fact that this is a technical and human problem.”
He said there are tools of varying sophistication that can alert security teams to potential threats by flagging certain words, phrases and actions. But in some cases, the information involved is personal.
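The article names no specific tool, but the flagging approach Kirschbaum describes can be sketched in a few lines. The watch-list terms and function below are purely hypothetical illustrations, not anything drawn from DoD or GAO materials:

```python
import re

# Hypothetical watch list -- real programs tune these lists carefully and
# combine keyword hits with behavioral context rather than acting on words alone.
FLAG_TERMS = ["exfiltrate", "bypass badge", "disable logging"]

def flag_messages(messages):
    """Return (index, term) pairs for messages containing a watched term."""
    hits = []
    for i, text in enumerate(messages):
        for term in FLAG_TERMS:
            if re.search(re.escape(term), text, re.IGNORECASE):
                hits.append((i, term))
    return hits

sample = [
    "Meeting notes attached.",
    "How do I disable logging on this server?",
]
print(flag_messages(sample))  # [(1, 'disable logging')]
```

Even this toy version shows the privacy tension Kirschbaum raises: the scan only works by reading content that may be personal.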
“It doesn’t take a whole lot of vision to appreciate the potential weaknesses of information systems and what damage might potentially be caused by someone who you’ve given access to,” Kirschbaum said. “But you don’t necessarily know when that person may decide to do something nefarious.”
This creates some uncertainty about when an insider threat task force should act to prevent information leaks or physical threats, such as the shootings in 2009 and 2013.
Kirschbaum proposed checking these areas before making a decision to act:
What is out of the ordinary?
What is the potential threat?
What are the potential consequences?
What resources can be applied?
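That checklist can be expressed as a simple triage rule. The scoring scale, threshold and field names below are invented for illustration only; they are not GAO or DoD guidance:

```python
from dataclasses import dataclass

@dataclass
class Incident:
    anomaly: int          # 0-3: how far out of the ordinary is the behavior?
    threat: int           # 0-3: how serious is the potential threat?
    consequence: int      # 0-3: how severe are the potential consequences?
    resources_free: bool  # can resources be applied right now?

def should_act(inc: Incident) -> bool:
    # Invented rule: act when combined risk is high and resources allow.
    risk = inc.anomaly + inc.threat + inc.consequence
    return risk >= 6 and inc.resources_free

print(should_act(Incident(3, 2, 2, True)))  # True
print(should_act(Incident(1, 1, 1, True)))  # False
```

In practice the hard part is the first question, establishing what counts as out of the ordinary, which Kirschbaum calls a "horrendously difficult task."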
Despite the progress it has made, DoD also needs a better system for sharing its results in order to help build safe systems across all agencies.
“If you’re not sharing, [then] your bright example of progress is going to be isolated,” Kirschbaum said. “Just like military history, that tends to be the real trick — being able to develop a system, learn a lesson, adapt to it, pass it on, develop training, pass that on and now you’re spreading success.”