The National Vulnerability Database (NVD) — the U.S. government’s repository of standards-based vulnerability management data — says 2015 was another blockbuster year for security vulnerabilities with an average of 17 new vulnerabilities added per day.
IT managers can breathe a measured sigh of relief that the total number of vulnerabilities actually decreased from 7,937 in 2014 to 6,270 in 2015, but there's no time to relax. According to NVD data, 37 percent of vulnerabilities reported in 2015 were classified as highly severe, up from 24 percent in 2014.
What these statistics reveal is that cyber criminals are employing increasingly sophisticated methods to exploit IT systems. While the number of reported vulnerabilities decreased, the attacks that did occur caused major headaches at multiple government agencies.
To illustrate, in November 2014 it was announced that the U.S. Postal Service had been targeted in a data breach that exposed up to 800,000 records. Although the breach did not expose any financial data, officials expressed concern that less-sensitive information had been compromised, and news coverage remarked on the increasing sophistication of attacks targeting federal systems.
Just over six months later, the Office of Personnel Management announced that the agency was the victim of a scheme that stole the personally identifiable information of more than 22 million people.
The theft of highly sensitive information including background checks and fingerprints could be the most damaging breach to U.S. national security ever conducted. It’s hard to put a price tag on this type of breach because the damage is still being assessed and the repercussions of this attack remain to be seen.
Government IT managers are tasked with the onerous responsibility of managing and administering software updates on a daily basis. The IT team not only has to fix bugs and security flaws quickly enough that vulnerabilities are not exploited, it also has to prevent compatibility issues that could make a network unstable, causing system failure or the loss of data. The importance of an effective vulnerability management program cannot be overstated as agencies face an ever-evolving cyber security environment.
While Microsoft gets a lot of attention because it is both an operating system and application vendor, the company is a good example of how software manufacturers are trying to keep ahead of hacking trends by releasing more patches. Microsoft nearly tripled the number of patches it released, from 85 in 2014 to 216 in 2015.
According to data from the NVD, the applications with the most reported vulnerabilities in 2015 included:

- Adobe Flash Player
- Microsoft Internet Explorer
- Mozilla Firefox ESR
The need to speed up the deployment of patches across the government’s highly complex and distributed IT environment is more important than ever.
There’s no doubt that patch management is key to mitigating risk, so consider these suggestions when evaluating the effectiveness of your vulnerability management program:
Identify your assets
To make a network more secure, you must first identify all of your assets and who owns or uses them. This includes everything that makes up your network: software, applications, operating systems, routers, switches, servers, firewalls and printers, to name a few. Because systems are always changing, a recurring automated scan of your network is recommended to keep the asset inventory current.
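The continuous-inventory idea can be sketched in a few lines. This is a minimal illustration, not a scanner: the `Asset` fields and hostnames are hypothetical, and a real program would populate `merge_scan` from actual discovery tools rather than hard-coded lists.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Asset:
    # Hypothetical fields; a real inventory would also track OS version,
    # patch level, network location, and more.
    hostname: str
    kind: str    # e.g. "server", "router", "printer"
    owner: str   # team or person responsible for the asset

def merge_scan(inventory: dict, scan_results: list) -> dict:
    """Fold one scan's findings into the running inventory, keyed by hostname.

    Running this on every scan keeps the inventory current as systems
    appear, disappear, or change hands."""
    for asset in scan_results:
        inventory[asset.hostname] = asset
    return inventory

# Two simulated scans: the second discovers a new printer and
# records a change of ownership for an existing server.
inventory = {}
merge_scan(inventory, [Asset("web01", "server", "apps-team"),
                       Asset("fw01", "firewall", "netops")])
merge_scan(inventory, [Asset("web01", "server", "platform-team"),
                       Asset("prn03", "printer", "facilities")])

print(sorted(inventory))        # ['fw01', 'prn03', 'web01']
print(inventory["web01"].owner) # platform-team
```

The point of keying by hostname is that repeated scans converge on one current view instead of accumulating duplicates.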
Define your vulnerabilities
It is important to know which vulnerabilities are associated with which assets so that you can determine how to secure each one. Outdated software, weak passwords, or missing patches can leave the door wide open for a hacker to exploit a vulnerability through malware or through an unpatched web application or operating system. Once you have a grasp on the types of vulnerabilities within your network, you can prioritize remediation based on each asset's criticality. A consistent scan of your network can identify new vulnerabilities as they appear, providing an opportunity to mitigate risk.
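Prioritizing by asset criticality can be expressed as a simple ranking. This sketch assumes hypothetical findings (the CVE IDs and scores are made up) and uses CVSS base score weighted by a 1–5 criticality rank as a rough risk proxy; real programs weigh many more factors, such as exploit availability and exposure.

```python
# Hypothetical records: (asset, CVE id, CVSS base score, asset criticality 1-5).
findings = [
    ("web01", "CVE-2015-0001", 9.8, 5),  # severe flaw on a critical server
    ("prn03", "CVE-2015-0002", 9.8, 1),  # same severity, low-value printer
    ("db01",  "CVE-2015-0003", 5.4, 5),  # moderate flaw on a critical server
]

def priority(finding):
    # Simple risk proxy: severity weighted by how critical the asset is.
    _, _, cvss, criticality = finding
    return cvss * criticality

queue = sorted(findings, key=priority, reverse=True)
print([f[0] for f in queue])  # ['web01', 'db01', 'prn03']
```

Note how the ranking puts the moderate flaw on a critical database ahead of the severe flaw on a printer: severity alone is not the whole picture.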
Test the patches
Time is of the essence when a vulnerability is discovered; however, a patch should not be applied before it has been tested. A simulated environment allows all mission-critical applications to be represented. Patches are generally deployed to the least critical and most easily recoverable servers first; if a deployment goes wrong, these less critical components can be rebuilt quickly from backups. By building a replica of the production environment, patch testers can evaluate multiple configurations of operating systems and applications, and how they interact, before, during and after deployment. A backup of all data and server configuration information is required before the patching process begins.
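The two preconditions above — least-critical servers first, and no patching without a backup — can be captured as a small planning function. This is a hedged sketch with invented server names; real deployment tooling would check backup freshness and recovery procedures, not just set membership.

```python
def rollout_order(servers, backed_up):
    """Return the patch order (least critical first), refusing to plan a
    rollout if any server lacks a current backup."""
    missing = [s["name"] for s in servers if s["name"] not in backed_up]
    if missing:
        raise RuntimeError(f"back up before patching: {missing}")
    return [s["name"] for s in sorted(servers, key=lambda s: s["criticality"])]

# Hypothetical fleet with a 1-5 criticality rank.
servers = [
    {"name": "db-primary", "criticality": 5},
    {"name": "test-01",    "criticality": 1},
    {"name": "web-02",     "criticality": 3},
]

order = rollout_order(servers, backed_up={"db-primary", "test-01", "web-02"})
print(order)  # ['test-01', 'web-02', 'db-primary']
```

Making the backup check a hard failure, rather than a warning, enforces the article's point that backups are a requirement, not a recommendation.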
Deploy the patches
Patches can be deployed either manually or automatically. It is advisable to use an automated system so that the IT department can protect the network proactively rather than reactively. An automated system can provide constant visibility into an organization's security posture through regular, easy-to-interpret reports, while requiring fewer staff to operate and saving time and money.
Verify the remediation
Once a patch has been deployed, it is important to continue scanning your networks to confirm that the remediation was successful. Automated systems can provide an audit trail to prove compliance and will help you continually assess your network, prioritize your vulnerabilities and remediate when necessary.
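The verify-and-audit loop can be sketched as a rescan that records every check it makes. The asset names and CVE ID below are hypothetical; the design point is that logging each verification scan gives you both the remediation check and the compliance audit trail in one step.

```python
from datetime import datetime, timezone

audit_log = []

def rescan(asset, open_cves_by_asset):
    """Check what is still open on an asset and record the result,
    so the log doubles as a compliance audit trail."""
    remaining = open_cves_by_asset.get(asset, [])
    audit_log.append({
        "asset": asset,
        "time": datetime.now(timezone.utc).isoformat(),
        "remediated": not remaining,
        "open": remaining,
    })
    return remaining

# Simulated post-deployment state: web01 was patched clean,
# fw01 still has one finding open.
open_cves = {"fw01": ["CVE-2015-0004"]}
rescan("web01", open_cves)
rescan("fw01", open_cves)

print([e["asset"] for e in audit_log if not e["remediated"]])  # ['fw01']
```

A scheduler would call `rescan` on every asset after each patch cycle, feeding anything still open back into the prioritization step — which is the "ongoing process" the article closes on.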
Because government IT departments have the responsibility of ensuring that all key IT systems are 100 percent available, an effective vulnerability management program is necessary in today's constantly changing cyber environment. It is important to remember that vulnerability management is not a project with a beginning and an end, but an ongoing process that requires constant evaluation of organizational assets and vulnerabilities, testing, and patch deployment.
Jose Carlos Linares is the president and CEO of the Open Technology Group.