A few years ago, I met with the security leader of an enterprise who was in the middle of a compliance fire drill. During their Payment Card Industry Data Security Standard (PCI DSS) audit, a vulnerability scan uncovered an IP address in the environment with a critical vulnerability. However, the vulnerability management team had no idea which server the IP address belonged to, or where that server was located. They scrambled for three weeks searching for it, only to discover the server sitting in a closet collecting dust. The server held payment card information, but because its operating system (Windows Server 2003) had been discontinued long ago, no one was paying attention to it. The vulnerability team assumed the server had been decommissioned. The IT team, not knowing about the vulnerability, decided to let the server run until it died. Neither team talked to the other about the actual state of the server, and as a result it sat in that closet for two years with an exploitable vulnerability. Luckily for them, the vulnerability was never exploited.
Unfortunately, this scenario happens more often than you might expect. Older systems are ignored, written off as unlikely targets because of their age. Vulnerabilities sit on them for years, creating ample opportunity for a cyber-criminal to attack. Meanwhile, vulnerability and IT teams do not communicate, each assuming the other is taking action to minimize exposure.
Financial companies are no strangers to this issue, and for them the pressure to remediate the most critical vulnerabilities is even greater due to the sensitivity of the information they hold and mounting industry regulation (e.g., Sarbanes-Oxley (SOX), PCI DSS, the New York State Department of Financial Services Cybersecurity Regulation and more). According to a recent survey conducted by Enterprise Management Associates, nearly half (47%) of cybersecurity professionals working in the financial industry say they suffer from high to very high stress levels due to their job responsibilities. Eighty-two percent say they are overwhelmed by the volume of vulnerabilities they manage, a higher share than in other industries including government, retail, infrastructure and manufacturing.
Companies run scans on all types of machines, but due to a lack of resources coupled with the sheer volume of vulnerabilities and deadline-driven compliance requirements, they prioritize only the vulnerabilities that are marked critical and live on a current operating system. When reporting time comes, they hide everything else. Upper management and board members have no clue about the more than half a million vulnerabilities sitting on older servers, because the vulnerability management team never looks at them in the first place.
From a compliance perspective, vulnerability management teams must tackle vulnerabilities marked critical and high if they want to avoid a hefty fine. There is no incentive for those teams to look at old servers since many of the vulnerabilities on them would typically be marked medium to low. And that’s exactly the problem with the criticality model.
Prioritizing vulnerabilities solely on criticality ratings pushes companies to ignore older vulnerabilities that may pose a greater risk to the organization because they are actively being exploited. Even if a vulnerability on an older server is rated medium, it should be bumped to the top of the remediation priority list when compromising the information on that server would cause significant financial loss and there is evidence of exploit activity against it. For example, if a server, no matter its age, contains PCI DSS and/or SOX-related data and there is evidence of exploit events against it, the potential financial loss is higher, and the vulnerabilities on that server should be remediated first.
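To make the rule concrete, here is a minimal sketch of what impact-based prioritization could look like in code. The fields, weights, and severity scale are illustrative assumptions for this example, not a standard scoring model or any particular vendor's implementation:

```python
from dataclasses import dataclass

@dataclass
class Vulnerability:
    host: str
    severity: str            # scanner rating: "low", "medium", "high", "critical"
    regulated_data: bool     # host holds PCI DSS / SOX-scoped data (assumed field)
    exploit_activity: bool   # evidence of active exploitation against the host

# Illustrative weights: business impact and threat evidence can outweigh
# the scanner's severity label on its own.
SEVERITY_WEIGHT = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def impact_score(v: Vulnerability) -> int:
    """Blend scanner severity with business impact and exploit evidence."""
    score = SEVERITY_WEIGHT[v.severity]
    if v.regulated_data:
        score += 4   # compromise would mean significant financial/regulatory loss
    if v.exploit_activity:
        score += 4   # active exploitation trumps the host's age
    return score

def remediation_order(vulns):
    # Highest impact first, regardless of how old the host is.
    return sorted(vulns, key=impact_score, reverse=True)
```

Under this sketch, a medium-rated vulnerability on an old server holding regulated data with exploit activity scores 10, outranking a critical-rated vulnerability on a non-sensitive, unattacked host, which scores 4. That is the point of shifting from criticality to impact.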
Companies must shift the equation from criticality to impact so that their most valued assets, whether old or new, are protected first. They must also keep their IT asset and cyber risk information in a centralized place so that everyone looks at the same set of data, nothing can be hidden or modified, and everyone is on the same page about their risk status. As in the example above, the IT and vulnerability teams did not talk to each other, and as a result neither realized for two years that an unpatched vulnerability sat on a high-value, albeit old, asset.
Impact, transparency and communication: those are the key ingredients of successful vulnerability management, not age. Bad actors understand that older systems tend to be the most vulnerable because they are overlooked, and so they see dust as an opportunity.