It seems the lesson about sharing we all learned in kindergarten is front and center in the debate over sharing information about cybersecurity vulnerabilities, threats, incidents, and who knows what else. In its perpetual desire to appear to be doing something about cybersecurity, Congress has once again embarked on another ill-fated effort to pass cybersecurity legislation. Given the general opposition in the Republican-controlled House to anything that smacks of more regulation of the private sector, most of the focus has been on either imposing further requirements on the federal government itself or on removing perceived roadblocks in the private sector. One of those roadblocks is believed to be the liability risk that companies face when they share information with the government. Since the government has a reputation for leaking that puts the Titanic to shame, one would naturally be concerned that any information shared would eventually find its way to the general public through a Freedom of Information Act (FOIA) request, a mandated court disclosure, a regulatory action, or an accidental leak.
One merely has to pick up the newspaper to find examples of this. Just in the last few weeks, it was reported that the Department of Homeland Security (DHS) released 840 pages on Operation Aurora. However, it turned out that DHS had released information on the wrong Operation Aurora. The requestor had asked for information on the cyber attacks launched against Google and other major companies in 2009. Instead, DHS released information on a 2007 event, also known as Operation Aurora, in which tests at the Idaho National Laboratory demonstrated how a generator could be physically destroyed through a cyber attack. While much of the released information had been known in the industrial control system community for some time, DHS had previously resisted requests for information about the specific remediation measures it was recommending to operators of these electric generators.
But the proposed Cybersecurity Information Sharing Act of 2014 (S. 2588) promises to alleviate industry concerns. It would grant some immunity from lawsuits for sharing information about “indicators and countermeasures” with the government and private sector entities. The logic is that if someone’s private information were accidentally mixed into information shared with the government, and that information later became public, the private entity doing the sharing could not be held liable as long as the information was collected and shared in good faith. While we could get into an endless debate over the semantics of good faith, indicators, and various other terms, let’s at least grant the premise that liability protection would in fact be granted. Does that mean we’ll actually see a lot more useful information sharing? Sadly, the answer is probably no. I say that not because I doubt that information sharing is useful; I believe it is critically important. But the legislation does very little to make it happen. It removes one small barrier while leaving many others in place.
To start, organizations have little understanding of exactly what should be shared. Is it an Internet Protocol (IP) address associated with suspicious behavior, or the remnants of a piece of malicious software found after a compromise? Or is it a narrative describing what happened after a hacker broke in and destroyed an electric generator? The answer may be all of the above, and that is the problem: privacy advocates can have a field day with something so open-ended. Instead, DHS and other government entities should focus on specific use cases that provide detailed examples of the types of information to be provided, how it will be used, and the methods for sharing it with both government and private sector entities. Many will argue that construing the use cases too narrowly ties the hands of incident responders. But it is a starting point, and we may find that with greater specificity industry becomes more willing to share even without new legislation. By defining and implementing a sharing process, we will all have a better understanding of where legislation is needed and, hopefully, demonstrate that there is real value in participating.
But regardless of government action, the private sector is not standing still. For example, the Financial Services Information Sharing and Analysis Center (FS-ISAC) is rolling out software called Avalanche that will standardize the way ISACs and other entities share information in a machine-readable form using the Structured Threat Information eXpression (STIX) language. Several other ISACs, including the Industrial Control Systems ISAC (ICS-ISAC), are participating. Simply by setting up this infrastructure, organizations get a better idea of the kind of threat data they are likely to receive and, more importantly, the kind of information that would be worthwhile to share.
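To make the idea of machine-readable sharing a bit more concrete, here is a minimal sketch, in Python with only the standard library, of the kind of structured indicator a STIX document expresses. The element names here are simplified for illustration and do not follow the actual namespaced STIX 1.x schema; the IP address is a documentation placeholder.

```python
# Illustrative sketch: serialize a simple threat indicator as XML,
# in the spirit of STIX-style machine-readable sharing.
# Element names are simplified; real STIX 1.x uses namespaced schemas.
import xml.etree.ElementTree as ET

def build_indicator(ip_address, description):
    """Build a simplified, STIX-like indicator document as an XML string."""
    pkg = ET.Element("ThreatPackage")
    ind = ET.SubElement(pkg, "Indicator", {"type": "IP Watchlist"})
    ET.SubElement(ind, "Description").text = description
    obs = ET.SubElement(ind, "Observable")
    ET.SubElement(obs, "AddressValue").text = ip_address
    return ET.tostring(pkg, encoding="unicode")

xml_doc = build_indicator("203.0.113.7", "IP observed scanning control system ports")
print(xml_doc)
```

The point of a structure like this is not the particular tags but the contract: a recipient's tools can parse the document and act on the indicator automatically, rather than a human reading a prose advisory.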
To learn more about these new opportunities to share and act on real-time threat information, I encourage everyone to attend the ICS-ISAC Fall Conference in Atlanta, September 17-19. For more information on Avalanche, go to http://avalanche.fsisac.com.