A ‘Golden Key’ to Unlock Encryption Is the Wrong Approach


Posted by Robert Ackerman

This post in our VC series comes from Robert R. Ackerman, Jr., the founder and managing director of Allegis Capital.

It seems like an appealing strategy: give the FBI and other law enforcement agencies, as well as our spy organizations, a back door (a “golden key”) to unlock encrypted communications to help catch criminals and terrorists and protect Americans from harm. The idea is particularly compelling in the wake of the recent terrorist attacks in Paris and the role of Islamic State propaganda in the mass killings in San Bernardino, Calif., the worst terrorist attack on American soil since 9/11.

When Islamic State commanders find a recruit willing to die for the cause, they move their communications over to encrypted platforms, “going dark,” FBI Director James Comey has said. He has also pointed out that Islamic State militants and other terrorist groups could use encryption to “recruit troubled Americans to kill people” in the homeland.

These are scary points, but a golden key won’t help resolve them. The unvarnished truth is that it is a fictitious panacea, one analogous to fool’s gold.

In the domain of cybersecurity and encryption, the bad guys are just as smart as the good guys. Their tradecraft is focused on identifying and exploiting vulnerabilities. If there is a back door, they will find it and exploit it. At the same time, it’s hard to imagine that government agencies, which are regularly breached, could be trusted to keep such a golden key safe from hackers and criminals.

A golden key would also undermine future venture capital investment in encryption solutions, in companies such as CrowdStrike, which is known for outing Chinese and Russian hackers; Vera, which locks down transferred documents; and Keybase, which aims to make encryption easier to use. Venture capitalists will think twice about investing in these and other encryption startups if demand for a back door to encrypted systems undermines the effectiveness of encryption.

Only two months ago, it looked like efforts to demand an encryption back door would be derailed as the Obama administration backed down in a dispute with Silicon Valley over the encryption of data on iPhones and other digital devices. The administration reached the conclusion that it wasn’t possible to give American law enforcement and intelligence agencies access to that information without also creating an opening that state actors, cybercriminals and terrorists could exploit.

Unfortunately, the White House and congressional staffers have subsequently asked Silicon Valley executives to re-open talks on the matter in the wake of the Paris terrorist attacks. This is at least partly a public relations dance; Washington doesn’t want to create the impression that it’s brushing off the implications of a tragedy. There is no evidence that the Islamic State attackers in Paris relied on scrambled communications. But the U.S. Senate Intelligence Committee has theorized that the terrorists likely used “end-to-end” encryption because no direct communications among them were detected, and the Islamic State has created tutorials on how to evade electronic surveillance on the cheap.

Two larger issues are at play here, and both favor the “no back door” viewpoint. One is the Fourth Amendment of the Constitution, which states that “the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated.” Isn’t that the point of encryption?

The second issue is whether a golden key would, in fact, improve the effectiveness of the FBI and other law enforcement agencies. The FBI’s Comey, for example, has focused on the fatal shooting of a man in Illinois in June and suggested that police would have been able to track down the shooter but for encryption built into both of the victim’s phones. He failed to mention that one of the phones, a Samsung Galaxy S6, isn’t encrypted by default.

A related point is the oft-cited fact that the Manhattan district attorney’s office encountered locked iPhones on 74 occasions over a nine-month period. Bear in mind that the DA’s office handles about 100,000 cases in the course of a year; 74 locked phones works out to less than 0.1 percent of them. Also, the DA has never explained how even one of these 74 encrypted iPhones blocked a successful prosecution.

The golden key issue has put Silicon Valley at ground zero in a tug of war. Apple, Microsoft, Google and other technology companies have been encrypting more of their corporate and customer data after learning that the National Security Agency and its counterparts were siphoning off digital communications and hacking into corporate data centers. Law enforcement and intelligence agency leaders counter that such efforts thwart their ability to monitor terrorists and criminals, but Silicon Valley is standing firm.

I’m hardly the only technology industry observer who argues that development of a golden key is a fruitless endeavor. A few months ago, MIT published a paper by leading technologists arguing that it is technically impractical and would expose consumers and businesses to a greater risk of data breaches. “[An encryption golden key] is unworkable in practice, raises enormous legal and ethical questions, and would undo progress on security at a time when Internet vulnerabilities are causing extreme economic harm,” wrote the report’s 15 authors, who included Whitfield Diffie, one of the inventors of public-key cryptography.

Globally, companies are now spending more than $76 billion annually to protect themselves and often their customers from cyberattacks. We need to deploy the most effective techniques and technologies available to protect our sensitive information and the foundations of our digital economy. This includes encryption that cannot be compromised.

Our law enforcement and intelligence communities are tasked with a vitally important job, one no doubt made more difficult by advances in technology. But lowering the standards for protecting data and communications isn’t the answer. Rather, we should work on new approaches to identify bad actors. Innovation, not compromising our defenses, is the solution.

 

Robert R. Ackerman, Jr. is the founder and managing director of Allegis Capital, a seed- and early-stage venture firm focused on investments in cybersecurity startups. Previously, he was a successful technology entrepreneur.




