By Daniel J. Solove
Proponents of allowing government officials backdoor access to encrypted communications should read Franz Kafka. Nearly a century ago, Kafka deftly captured the irony at the heart of their argument in his short story, “The Burrow.”
After the Paris attacks, national security proponents in the US and abroad have been making even more vigorous attempts to mandate a backdoor to encryption.
The Encryption Backdoor Argument
The encryption backdoor argument has been made and soundly rejected many times, most notably in the 1990s, when the US government pushed the Clipper Chip, an encryption device with a built-in back door that law enforcement and national security officials could use.
The argument was renewed when Apple and other companies began offering end-to-end encryption to their users. According to national security and law enforcement officials, a back door is needed so that criminals cannot use encryption to shield their communications from surveillance.
Now, after the tragic terrorism in Paris, the encryption backdoor argument has become popular with politicians. British Prime Minister Cameron declared: “[T]he question is: are we going to allow a means of communications which it simply isn’t possible to read. My answer to that question is: ‘No we must not’.”
President Obama echoed these statements: “If we find evidence of a terrorist plot . . . and despite having a phone number, despite having a social media address or email address, we can’t penetrate that, that’s a problem.”
FBI Director James Comey stated before Congress that law enforcement needs special access to encrypted communications.
Kafka’s “The Burrow”
Kafka’s “The Burrow” aptly captures the problems with this approach. An animal builds an elaborate burrow of underground tunnels. Riddled with fear and insecurity, the creature constructs a maze of passages, misdirection, defenses, and so on. He becomes obsessed with the project, constantly doubting the burrow’s safety, constantly regretting that he didn’t build it with even more defenses. He tries to make it totally secure, but one problem remains: An enemy might invade.
The animal says: “At a distance of some thousand paces from this hole lies, covered by a movable layer of moss, the real entrance to the burrow; it is secured as safely as anything in this world can be secured; yet someone could step on the moss or break through it, and then my burrow would lie open, and anybody who liked — please note, however, that quite uncommon abilities would also be required — could make his way in and destroy everything for good.”
So the animal winds up sleeping outside the burrow to stand guard over the entrance. “My burrow takes up too much of my thoughts,” the animal confesses. “I fled from the entrance fast enough, but soon I am back at it again. I seek out a good hiding place and keep watch on the entrance of my house — this time from outside — for whole days and nights. Call it foolish if you like; it gives me infinite pleasure and reassures me.”
And that’s the irony (or perhaps more aptly put, the absurdity) at the heart of the story — the animal becomes so obsessed with his project of building the most secure burrow that he sacrifices his own security in the process.
Backdoors Undermine Security in the Name of Security
Backdoors are a huge security risk and undermine the effectiveness of encryption for everyone. A report by a group of leading security experts concluded that installing back doors would undermine security by creating an enormous vulnerability: “If law enforcement’s keys guaranteed access to everything, an attacker who gained access to these keys would enjoy the same privilege.”
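The experts’ point can be made concrete with a toy sketch. This is purely illustrative — the cipher below is a throwaway XOR construction, not a real encryption scheme, and the escrow database is a hypothetical stand-in for any “exceptional access” mechanism — but it shows the structural problem: whoever holds the escrowed keys, lawfully or not, can read everything.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. Illustration only -- not real crypto.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes):
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct

def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    # XOR is symmetric, so decryption is the same keystream applied again.
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

# Each user has their own key...
alice_key = secrets.token_bytes(32)

# ...but an "exceptional access" mandate escrows a copy of every user's key.
escrow_database = {"alice": alice_key}

nonce, ct = encrypt(alice_key, b"wire $1M to account 42")

# Lawful access works -- but so does any attacker who steals the escrow database.
stolen_key = escrow_database["alice"]
print(decrypt(stolen_key, nonce, ct))
```

The design flaw is not in any one line of code: the escrow database is a single point of failure whose compromise breaks the security of every user at once, which is exactly the vulnerability the report describes.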
About 60 of the leading technology companies including Microsoft, Google, Apple, Facebook, and Twitter have vigorously critiqued backdoor proposals because of the significant security risks that backdoors present.
When such a chorus of technology experts and companies points out problems, it is wise to listen. The security of all of our communications is of tremendous importance — and it has national security implications. If the keys got into the hands of bad guys, our financial system could be compromised. People who have access to critical systems could be blackmailed. Key research and intellectual property could fall into the wrong clutches. Private communication is not antithetical to security — it is essential to security.
Encryption is a tool that can certainly be used by the bad guys, but it is also a tool that is primarily used to keep the bad guys out. Creating a major vulnerability that will make every hacker salivate will not keep us more secure. And the terrorists, instead of saying: “Darn, the governments of our enemies now have a backdoor,” might be saying: “Great, a backdoor we can use ourselves to attack.”
Let’s not open up a backdoor to protect the front door. Let’s not make ourselves less secure in the name of security.
* * * *
This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy training, data security training, HIPAA training, and many other forms of awareness training on privacy and security topics. This post was originally posted on his blog at LinkedIn, where Solove is a “LinkedIn Influencer.” His blog has more than 900,000 followers.
Professor Solove is the organizer, along with Paul Schwartz, of the Privacy + Security Forum (Oct. 24-26, 2016 in Washington, DC), an annual event that aims to bridge the silos between privacy and security.