The Core Dump

The Core Dump is the personal blog of Nic Lindh, a Swedish-American pixel-pusher living in Phoenix, Arizona.

[By Nic Lindh on Monday, 30 November 2015]

Magical thinking about encryption and privacy

Predictably, the Paris attacks brought the anti-encryption crowd back out of the woodwork. They're at best being willfully disingenuous.

Almost as disgusting as the Paris attacks themselves were the responses: horrific Islamophobia and thought-short-circuiting fear manifesting all over the I-don’t-know-just-do-something-anything spectrum. In other words, exactly what the assholes who committed the atrocity wanted.

Waiting in the wings were, of course, calls for an end to privacy so that mass surveillance can work better. It turns out the particular asswipes responsible for the Paris atrocity weren’t even using encryption—they did their planning in the clear and moved around under their own identities. But that doesn’t matter. In mass surveillance thought space, more surveillance is better, and clearly if there had been more surveillance, magic would have kept Paris from happening. QED.

Which is face-palm territory. But these people were allowed their space in the media to push the idea that clearly a terrorist attack means we must have more surveillance to be safe.

This is an argument you can make: Terrorists and pedophiles are scary, so we shouldn’t have any privacy. I very much disagree, but sure, it’s an argument you can make. But recognizing most people don’t feel that’s a good trade-off, the argument these days is that you, solid citizen, can still have your privacy, but the people who protect you will be able to surveil the pedophiles and the terrorists.

Which is either a bald-faced lie or industrial-strength disingenuousness.

Let’s break it down: The current state of the art in Internet privacy is public key encryption. Basically a person who wants privacy on the Internet—like you do when you punch your credit card number into Amazon—has a private key and a public key. Through an amazing amount of high-level math concocted by geniuses, only you and Amazon can read the content that goes between your computer and Amazon. And it happens every time you’re on a website that’s using encryption. It’s the foundation for all privacy on the Internet.
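To make that a little more concrete, here’s a rough sketch in Python using the cryptography library. It’s not literally what your browser does when it talks to Amazon (real HTTPS layers key exchange, certificates, and more on top of this), but it shows the core property: anyone can encrypt with the public key, and only the holder of the matching private key can read the result.

    # A toy demonstration of public key encryption with the Python
    # "cryptography" library: encrypt with the public key, decrypt
    # with the private key. Real HTTPS is much more involved.
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Amazon generates a key pair and publishes the public half.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # You encrypt your card number with the public key...
    ciphertext = public_key.encrypt(b"4111 1111 1111 1111", oaep)

    # ...and only the holder of the private key can turn it back into plaintext.
    print(private_key.decrypt(ciphertext, oaep))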

The only way for the good guys to be able to read the transaction that just happened between you and Amazon is for the encryption to be broken—if somebody planted an extra private key or built a backdoor into the encryption software you were using. The software has to be broken.
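And here’s a hypothetical sketch, building on the snippet above, of what “planting an extra private key” would mean in practice. The escrow key pair and all the names are made up for illustration, not taken from any real system; the point is simply that whoever holds that extra key can read the traffic exactly as well as the intended recipient can.

    # Hypothetical key escrow sketch. The "escrow" key pair is invented
    # for illustration; it stands in for a backdoor key a government
    # might hold. Whoever has it can read everything.
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    amazon_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    escrow_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # A symmetric session key encrypts the actual traffic.
    session_key = Fernet.generate_key()
    traffic = Fernet(session_key).encrypt(b"credit card goes here")

    # Backdoored software wraps the session key for BOTH public keys.
    wrapped_for_amazon = amazon_private.public_key().encrypt(session_key, oaep)
    wrapped_for_escrow = escrow_private.public_key().encrypt(session_key, oaep)

    # Anyone who obtains the escrow private key recovers the session key
    # and reads the traffic, just like the intended recipient would.
    leaked_key = escrow_private.decrypt(wrapped_for_escrow, oaep)
    print(Fernet(leaked_key).decrypt(traffic))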

(There’s also traffic analysis to be concerned about. Even though an eavesdropper may not see your credit card, the fact that your computer and Amazon’s are talking is visible.)

It’s a complete fantasy that there’s some way to let the good guys read what you wrote but nobody else. Because, again, the software has to be broken. And what happens when the encryption software is broken? Somebody else will find the way it’s broken and exploit it.

So now it’s not just you, Amazon, and Western state surveillance. It’s you, Amazon, Western state surveillance, and a crime syndicate reading your credit card information.

Breaking encryption is not just something good guys are interested in—there are plenty of terrible, repressive states out there that would just love to read everything citizens are writing so they can throw people into torture chambers. And there are criminals, like the ones who broke into Sony, who would just love to get any kind of information they can for blackmail purposes.

If the public key encryption system works like it should, they can’t. But if it’s broken on purpose so the good guys can get in, anybody else who figures out the flaw can also get in.

That’s where the magical thinking comes in: The very idea that there’s a way to break encryption in such a way that only the right authorities can exploit the flaw is ludicrous. If it’s broken, it’s broken.

As an analogy for the scale we’re dealing with, look at the iPhone. Apple locks its phones down so that you can only do certain things with them. For example, you can only purchase software from Apple’s App Store. You can’t download software from wherever you want.

Some people are really annoyed by this and figure out ways to jailbreak their phones. Jailbreaking means finding a flaw in the security of the phone that allows for privilege escalation so you can do whatever you want. This means that every time somebody figures out a way to jailbreak the iPhone, what they’ve actually found is a flaw in the security of the phone.

There are always new jailbreaks. This is the most profitable company on the planet, employing some of the best computer engineers money can buy, and it still can’t prevent jailbreaks from happening.

Computer security is hard.

Jailbreaking iPhones is mostly low-stakes. (Mostly—obviously shady characters who want to surveil people are also extremely interested in ways to circumvent Apple’s security so they can load surveillance software.)

Imagine the lengths state actors and criminal syndicates will go to in order to find the vulnerabilities deliberately put into encryption software to provide backdoors for Western governments. That’s a James Bond-level game, and the resources put in play are unimaginable.

So let’s stop pretending there’s a way to break encryption that only the good guys will be able to exploit.

Vulnerabilities will be exploited.

If you’d like to learn more about encryption, The Code Book is a breezy, fun non-technical primer on the history of ciphers and codes. I highly recommend it.

You have thoughts? Send me an email!