The FBI asking Apple to Backdoor an iPhone is a Rubicon for Privacy

A US federal court in California has ordered Apple to backdoor a locked iPhone for the FBI. This isn’t a request to unlock a single phone; it’s a request for Apple to build a tool that lets the FBI circumvent the security on the iPhone… as in basically all iPhones, which would then set a precedent for all smartphones.

“Make no mistake: This is unprecedented, and the situation was deliberately engineered by the FBI and Department of Justice to force a showdown that could define the limits of our civil rights for generations to come. This is an issue with far-reaching implications well beyond a single phone, a single case, or even Apple itself.”

In case this is your first time reading about why government-mandated back doors are a universally bad idea, here is the quick list:

  1. A digital back door, much like a physical one, can be used by anyone who finds it, not just those authorized to access it. Back doors make excellent targets for criminals, spies, and other bad actors. These things get discovered, and then they get misused. If you are a criminal looking to steal data, knowing that there is a back door in a system lets you focus your cracking efforts.
  2. Encryption is only good when it’s secure. Insecure crypto is worse than useless because it creates a false sense of safety and control. This is why Digital Rights Management technologies never work. No matter how you slice it, a purpose-built entry point is a vulnerability. Once you introduce a back door, or a “Golden Key,” it invalidates the security (and value) of the entire system (see point 1). An insecure phone just isn’t worth as much as a secure one.
  3. The bad guys you are trying to catch are bad guys. They don’t give a single runny shit about government regulations. This means that the bad guys who use crypto will simply switch to new, illegal tools that don’t have back doors. When the SOPA bill threatened to block DNS resolution for sites accused of piracy, tools that defeated the blocks began to surface before the bill was even voted on.
  4. In the case of criminals, government-mandated back doors would create a market for secure tools. These tools wouldn’t be Made in America like the *iPhone. Back doors would devalue the iPhone (see point 3) and add value to technologies that aren’t made in the US. Meanwhile, federal law enforcement still couldn’t access phones that belong to terrorists. All the damage done would be collateral, because the only people affected by the mandate would be innocent bystanders.

There are *tons* of other reasons why back doors are bad, but those are the top four. Cory Doctorow sums up the argument against back doors fairly succinctly in an article in The Guardian:

“That’s really the argument in a nutshell. Oh, we can talk about whether the danger is as grave as the law enforcement people say it is, point out that only a tiny number of criminal investigations run up against cryptography, and when they do, these investigations always find another way to proceed. We can talk about the fact that a ban in the US or UK wouldn’t stop the “bad guys” from getting perfect crypto from one of the nations that would be able to profit (while US and UK business suffered) by selling these useful tools to all comers. But that’s missing the point: even if every crook was using crypto with perfect operational security, the proposal to back-door everything would still be madness.”

The law enforcement community declares war on crypto in one form or another once or twice a decade. Every time they do, we as digital citizens need to stand up and say “NO!” They will keep trying, and we have to keep fighting, every time. It really is that important.

*The iPhone isn’t made in America either, but Apple does employ Americans around the country. Russian mobsters or Romanian cyber-criminals presumably don’t employ many Americans.