Apple cannot bypass your passcode and therefore cannot access this data. So it's not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.

Cue the takes from former prosecutors and law enforcement officials, swearing up and down that strengthened encryption will thwart crime-fighting, lead to murders going unsolved, and so forth. Yet, as Micah Lee, Marcy Wheeler, and Kade Crockford have all pointed out, Apple has vastly oversold the privacy it provides to its users. Lee points out that Apple pushes iCloud on its customers, and that while iCloud data is encrypted both in transit and on Apple’s servers, the keys in the cloud belong to Apple; only data on the device itself is protected by keys derived from the user’s passcode. Thus the line about how “it’s not technically feasible for us to respond to government warrants…” does not apply to the cloud. Wheeler points out that service providers like AT&T and Verizon have call records and texts regardless of any encryption on the physical device. Apple can promise whatever it likes, but AT&T has a history of complying with government requests for information; so much for keeping that information private.
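To see the difference concretely, here is a deliberately simplified sketch in Python. This is not Apple’s actual key hierarchy; the names, parameters, and iteration count are all illustrative. The point is only who can reconstruct each key: a key derived from the user’s passcode can be rebuilt only by someone who knows the passcode, while a key the provider generates and stores can be used, or handed over, by the provider.

```python
import hashlib
import os

# Device model (iOS 8): the encryption key is derived from the user's
# passcode plus a per-device salt. Neither Apple nor anyone else can
# re-derive it without the passcode (short of guessing it).
salt = os.urandom(16)
device_key = hashlib.pbkdf2_hmac("sha256", b"user-passcode", salt, 100_000)

# Cloud model (iCloud): the provider generates and stores the key on its
# own servers, so it can decrypt the data -- and can be compelled to.
provider_key = os.urandom(32)
```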

So then why the outrage on the part of those who take the government’s side? Why statements like, “I helped save a kidnapped man from murder. With Apple’s new encryption rules, we never would’ve found him”? Which, by the way, is not even true. (The Washington Post subsequently changed the headline and added an editorial note to the piece, but the original headline is preserved in the URL.) Orin Kerr at the Volokh Conspiracy rails against law enforcement’s inability to execute warrants on Apple devices, yet the obvious counterpoint is that if Apple did not encrypt user data, or maintained a “backdoor” to the encryption, its customers would be more vulnerable to theft, hacking, and other malicious criminal activity. That counterpoint is entirely sensible, given how much more attractive a target a smartphone becomes once more people start using Apple Pay.

A backdoor is, after all, essentially a point of insecurity that, in theory, is only used for law enforcement purposes. But as cryptographer Matthew Green explains, “Designing backdoors is easy. The challenge is in designing backdoors that only the right people can get through. … The problem is so challenging that even the National Security Agency has famously gotten it wrong.”
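Green’s point is structural, and a toy key-escrow sketch makes it vivid. This is a hypothetical design with toy XOR key-wrapping standing in for real authenticated encryption; the weakness it shows is in the architecture, not the toy cipher. The “backdoor” is just an extra copy of every message key wrapped under one escrow secret, and the math has no way of checking who holds that secret.

```python
import os
from hashlib import sha256

def wrap(key: bytes, wrapping_key: bytes) -> bytes:
    # Toy key-wrap: XOR the key against a hash of the wrapping key.
    # Real systems use authenticated key-wrapping, but the flaw shown
    # here is structural, not an artifact of the toy cipher.
    return bytes(a ^ b for a, b in zip(key, sha256(wrapping_key).digest()))

unwrap = wrap  # XOR is its own inverse

ESCROW_KEY = os.urandom(32)  # the one global "law enforcement" secret

def session_header(recipient_key: bytes) -> dict:
    msg_key = os.urandom(32)
    return {
        "for_recipient": wrap(msg_key, recipient_key),
        "for_escrow": wrap(msg_key, ESCROW_KEY),  # the backdoor copy
    }

# Whoever holds ESCROW_KEY -- an agent with a warrant, a corrupt insider,
# a foreign intelligence service, a thief -- recovers every message key
# the same way:
header = session_header(os.urandom(32))
recovered_msg_key = unwrap(header["for_escrow"], ESCROW_KEY)
```

Nothing in that last line distinguishes the “right people” from anyone else who obtains the key; that is precisely the design problem Green describes.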

The feeling of déjà vu here has not gone unnoticed or unremarked upon: in the 1990s, the government sought extensive control over the use of cryptography, even threatening to mandate backdoors in commercial products and services. Ultimately, the government retreated from its position. Julian Sanchez describes the Crypto Wars here and goes on to explain how backdoors expose users to criminal activity and to oppressive foreign governments.

There is a strange tendency, especially among former or current government officials, to press heavily on “What if we need to find the murderer?” Well, then, what did law enforcement do before cell phones, before computers, before the internet? Simply because something makes a police officer’s or a prosecutor’s job easier does not make it good, sensible, or even worth considering in the first place. No one in their right mind would suggest that door manufacturers make doors only out of cardboard so that police battering rams could knock them down more easily.

Law enforcement should not be able to actively make everyone less safe just to improve its clearance rate. The discussion, at this point, has gone far beyond the balance of interests the Fourth Amendment safeguards. It’s just plain silly.