Apple, the FBI, and your privacy under siege

The United States Federal Bureau of Investigation (FBI) wants Apple to create a version of iOS that would allow authorities to breach the strong encryption on the iPhone and iPad and access the personal data contained within. The demand comes as part of the investigation into the San Bernardino terrorism case, but the implications and ramifications go far beyond any one case, no matter how heinous.

Earlier today, Apple's CEO Tim Cook wrote a rare letter to the company's customers on Apple.com. Here's the crux:

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today's digital world, the "key" to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
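Cook's point about the "key" being just a piece of information can be made concrete with a toy sketch (this is an illustrative hash-based stream cipher, not a real cryptosystem, and the key and message are made up): the mathematics offers no protection beyond the secrecy of the key itself.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream by hashing the key with a counter."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """XOR data against the keystream; the same call encrypts and decrypts."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret = b"private message"
key = b"correct horse battery staple"
ciphertext = xor_cipher(key, secret)

# Anyone who learns the key recovers the plaintext -- the encryption
# is exactly as secure as the protections around the key, no more.
print(xor_cipher(key, ciphertext) == secret)  # True
```

Swap in any other key and the decryption fails; leak this one and everyone succeeds. That is the whole of Cook's argument in fifteen lines.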

Make no mistake, what is being asked of Apple should horrify not just those in the U.S. but people around the world. Nothing made can be unmade. Nothing used once will only ever be used once. The moment an easy way to brute-force passcodes exists, none of us will be safe. A few criminals may be more easily investigated, but catastrophically more people will be subject to unlawful searches, hacks, theft, blackmail, and other crimes. Everywhere.

Read Cook's letter again, but substitute Chinese intelligence for the FBI. Imagine China, soon to be a bigger market for Apple than even the U.S., making this demand so it can more easily track and prosecute those it claims are criminals. Then imagine it being used by governments at war with their own citizens. Now do it again, but this time with Russia's FSB. Or once more with the NSA.

Imagine when it falls into the hands of everyone from organized crime and terrorists to lone hackers and criminals. Imagine falling asleep while the person you just met sneaks into the other room, replaces the software on your phone, and slips out with your every picture, password, message, and location. And if caught, they're just fine — they used the same back door to replace the software with an underground version that eliminates the back door.

It's the nature of law enforcement to overreach. To want our every fingerprint on file, all our DNA on record, and one day to want trackers and monitors implanted into all of our bodies. And they have a clear and understandable point of view for doing so — their goal isn't your privacy; it's prosecution and safety. But we have to be able and willing to push back against that overreach.

It's the duty of all of us to say, clearly and with unyielding certainty: "No."

Tim Cook, by publicly standing up and voicing his concerns, is doing just that. He's laying another brick on the sunlit path towards justice. We, all of us, need to lay these bricks too, and as quickly as we can.

A Message to Our Customers

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

The Need for Encryption

Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers' personal data because we believe it's the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government's efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that's in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we've offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The Threat to Data Security

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today's digital world, the "key" to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

A Dangerous Precedent

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by "brute force," trying thousands or millions of combinations with the speed of a modern computer.

The implications of the government's demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone's device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

We are challenging the FBI's demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

While we believe the FBI's intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Reader comments


Might I remind everyone here that while this is Apple and the FBI... no one will bring a case against Google, because it's up to the OEMs whether they choose to implement encryption or not.

So, we're basically in favor of a privacy feature Apple wasn't even using before iOS 8... one that still weighs in whether users like it or not, as it's only just come to light. This could stand up in court on its own.

It is about one phone, but the FBI is blowing it up into a "we need a special version without any security features," "time to panic" kind of moment, when what they really need is just to unlock the one phone; it's always been about that. Either no one knew what they were doing when they changed the password, or the worker who changed the password had no idea how iCloud works, opening the way for an FBI case.

Also, I'm all for privacy with Apple, though it's not like they have been doing this all along in iOS.

Cook has turned this thing into a media circus. We're talking about one iPhone 5, people, which could have been unlocked by Apple and then given to the govt. without anyone ever knowing what the **** code was. When Apple loses this case, I suggest that everyone who bought a #StandWithApple t-shirt or jacket ask for their money back. No matter what Cook says, it's all about the money.

I #standwithapple because everyone will become a detective and start sneaking around each other's cell phones. Most of us keep all our personal details on our phones, like bank details, credit/debit card passwords, email ID passwords, and business details, to make our work easier, and if this happens, where will our privacy go? So please don't let this happen.

My understanding of the situation is this. One iPhone 5 has info on it, put there by a terrorist, that will be erased if the right code is not entered within three tries. Apple has the technology to let a person enter an unlimited number of tries without the phone doing a "factory reset." If that is the case, why doesn't the govt. give the phone in question to Apple and let Apple remove the three-try limit? Then Apple can give that phone back to the govt., and the govt. can find the right code to get the info without running the risk of the phone being erased. It seems like a win-win situation.

You're not the FBI, you don't have a warrant, and you didn't kill me and take possession of my phone as part of an ongoing terrorist case. Complete the first three tasks, and yes, I'll allow Apple to give you my passcode. The FBI isn't trying to get into the phone of some guy who is alive. They are trying to get into a DEAD TERRORIST'S smartphone, not to mention Apple has already cracked phones for the FBI 70+ times! Google it.

Let Apple keep the phone, crack it, and just copy the contents of the phone and give it to the
FBI. That way they don't have to worry about their software crack getting into the hands of hackers...who would still need your phone to get into it anyhow.

The court orders and the FBI's demands have nothing to do with solving unanswered questions required to justify a false narrative in a case. This is about coercing a private company to create a tool for the government to behave outside the 1st, 4th, and 5th Amendments and break the security of all who deserve it, which is everyone on earth, regardless of which sovereignty they live within and whether they are preserving their HUMAN rights using an Apple, Microsoft, Google, or other manufacturer's product.

***Three tall athletic men (http://countercurrentnews.com/2015/12/eyewitness-to-san-bernardino/#) in tactical gear (that most tactical operations mercenary contractors wear) were the San Bernardino assassins, not some guy and his wife, who was even smaller and more inexperienced with weaponry than Adam Lanza and the alleged Virginia Tech shooter.***

Debate is great, but it's spiralling into areas the FBI hasn't actually asked about.

What they're asking is for the attempt counter to be frozen at 0 and for a way to test passwords a bit quicker.

The FBI is basically asking for a cheat tool, similar to what's commonly available for video games, that freezes an area of memory or constantly writes a number to that area.
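The "cheat tool" analogy above can be sketched as a toy simulation (the class, passcode, and wipe threshold are all hypothetical; this is not real iOS behavior): an outside hand pins the failed-attempt counter at zero, so the wipe threshold is never reached and every guess stays available.

```python
# Toy passcode lock with an auto-erase threshold, plus the "freeze"
# that defeats it by resetting the counter before each guess.
class PasscodeLock:
    WIPE_AFTER = 10  # auto-erase after this many failures

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= self.WIPE_AFTER:
            self.wiped = True
        return False

lock = PasscodeLock("7391")
for n in range(10000):
    lock.failed_attempts = 0          # the "freeze": counter never advances
    if lock.try_unlock(f"{n:04d}"):
        print(f"unlocked with {n:04d}")
        break
```

Without the reset line, the eleventh wrong guess would set `wiped` and the data would be gone; with it, all 10,000 codes can be tried safely.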

This is absolutely not the same as what others are debating above.

Applauding Apple for resisting is great, but it's about 10 years too late for privacy in general... there are plenty of government-sponsored projects, started or enabled under different administrations, whose goal is to mass-surveil everyone whose packets traverse any accessible router on the internet or a carrier network. If you haven't seen it already, it's worth watching Citizenfour.

The idea that Apple could lose customers because they comply with a legal ruling is a bit bonkers. What customers would they lose, exactly? Would I leave Apple because I was worried a court could compel them to unlock my phone and reveal what? My bank details? My porn collection? My high scores? Pictures I took of TV prices in Costco? If I were serious about any of those, none would exist on my highly connected portable device, which spends the in-between times communicating with dozens of services, sending God knows what to each.

Everything but the pictures is potentially available externally anyway, or had to be sent to the phone in order to store it there. The sending bit is already covered by many, many different government programs, some of whose goals are to capture and hoard data for future analysis. Plug in a MAC address, SIM card number, or combinations of other pieces of reasonably unique data, and out comes the history.

This is principle getting in front of practicality, and not really common sense to me.

I respect Apple's decision in this case. Apple is a huge company, and it is important to them that customers trust them, and therefore that they protect customers' privacy. If Apple accepts a "backdoor policy," they will lose customers and Apple's image will be destroyed.

So why doesn't the FBI allow Apple to unlock the phone while maintaining control of the software they use? Apple is seeking publicity. The FBI doesn't need the software, just a court order to compel Apple to unlock a phone under warrant. All Apple needs to do is provide the means to bypass the counter on unlock tries.

Here's something else to consider: Apple (actually all the smartphone makers) may actually be in violation of FCC rules mandating that manufacturers build their telephony devices to be tappable. The required court warrant is (in theory) protection of Americans' Fourth Amendment rights. Now I know there's a lot more to smartphones than the "phone" part, but legally they're first and foremost a phone, like it or not.

So if the next president plays hardball, Apple could have a hard time getting FCC approval for the next iWhatever.

There should be no backdoors. It's like a dorm with 100 residents where one stole something: the warden doesn't want the police to enter just because he houses a criminal, yet the police still want to barge in and demand keys that open not only the thief's room but everyone else's as well, for their safety.

I totally agree with Tim Cook. There is NO excuse, and NO reason why any government should have access like this to someone's personal computer or phone. Freedom is too important to be afraid of some terrorists or criminals. They already monitor the network. Getting access to people's phones isn't necessary and will only create exploitable issues for the American people.

Screw criminals. This is not about criminals. This is about the privacy of the user. Doing this would actually give in to terrorism and fear, à la Homeland Security, the "Patriot Act," etc. We might as well give up both our security and our freedom.

Creating a backdoor seems to be a separate issue, more along the lines of the government compelling a vault or safe-making company to pick a lock for a vault it made. If, in theory, they could crack the lock, could they refuse to attempt to do so on the grounds that they can't be compelled to try because, if successful, they'd render all their vaults vulnerable? I think the government wins that argument easily -- even on simple regulatory grounds. There are only a few constitutional rights to privacy that are inviolate and cannot be compelled under any circumstances. These are things like doctor-patient, priest-penitent, husband-wife, and so on. Even with those, or some of them, there might be narrow exceptions.

Been thinking about the whole Error 53 situation. If this were a newer phone, the bad guys could have deliberately replaced the home button so that the next update would brick the phone, making this "special OS" plan useless. Hmm, putting my conspiracy hat on here, maybe Apple designed it that way so the idea of a "special OS" would be moot.

If Apple wins this fight it could be smoke and mirrors. It's certain that the US government has keys to Android, which is why Google is quiet on this. It's too soon to tell about Apple, Cook could be the real deal.

It's not about the government snooping. I couldn't care less if the FBI, CIA, or NSA can snoop in and see what I'm doing online. It's the fact that this "backdoor key" would eventually get into the hands of hackers, and then it's bye-bye credit rating when they blow out your credit cards. If the government wants this that badly, they should be responsible for ALL hacking done afterwards.

Finally, someone who gets the issue. More about security than privacy, although both are equally important imho.

So if the governments of the world demand a way into each and every phone, won't bad actors simply go to third-party encryption options, leaving the only vulnerable people those who have nothing to hide anyway? Vulnerable to hackers who could steal, wipe, or otherwise mess with their data.

This is a very slippery slope, one which could lead to a very bad place. As an Apple user I'm pleased to see them standing up for their customers. In fact I demand it.

It is only a matter of time before the government creates an executive order or just plain passes a law stating the information on your phone or in the cloud is not yours. Then it will be legal to take everyone's information. The government is already taking it illegally; why stop there? It is bad times when the government spends more money monitoring its people than protecting them.

Why is everyone on here so paranoid to think that the government would even WANT to know what you are up to? Good grief...not everyone is that important that others would give a crud about who you are texting, etc. Gotta ask...what are you hiding? Satellite imagery can spot any of us outside our house at any time...get over it.

This generation needs to get their faces out of their phones and talk to people and stop being so ego-centric to think anyone would even care what they are doing privately.

The ruling class that owns the large corporations also owns the (s)elected officials in the federal government, including the presidents (the difference between the Republicans and Democrats doesn't make much difference). It is in their interest to have access to all our personal information--they have always found ways to control the peasants. Propaganda has been their best invention (early 20th Century) and is what has made the USA the most dangerous, rogue entity in history.

It's very interesting that the most successful private corporation in history has a gay CEO saying the company won't cooperate with the ruling class' efforts! (By the way, the 1% don't really care about terrorism--they created it, after all. State Terrorism--traditionally termed "imperialism"--is one of their favorite tools.) As others have said, it will be entertaining to see how this plays out.

In the meantime, millions suffer at the hands of the 1% and believe the propaganda. At least we can check TwitFace and play Bird Crap.

I'm sorry, but what is the connection between iPhone security and your Facebook password? If I'm not mistaken, the majority of hacks are done on PCs and not phones, so you can have brilliant security on the phone, but with an easy password you are at risk and will probably be attacked, monitored, and hacked.

Strong encryption is the key to our technological development. Today hackers try to steal data from our smartphones and computers, but tomorrow the targets will be our cars and home systems. The precedent of forcing Apple to create a backdoor to one of its OSes will make our tech future not so bright.

That won't happen in the near term. Right now, collaboration and communication is not happening through cars. The government is looking to access communications that could help them apprehend other terrorists and perhaps understand potential other terrorist activity. And of course, all of that is happening through communication devices like phones, laptops, etc. (not cars).

This is a very interesting topic. I came over here to iMore to see what the discussion was like. Being a CrackBerry member, I don't come here often, but I wanted to read the debate that would ensue. I do understand both points of view, but I lean in the direction of giving the government access to the data if it's done through a robust and defined process. I don't know if that should be through the Supreme Court, but I think there have to be clearly defined criteria for when this can be used, and it needs to be agreed to or approved by a trusted body (which, again, is a difficult thing to define).

I keep hearing about having to load a new iOS version to allow the government to bypass certain security measures. Does Apple not have the master decryption keys? Can't they just decrypt the device and give the data to the government (without obviously handing over the keys)?

No, the whole point is that there is no key. The FBI wants Apple to build a special version of iOS that they could update the phone with. This version would not have the ability to brick the phone if too many attempts are made to unlock it. At that point the FBI can use brute force (trying all passcodes from 0000 to 9999) and unlock the phone. Of course, if the phone was set up with a full password, it could take a very long time to figure out.
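Some rough numbers put the "very long time" in perspective. This back-of-the-envelope sketch assumes guesses are limited only by the roughly 80 ms per-guess key-derivation delay Apple describes in its iOS security documentation (about 12.5 guesses per second); the figures are estimates, not measurements.

```python
# Worst-case brute-force times at ~12.5 guesses/second, the rate implied
# by an ~80 ms key-derivation delay per attempt (assumed, see lead-in).
GUESSES_PER_SECOND = 12.5

def worst_case_hours(keyspace: int) -> float:
    """Hours to exhaust the whole keyspace at the assumed guess rate."""
    return keyspace / GUESSES_PER_SECOND / 3600

print(f"4-digit PIN:             {worst_case_hours(10**4):10.2f} hours")
print(f"6-digit PIN:             {worst_case_hours(10**6):10.2f} hours")
print(f"6-char a-z/0-9 passcode: {worst_case_hours(36**6):10.0f} hours")
```

A 4-digit PIN falls in well under an hour, a 6-digit PIN in about a day, while even a short alphanumeric passcode pushes the worst case into years — which is why removing the retry limit matters so much more for simple PINs.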

The first intelligent Apple Car will be operated by a Car OS, just like iOS for smartphones, watchOS for the Watch, or tvOS for the TV. Apple realizes that consumers' trust in safe and secure technology is the key to future development; nobody will drive a car controlled by software they know has weak points. And of course, all data from suspicious cars would be crucial for FBI investigations, just as they are now making arguments about smartphones.

It seems the FBI is looking for a way to electronically allow passcodes to be input, as well as disabling the erase-after-10-tries feature.

Is this a "software update" that disables these? Would that involve both a special build of iOS and allowing it to be verified on Apple's servers? Would this verification requirement help keep the genie in the bottle?
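On the verification question: an iOS device will only install an update that carries a valid Apple signature, which is part of what keeps the genie bottled. A toy model follows; the key name is made up, and a shared-secret HMAC stands in for Apple's real asymmetric, server-personalized code signing.

```python
import hashlib
import hmac

# Hypothetical signing secret -- in reality this is an asymmetric key
# pair whose private half never leaves Apple.
VENDOR_KEY = b"hypothetical-apple-signing-key"

def sign_update(firmware: bytes) -> bytes:
    """What the vendor does: produce a signature over the firmware image."""
    return hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()

def device_will_install(firmware: bytes, signature: bytes) -> bool:
    """What the device does: refuse any image whose signature fails."""
    expected = hmac.new(VENDOR_KEY, firmware, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

official = b"ios-update-9.3"
sig = sign_update(official)
print(device_will_install(official, sig))          # True
print(device_will_install(b"tampered-build", sig)) # False
```

Even if the special build leaked, a device would reject it once Apple stopped signing it — though of course the signed original could keep working anywhere as long as its signature stayed valid.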

I haven't updated a locked phone in a while; does it need to be unlocked prior to installing the update?

Does DFU mode delete data? Does it bypass the lock code?

Is there anything the jailbreak community could add to help sideload this kind of software that the FBI creates in-house?

Remember when you saw a line in a movie like "Privacy is often in conflict with the needs of the state", and you thought it was some other country they were talking about? No more.

The needs of the state now trump all personal freedoms.

This time the reasoning is terrorism.

But wouldn't it be the same for a mass murderer?

How about a double homicide? Does that qualify? If it's OK on two murders, then how about any murder trial?

Then how about treason or human trafficking? Surely those also qualify.

Where is the line? Suppose I stole 500 million dollars from someone using my computer. Would we stop there?

And does every government on earth get to compromise any mobile phone they have in their possession? Or do we fantasize that this is some sort of US-only power, and other governments will just stand by and not demand the same thing?

Or is Apple supposed to make decisions about this? Or maybe you think the US courts get involved in these disputes worldwide?

Just sic Jack Bauer on Tim. "Tell me what I need to know," lol. Oh well, I don't really take any of this at face value. It could be all smoke and mirrors. I'd be kinda shocked if our intelligence agencies didn't already have some method of doing this.

Many of the three-letter agencies have methods, but they are usually unintentional holes in security (in the software, or in humans) which they discover and keep for themselves, or at least keep secret from the public, e.g. the Snowden leaks.

"Nothing made can be unmade. Nothing used once will only ever be used once."

Spot on, Rene. It's that exact thinking that symbolizes what Apple is saying here... it's not just a matter of Apple not wanting to do this because of their stance on privacy... it's setting a precedent. They cannot and will not, because it opens a door to future abuse.

Oh god pipe down with that Google will sell you nonsense, please.
Seriously, how on earth do you think Apple gets market research data? I promise you it isn't just from watching their own customers' habits.
As no doubt with Google et al., they use all sorts of holding/phantom companies to distribute money and manipulate information amongst and between themselves.
Oh, so company A doesn't allow you to do B, but guess what, it owns company X and that does it, or it buys the service/item from company Y instead, and the circle leads back to A by going through company Z in the Cayman Islands or somewhere ridiculous.
The corporate sharks are ethical when it suits them.

Back to the topic at hand, it's a very, very difficult balance. You need to put yourself in the shoes of someone who has been saved by the use of such tactics, and guess what... suddenly you think different, if you'll pardon the pun.
I don't know where I stand, but the only part of Tim Cook's recent boring and contrived novel that is correct is where he says that this needs to be discussed. In depth.

All companies are interested in protecting their customers, it doesn’t necessarily mean that they care about them.

It doesn't matter if the government cares about what I do. If the government can shut down any agitator using would-be (should-be) private information, I have a huge issue with that. I don't care just about myself.

Oh, but they do. And they track everything about you. A few years ago I was definitely in the camp of "I'm not doing anything wrong, they don't care about me, why should I worry?" But after I did some research, it's clear that information collected by the government affects you in many instances. I'm not gonna go into details in a comment, but you should look it up. It really opened my eyes on this.

Then why was the NSA recording and capturing literally thousands of text and phone conversations without a warrant? Is it worth it if someone gets hold of your credit card information and puts you through years of recovery? Will you really be safer? Just because you are willing to surrender your freedom doesn't mean everyone else is willing to do the same.

Does it really ******* matter if the government could give two rat ***** about someone who is just a nobody?? Hey, I personally have nothing to hide, and the government is more than welcome to take a look at the cesspool I made specifically for people who wanna nose around my phone. And believe me, I have put stuff on my phone that has scared nosy-*** people out of my life.

Who the **** said it didn't matter to me?? And why on earth should I waste my time emailing them my passwords?? The government is already doing it to me and you, as you can see. I'm better off wasting my time sipping on some lemonade while watching them do it for me, which, from the looks of it, they sure are making great progress at (*cough*).

Also, what are you (or anyone for that matter) doing about it?? Honestly, those that aren't doing anything about it (myself included) really shouldn't be complaining about any of this.

You said "...the government is more than welcome to take a look..." so surely you can see how it comes across that it doesn't matter to you.

The whole "email your passwords to the FBI" comment was obviously not meant to be taken literally.

I choose to buy from companies that I believe are going to be vanguards of my privacy. I'm also on iMore trying to convince people that the "I've got nothing to hide" argument is wrong, and why people should care about the security of the devices and services they use, and the privacy of the information they'd like to remain so.

It's not about you or me. And it's not about the government. If one government can do it, then the next one also wants to and can do it. Including governments you may not trust. And when they all can do it, then so can others.
Do you really want to allow anybody to enter your phone, or the phones of your kids and your friends?
A backdoor is a backdoor, and this one will have no lock. Does your house have such a door?

The house-lock analogy doesn't work in your favour. Those locks have keys. And they sure as **** don't keep law enforcement from collecting evidence from within, should a crime have taken place there or been committed by you in any capacity. That would then become reasonable grounds to lawfully check your personal belongings. The problem here is trust, or the lack thereof, in those who have been put in place to protect the population. I get that. But that is a different issue and cannot hinder the ability to actually do something that can and will provide better ways to fight this sort of crime in a new age.