Apple CEO Tim Cook has issued a bold refusal to comply with a judge’s order requiring the company to unlock an encrypted iPhone used by one of the terrorists responsible for the attack in San Bernardino, California.

The Justice Department had sought a court order to compel Apple to unlock the iPhone 5c belonging to Syed Rizwan Farook, accused of killing 14 people at a holiday party at the Inland Regional Center in San Bernardino on 2 December 2015.

Both Farook and his wife, Tashfeen Malik, were killed in a gunfight with police, but the FBI says it needs access to Farook’s iPhone to determine who he was communicating with prior to the attack.

However, Cook contends in an open letter to customers, published yesterday on Apple’s website, that such an order would require Apple to create a new version of the iOS operating system – one with a backdoor.

Making things even more difficult for investigators, or anyone else attempting to access a locked iPhone without the passcode: entering 10 incorrect passcodes will automatically wipe the device if the “Erase Data” setting has been turned on.
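To make the "Erase Data" behavior concrete, here is a minimal sketch of that retry-counter logic. This is a hypothetical model for illustration, not Apple's actual implementation; the class and method names are invented.

```python
# Hypothetical model of the "Erase Data" setting: after 10 consecutive
# incorrect passcodes, the device's key material is destroyed, making
# the encrypted data unrecoverable.

MAX_ATTEMPTS = 10

class Device:
    def __init__(self, passcode, erase_data_enabled=True):
        self._passcode = passcode
        self.erase_data_enabled = erase_data_enabled
        self.failed_attempts = 0
        self.wiped = False

    def try_passcode(self, guess):
        """Return True if the guess unlocks the device."""
        if self.wiped:
            return False  # key material already destroyed
        if guess == self._passcode:
            self.failed_attempts = 0  # counter resets on success
            return True
        self.failed_attempts += 1
        if self.erase_data_enabled and self.failed_attempts >= MAX_ATTEMPTS:
            self.wiped = True  # data effectively erased
        return False
```

With the setting on, ten wrong guesses leave even the correct passcode useless afterwards; with it off, an attacker can keep guessing indefinitely, which is why the FBI's request centers on disabling exactly this kind of safeguard.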

Creating a backdoor to bypass Apple’s encryption would “[threaten] the security of our customers,” Cook says, and a backdoored version of iOS – which “does not exist today” – would be “too dangerous to create”:

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software – which does not exist today – would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

However, Apple has been willing to cooperate with law enforcement in general and in the San Bernardino case specifically.

According to Cook’s letter, Apple complies with “valid subpoenas and search warrants,” and has provided data “in our possession” to the FBI when asked.

Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

Apple may be able to turn over data stored unencrypted in iCloud, for example, as it reportedly did in a criminal case last year when the FBI requested access to encrypted communications between suspects in a drug case.

However, accessing data stored on an encrypted iPhone would require Apple’s engineers to build the equivalent of a “master key,” according to Cook.

And once a master key that can unlock any iPhone is created, there’s no way to guarantee that knowledge of how to exploit such a backdoor would not get out and threaten the security of every Apple user.

Ultimately, that’s too great a risk, Cook says:

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

This latest court battle represents another major setback for law enforcement in the ongoing “crypto wars.”

FBI Director James Comey has repeatedly said that law enforcement investigations are thwarted by criminals and terrorists “going dark” through the use of encrypted devices and apps.

Lawmakers from the US Congress, on down to state legislators in California and New York, have proposed laws compelling technology companies to create backdoors for law enforcement access.

But, so far, the tech companies and privacy advocates are winning the fight against backdoors that would weaken encryption.

The fate of unbreakable encryption, and our collective security, may depend on how Apple’s legal fight plays out.

SOPHOS STATEMENT ON ENCRYPTION

Our ethos and development practices prohibit “backdoors” or any other means of compromising the strength of our products for any purpose, and we vigorously oppose any law that would compel Sophos (or any other technology supplier) to weaken the security of our products.

18 comments on “Apple says NO to iPhone backdoor in terror case”

Good for Apple and good for Sophos. While I realize the importance of the data that may or may not be on the iPhone in question, I realize even more the importance of both Apple and Sophos living up to, or at least striving to adhere to, their commitment to maintaining the highest level of security in their products. Apple and Sophos have put the security of millions of users worldwide, both now and in the future, ahead of a what-if scenario, and for that I applaud them.

You seriously can’t imagine how the information stored on a smartphone would be useful to law enforcement during an investigation? This isn’t about “big government” trying to do evil. Stupid maybe, but not evil. I don’t think they’re suggesting being able to do this without jumping through the normal legal hoops. The problem is it has too much potential to be abused and put others at risk.

While I agree that the information obtained would probably be of some use, and I also agree that the government isn’t likely asking for blanket access in the future, I don’t think abuse by unscrupulous individuals is the only concern here. The government has shown through all of the Snowden revelations that it is more than happy to create a secret court and secret NDA letters in an effort to half-heartedly “police” itself. Give them the keys this time around and they’ll never stop asking (or potentially abusing) whenever it suits them, unfortunately.

Several circulating articles refer to a firmware flash that would disable the passcode input delay on multiple failed attempts, and the data erasure after 10 consecutive failed attempts. This would allow the FBI to brute-force the iPhone.

You can read the court documents, including what the FBI are after at a minimum, from the link in the article.

Quite how Apple could/would do that if they wanted to isn’t clear, nor is the issue of whether there’s a middle way in which the “cracking” would be done in a neutral location and all the FBI would get is the passcode.

I guess part of the problem is that Apple would have to hand the phone back afterwards…whether this would leave behind some re-usable “backdoor” code I don’t know…

Even if Apple could produce what the police want without giving anyone outside Apple anything useful, the “backdoor” software would exist at Apple and would be a rich target for hackers or unhappy employees. Even if kept secure, it would be a rationale for the police (local, state, federal, international, Chinese – how would you rationalize denying the Chinese???) to come back in the future with another iPhone to unlock. Even if Apple destroyed the software after this current incident, the knowledge would still be out there and someone would eventually recreate it.

It seems like the ideal solution would be to ensure future generations of iPhone either have the encryption key embedded with the firmware, so that it is removed when the phone is flashed, or add a safeguard so the phone cannot be flashed while locked…if that is feasible. Smartphone architecture is not my forte.

The interesting part will be what happens when Apple tries to implement that solution in future products, now that the federal courts are trying to set a precedent.

I will let an American Founding Father speak for me.
As Benjamin Franklin put it: “They that can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.”

Why wouldn’t they go after the network to get a log of who they had been calling and receiving calls from? Even if there were a backdoor for the iPhone, I wouldn’t consider this a valid case to use it. This highlights EXACTLY why such a backdoor is a bad idea. These investigators need to get their act together. Searching someone’s private information because it MIGHT have some evidence is just lazy police work.

I believe it is because they are using this to “prove” to the public that encryption is evil and they must have built-in access to do their jobs. In other words, I believe it is a public relations maneuver using a tragedy in an attempt to get the public to demand back doors so politicians will have to respond with legislation. Rahm Emanuel has been quoted as saying something along the lines of “Never let a good crisis go to waste” and this is more evidence of that mentality.

Data they can get without accessing the phone itself: Call history (from the Telcos), Browser search history (if google was used), Possibly location history.
What they don’t get: Locally stored data and encrypted cloud storage.
If I were an FBI guy, I would start with the call history, interview the contacts, and check their data storage after persuading them, by warrant if necessary, to unlock their phones/PCs themselves. All of which I expect they did within hours of identifying the killer.
I see this as only an excuse for a backdoor to everyone’s equipment.

From what I’ve been reading, it probably is possible for Apple to write code in an iPhone software upgrade that would ignore the switch to erase the contents of the phone after 10 failed attempts. If it is possible, and if someone can make a profit from it, it will probably happen. If those two premises are correct, what is the best way for the Feds and for Apple to move forward with this?

The Feds have thrown up a thinly veiled red herring, in that there is very likely zero information on that phone. After all, the jihadist perps destroyed both of their personally owned cell phones and destroyed any computer storage they had. Obviously the county-owned cell phone held no interest for the perps themselves. The goal of the Feds is all about forcing Apple to provide a backdoor or, at the least, establishing a precedent.

“SOPHOS STATEMENT ON ENCRYPTION
Our ethos and development practices prohibit “backdoors” or any other means of compromising the strength of our products for any purpose, and we vigorously oppose any law that would compel Sophos (or any other technology supplier) to weaken the security of our products.”

Well, after David Cameron has his say on this, let’s see how much your ethos will really be worth in the future 😉