The request may signal a new showdown between law enforcement and tech companies.


Update: Attorney General William Barr ratcheted up the standoff when, according to The New York Times, he declared the shooting an act of terrorism. Barr issued an extraordinarily high-profile call for Apple to provide access to the gunman's two iPhones and said that, to date, Apple has provided no "substantive assistance" in doing so. The development further suggests that the high-stakes 2016 clash pitting privacy against national security is likely to play out again. What follows is the article as it originally appeared on 1/7/2020:

In a move that may signal another high-stakes clash over encryption, the FBI is asking Apple for help decrypting two iPhones believed to have belonged to Mohammed Saeed Alshamrani, the man suspected of carrying out a shooting attack that killed three people last month at the Naval Air Station in Pensacola, Florida.

The request came in a letter FBI General Counsel Dana Boente sent to his counterpart at Apple on Monday, NBC News reported. Boente said that, although FBI investigators obtained a search warrant to examine the phones, investigators have been unable to guess the passcodes needed to unlock them and decrypt their contents. Complicating matters, 21-year-old Alshamrani fired a round into one of the phones. A second lieutenant in the Saudi Royal Air Force, Alshamrani died in the December 6 shooting. An FBI spokeswoman confirmed the sending of the letter but declined to describe its contents, citing an ongoing investigation.

In a statement, Apple officials wrote: "We have the greatest respect for law enforcement and have always worked cooperatively to help in their investigations. When the FBI requested information from us relating to this case a month ago, we gave them all of the data in our possession, and we will continue to support them with the data we have available."

Company representatives didn't say whether Apple would comply with the government's request.

A new standoff?


The letter and Apple's response may be the first steps in reviving a national standoff over encryption and the responsibilities technology companies face in helping law enforcement bypass it in their products. In 2016, the FBI obtained a court order requiring Apple to help unlock and decrypt the iPhone used by Syed Rizwan Farook, who killed 14 people and injured 17 others in a 2015 shooting rampage in San Bernardino, California.

Specifically, the FBI wanted Apple to create a custom firmware version that would bypass a protection that wipes an iPhone clean after 10 failed attempts to enter a passcode. In court documents and congressional testimony, FBI officials said they had no other way to access the contents of the iPhone so that investigators could determine if Farook and his wife (who also participated and died in the shooting) acted in concert with others to carry out the deadly attack. The government invoked an 18th-century law called the All Writs Act in seeking Apple's assistance.


Apple vigorously resisted the FBI request. In a spirited letter to Apple customers, company CEO Tim Cook warned that once the backdoor was created, it would pose a threat to all iPhone users. Cook argued that if Apple was compelled to bypass the protections on the shooter's iPhone, it would set a dangerous precedent that would undermine the privacy and security of people everywhere.

"The government suggests this tool could only be used once, on one phone," he wrote. "But that's simply not true. Once created, the technique could be used over and over again, on any number of devices."

The FBI ultimately withdrew its demand after an outside party provided a technique that unlocked the phone. Critics of the government request said the reversal bolstered their argument that the extraordinary assistance investigators sought was unnecessary because they had other, less-intrusive means to decrypt the contents of the phone. The US Justice Department's Office of the Inspector General later concluded that FBI personnel failed to exhaust all of the remedies available for unlocking Farook's iPhone.

"We believe [the FBI's Cryptologic and Electronics Analysis Unit] should have checked with [Operational Technology Division's] trusted vendors for possible solutions before advising OTD management, FBI leadership, or the [US Attorney's Office] that there was no other technical alternative and that compelling Apple's assistance was necessary to search the Farook iPhone," the report stated.

So far, the government has given no indication it plans to seek an order compelling Apple to defeat security protections in Alshamrani's device. But the request, and Apple's response, are both prerequisites before such an order could be obtained.

The San Bernardino shooters were also dead. Apple was not defending their privacy by fighting the 2016 court order. As the article makes clear, Apple's objection was that once the tool exists to retroactively remove the failed-passcode limit from one device at one time, it is effectively removed for all devices at all times.

It also (not that it matters) raises questions about the purpose of searching through this person's phones, other than setting a precedent to force decryption of other devices.

That part I disagree with. My view is that law enforcement has perfectly justifiable reasons to do everything in its power to gain access to Alshamrani’s phone. He has actually committed a violent crime with terrorist overtones, and it is law enforcement’s job and duty to the populace to ascertain whether he acted alone, or still has dangerous colleagues in sleeper cells somewhere.

The actual moral question is whether this objective is strong enough to justify compromising privacy and security for the non-Alshamranis of the world, by forcing Apple and others to build backdoors. On this part I agree with the general view of the Ars commenters that no, our privacy and security should not be compromised for the sake of a few terrorists.

I can’t imagine that the government really thinks it can stop people from encrypting data beyond the government's ability to recover it. As long as the people designing the encryption know the state of the art in decryption, it is always easier to add more scrambling than to come up with better techniques and more powerful machines to do the descrambling. Even in famous cases like the decryption of German Enigma messages during the Second World War, it would have been trivially easy to defeat the British decryption effort if the Germans had known what their capabilities actually were. Nowadays the state of the art is public knowledge, and any terrorist or criminal group worth really going after could probably create its own custom programs without back doors, or hire it out easily enough.

Unless the law is that all encryption must have a back door, in which case secure encryption itself becomes a crime.
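The asymmetry the commenter describes can be put in numbers: each key bit costs the defender almost nothing, but doubles the attacker's brute-force work. A toy calculation, assuming a hypothetical (and generous) attacker testing 10^12 keys per second:

```python
# Defender cost grows linearly with key length; attacker cost grows
# exponentially. The attack rate below is an assumption for illustration.

ATTEMPTS_PER_SECOND = 10 ** 12
SECONDS_PER_YEAR = 365 * 24 * 3600

def years_to_search(key_bits: int) -> float:
    """Years to exhaust a key space of the given size at the assumed rate."""
    return (2 ** key_bits) / ATTEMPTS_PER_SECOND / SECONDS_PER_YEAR

for bits in (56, 128, 256):
    print(f"{bits}-bit key: {years_to_search(bits):.3g} years")
```

A 56-bit key (old DES) falls in under a day at this rate; a 128-bit key takes on the order of 10^19 years. Adding one bit of key doubles the search time, which is the "easier to add more scrambling" point in concrete form.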

Powers, once granted to the government, will only ever be expanded upon.

...and still won't do a single thing to slow down actual criminals.

It's like saying it's a crime to rob a bank while wearing a mask. You think the guy holding up the bank will care that it's a secondary crime to wear the mask if it increases his chances of success in getting away with the first crime of robbing the vault?

When Bill Barr forces Ivanka and Jared to give us the contents of their WhatsApp conversations with Russian / Saudi operatives, I will take him seriously. Until then, Bill Barr is a threat to democracy and needs to be removed.

The combination of Trump, McConnell and Barr pose an existential threat to democracy in the USA.

But doesn't that imply that Apple is a virtuous entity? Why should a private company impose its choices on a government (even more, a democratically elected one)? If we want to have a theoretical debate, let's have that one.

No, if Apple believed they were a virtuous entity, they'd create a back door and protect the key themselves. Their argument is that end-to-end encryption is an available thing, and that in a democratically elected government with a presumption of a right to privacy, end users should have the means to keep their information private, even from Apple. That's why Apple can't decrypt either.

Their argument is that if you need that information, you're going to have to work for it. Their argument is that the need for that information is rare, so that's not a problem. If the government wants to argue that need for that information should be common, they should have to explain that.

Wouldn't matter what Apple believes about being a virtuous entity. Apple knows that creating a back door and keeping a key creates a vulnerability that can and will be exploited by non-Apple entities.

Apple's commitment to security outweighs (so far) its concerns about being a virtuous entity or not. So no back doors.

I don't follow you. Apple maintains its commitment to security is part of being a virtuous entity.

What are you suggesting?

Apple maintains its commitment to security as a competitive advantage over other OEMs in the free market. If security was instead a competitive disadvantage (e.g. if the security implementation was a substantial inconvenience for users' daily operation, such that they migrate to Android in large numbers), Apple would likely not maintain its commitment to security.

That said, the devices out there, right now, are secure (absent Cellebrite). Apple hypothetically changing its stance for the iPhone 13 doesn't magically change how the Secure Enclave works on your iPhone 5/6/7/8/X/11/12/SE.


That isn't really true. If they were compelled to, they could make a version of iOS that was insecure on all those devices, and they've demonstrated the ability to push updates to macOS, so they probably have the means to do it to iOS too.

No, they didn't. Apple doesn't have keys for individuals' phones. And that's the proper way to do security.

No, but they should have. Then Apple could help law enforcement solve crimes while protecting privacy for people not strongly suspected of crimes. It does not have to be one or the other.

No because those keys would leak fast and would get requested by every government on the planet even faster.

The only way to prevent mass surveillance is to make it so that they have to do a lot of work to break each phone (FIB/HIM approach to get the local key). That's destructive, so no covert hacking; it's expensive, so no mass surveillance; and, most importantly, it doesn't scale.

Plus, the FBI is more concerned with getting precedent than anything else. They couldn't care less about the crime itself.

Why would the unique key per phone leak? To use them you would need physical access to the corresponding phone. Also, under my proposed scheme using the key would brick the device, so it would be useless for anything other than law enforcement (or maybe state actors).

Here is how it would work:

Police: Hey Apple, we have a warrant to access the data in device XXXXXXX.

Apple: OK, the warrant checks out; here is key YYYYYY. Please recycle the device after you have copied all the data.
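To make the moving parts of this hypothetical scheme explicit, here is a minimal sketch. Every name in it is invented, and Apple maintains no such database; as the replies point out, the escrowed key database itself becomes the weak point.

```python
# Sketch of the commenter's HYPOTHETICAL per-device key-escrow scheme.
# Nothing like this exists at Apple; the database below is exactly the
# single point of failure later comments object to.

device_keys = {}  # device_id -> unlock key; this mapping is the liability

def provision(device_id, key):
    """At manufacture time, escrow a per-device unlock key."""
    device_keys[device_id] = key

def release_key(device_id, warrant_valid):
    """Release the escrowed key only against a valid warrant.
    The scheme assumes the device is then destroyed ("recycled"),
    making the key single-use in practice."""
    if not warrant_valid:
        return None
    return device_keys.get(device_id)
```

Note that nothing in the code enforces "physical access only" or the bricking step; those are policy promises, not properties of the design, which is the objection raised immediately below.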

How would you guarantee the "physical access only" feature? Just curious. Because if you can't, a hacker who steals the keys will have the power to brick every single iPhone in the world at will.

Also, your "state actors" include certain murderous regimes that didn't shy from cutting an american citizen to pieces in order to make a point.



Secure hardware can never really save you from insecure software.

My understanding is that, due to the way the Secure Enclave stores its portion of the key, it isn't currently possible to expose it via software, as the OS doesn't know it: the CPU offloads decryption to an entirely different chip, and the keys are never visible to the OS.
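The reason the OS never sees the key can be sketched in a few lines. This is an illustrative model, not Apple's actual construction (the real derivation, iteration counts, and key hierarchy differ): the passcode is entangled with a hardware UID that never leaves the chip, so the derivation can only run on that one device.

```python
# Toy model of passcode/UID entanglement. The UID is fused into the
# silicon at manufacture and is readable only by the Secure Enclave's
# crypto engine, never by the OS; bytes(range(32)) is a stand-in.
import hashlib

HARDWARE_UID = bytes(range(32))  # stand-in for the UID fused into the chip

def derive_key(passcode: str) -> bytes:
    """Entangle the passcode with the hardware UID. Because the UID never
    leaves the chip, this derivation can only happen on this one device,
    forcing every brute-force attempt through the hardware's rate limits."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), HARDWARE_UID, 100_000)
```

Even if an attacker copies the encrypted flash, the derived key depends on a secret they cannot extract through software, which is the property the comment describes.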



Also, your "state actors" include certain murderous regimes that didn't shy from cutting an american citizen to pieces in order to make a point.

Also, your "state actors" include certain murderous regimes that didn't shy from cutting an Iranian citizen to pieces in order to make a point.

FTFY.

Hey now, this comment section is for calling out abuses and excesses with regards to American domestic policy.

The thread for complaining about abuses and excesses with regards to American foreign policy is over there.


Why would the unique key per phone leak? To use them you would need physical access to the corresponding phone. Also, under my proposed scheme using the key would brick the device, so it would be useless for anything other than law enforcement (or maybe state actors).

Here is how it would work:

Police: Hey Apple, we have a warrant to access the data in device XXXXXXX.

Apple: OK, the warrant checks out; here is key YYYYYY. Please recycle the device after you have copied all the data.

First of all, your idea that using a key will brick the device isn't an option. That would be a blatant violation of due process. Evidence cannot be tampered with and must remain available should the court case ever be reviewed or appealed.

Second, every city in the world has its own police department. They would all want access to this functionality. And not just for criminal investigations: private lawyers negotiating a divorce settlement will demand access, for example. The law clearly states they are entitled to any data Apple has on any customer relevant to a pending court case.

So, in short, the key to decrypt the data cannot brick the phone, and if Apple has access to the key then they will be required to hand them over to millions of people around the world. No matter how careful Apple is, some of those people will use it for nefarious purposes.

And that's the best case scenario. The worst case scenario is Apple's own security is compromised and the entire database of keys goes up for sale on the black market. Apple does not want to have a database of keys, and no law should force them to maintain one.

I fully agree with your sentiment, however, the first part isn't exactly correct.

If evidence is destroyed during testing, it would generally still be admissible. That was seen quite often with small amounts of drugs. Another example: it is difficult to challenge an autopsy when the corpse has been chopped up by the original coroner. Usually, evidence is inadmissible only if the destruction was malicious.

In civil cases, if one side demands an examination as part of discovery, the other party does get to attend. For the examination of a phone or computer, the owner is required to unlock it for the other side. Not unlocking the phone would be obstruction of justice, and your side would probably lose the case. Usually, though, there is no examination, as you are required to hand over all the demanded material as a paper copy and/or electronic record, minus any encryption.

(Years ago a court ordered a man to open some records in a divorce. The court didn't believe him when he said they didn't exist. He was held in contempt and jailed until he opened his records. He spent 14 years in jail.)

The Fifth Amendment right not to incriminate yourself does not apply in civil cases.

The whole problem with leaving a backdoor is its being exploited by unscrupulous actors. For the police, not having access is analogous to the phone having been destroyed by fire. It sucks for them to be close, but the potential for real abuse far exceeds the potential intelligence that may be harvested.


My understanding is that, due to the way the Secure Enclave stores its portion of the key, it isn't currently possible to expose it via software, as the OS doesn't know it: the CPU offloads decryption to an entirely different chip, and the keys are never visible to the OS.

Please correct if I'm wrong.

You're not wrong, but you're drawing the wrong conclusion. You can't expose the keys within the Secure Enclave through software, but you can use the Secure Enclave hardware to decrypt data on a device (obviously, else it would be useless). I'm not sure the setup supports it, as it depends on the details of how the encryption is done, but it's possible that Apple could provide a version of iOS that leverages the Secure Enclave processor to decrypt data on a device without requiring the user's consent.

EDIT: Looking into it a bit more, it looks like that's not possible. The FBI doesn't request that, though. The last time this happened, they instead requested that Apple disable the limits on PIN entry so that they could brute force the device.

The govt always has a way. Unless the NAND (memory) chip or Secure Enclave processor is destroyed (in which case there is nothing Apple could do anyway), the govt can always desolder both chips. They can read out the encrypted contents of the NAND. They can acid-remove the top layers of the Secure Enclave and retrieve the device key by physical inspection. Using the device key, they can then brute-force the master key, which is produced from both the device key and the password-derived key. Since it is now just encryption keys and raw encrypted flash, there are no longer any restrictions on speed or max tries.

Now this process is expensive and hard and slow and requires physical access to the phone and can't be done covertly and after all that they only have the contents of a single phone but they CAN do it. It doesn't facilitate mass spying or bulk warrantless data collection. If the case actually merits that level of attack it is always an option. The fact that they aren't doing this says a lot about the relative merit of the phone's contents vs using this as a precedent to allow mass warrantless surveillance and other constitutionally prohibited bullshit the three letter agencies have been engaged in.
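In miniature, the attack described above looks like this. It's a toy model (the real key hierarchy, KDF, and verification step are far more involved), but it shows why extracting the device key removes every rate limit: key derivation can then be replayed offline at will.

```python
# Toy model of offline brute force after the device UID has been read off
# the decapped chip. The KDF, iteration count, and key check are all
# placeholders for illustration, not the real iOS construction.
import hashlib

def derive_key(passcode: str, uid: bytes) -> bytes:
    """Stand-in for the passcode/UID key derivation."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), uid, 1_000)

def offline_brute_force(uid: bytes, target_key: bytes, digits: int = 4):
    """Try every passcode; nothing throttles us once the UID is known."""
    for n in range(10 ** digits):
        guess = str(n).zfill(digits)
        if derive_key(guess, uid) == target_key:
            return guess
    return None
```

The loop runs as fast as the attacker's hardware allows, with no 80 ms delay, no 10-try wipe, and no need to touch the phone again, which is exactly why this approach is destructive, expensive, and doesn't scale to mass surveillance.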

They've also still got every other investigative tool they've always had since criminal investigations were a thing.

A whole physical world of evidence and people to talk to.

This is 100% about surveillance.

The second the terrorist is dead, IF police had an instant-access tool, they might be able to identify accomplices within minutes and possibly catch them before they escape the country or start on their next attack. There are real reasons this could be helpful for good, responsible law enforcement. No need to disparage them.

While I still come down on the side of privacy, I understand the law enforcement argument.

They definitely have an interest here, but same goes for busting down any door they want without a warrant. Arguments around exigence should be evaluated closely and rejected without a massive amount of evidence of public harm.

The idea that phones represent an inviolable secure enclave of information that can never be penetrated is LE's best argument for access, not exigence, imo. But LE hasn't shown that it can't penetrate devices when it really wants to. This issue is really just about cost, as best as I can tell.


My understanding is that, due to the way the Secure Enclave stores its portion of the key, it isn't currently possible to expose it via software, as the OS doesn't know it: the CPU offloads decryption to an entirely different chip, and the keys are never visible to the OS.

Please correct if I'm wrong.

That's correct, but it would be trivial for a new version of the OS to simply encrypt your PIN with a master key and stick that somewhere on the SSD without encrypting it with the UID.

It's possible that present phones are secure in as much as, if locked, they cannot be compromised by a new OS version. But that doesn't protect us from the nightmare scenario.


That's correct, but it would be trivial for a new version of the OS to simply encrypt your PIN with a master key and stick that somewhere on the SSD without encrypting it with the UID.

It's possible that present phones are secure in as much as, if locked, they cannot be compromised by a new OS version. But that doesn't protect us from the nightmare scenario.

There's a compromise of sorts available. Apple can sign a version of the OS that:

- Removes the delay on repeat PIN entry.
- Prevents the device from wiping data on multiple failed PIN entries.
- Enables PINs to be entered by hardware via a port rather than via the touchscreen.

That's what the FBI asked for in 2016, and given that we're talking about PINs here it'd be more than enough to compromise the devices.


There's a compromise of sorts available. Apple can sign a version of the OS that:

- Removes the delay on repeat PIN entry.
- Prevents the device from wiping data on multiple failed PIN entries.
- Enables PINs to be entered by hardware via a port rather than via the touchscreen.

That's what the FBI asked for in 2016, and given that we're talking about PINs here it'd be more than enough to compromise the devices.

Just another backdoor. If the FBI has it, the bad guys will have it in a few hours or days.

Consider this administration's unprecedented effort to obfuscate, hide, and keep from the public information about what it's doing, information that is not only within the public's right to know but in the interest of democracy itself. Much of this information is lawfully required to be disclosed, yet it is hidden without justification. Add to this the blocking of relevant testimony, the rejection of legally issued subpoenas, the "classifying" of transcripts, the use of private email for government business, the absence of press briefings, the "alternate facts," and on and on.

Now consider that Barr and other cronies and sycophants of the Republican party are demanding unprecedented access to literally everyone's private life, without oversight, due process, or even a requirement to disclose when, who, and how. (Don't kid yourself about how this would work: if the government were given back doors to our devices and to the infrastructure that stores our data, yes, that would be included.)

And what excuse do you have for not voting?


There's a compromise of sorts available. Apple can sign a version of the OS that:

- Removes the delay on repeat PIN entry.
- Prevents the device from wiping data on multiple failed PIN entries.
- Enables PINs to be entered by hardware via a port rather than via the touchscreen.

That's what the FBI asked for in 2016, and given that we're talking about PINs here it'd be more than enough to compromise the devices.

Code is speech. You can't force Apple (or anyone) to write the code you want, much less sign that code as their own.

Even if you could, it would be burdensome to create, test, and validate a new OS with these features. It's already a pain to release a bug-free major iOS version.

Even if it weren't burdensome to create, Apple would be required to maintain it alongside its consumer version of iOS, since this is already the second time the FBI has requested such a feature. This would cause an increase in the number of cyber attacks on Apple facilities, since this code would be signed by Apple and therefore capable of being installed on any device.

If Apple were required to do so, every other tech company would also be required, and most wouldn't keep two versions of their OSs. Consumers would, by default, get compromised versions, and every company would see the same increase in cyber attacks Apple would, and few have the money to guard their gates.

After such a version was made, every police department all over the world and every intelligence agency would require such access, and this by itself would increase the likelihood of the code being dumped by a malicious actor and falling into the hands of criminals, making every device compromised. And I'm not saying every Apple device, since every company would have its own backdoored version.

Such a compromise doesn't exist, because in this particular instance, you either have security or you don't.

Just a note: when I said "there's a compromise of sorts available", it was in response to this:

"It's possible that present phones are secure in as much as, if locked, they cannot be compromised by a new OS version."

I'm not saying that disabling the PIN-based protections on iOS is a valid compromise with the intelligence community. I'm saying that it's a plausible means of compromising (read: breaking) the security on the device.

I get that people are heated on this issue, but you really need to pay more attention to who and what you're replying to. Getting angry makes it easier for people to brush you aside as irrational, and it's really important that we don't let that happen here.


The actual moral question is whether this objective is strong enough to justify compromising privacy and security for the non-Alshamranis of the world, by forcing Apple and others to build backdoors. On this part I agree with the general view of the Ars commenters that no, our privacy and security should not be compromised for the sake of a few terrorists.

Actually, in my view, it's not for the sake of a few terrorists; it's for the sake of government gaining new powers over society at large in the *name* of protecting us from a few terrorists. Law enforcement has shown over and over that it is willing and able to forsake our rights for its own sake. Power always seeks more power for power's sake.

I think the headline should read "The FBI is once again claiming it cannot unlock an iPhone and demanding a backdoor." As with the San Bernardino case, where the phones were cracked by other means, it appears the FBI is using this as an excuse to demand a backdoor. Based on recent history I see no reason to trust an agency that routinely lies and sees nothing wrong with protecting murderers.

This speaks not to the moral underpinnings, just the weakness in having any point of entry.

The Panama Papers... I'll stop now.

Edited for grammar and Panama

While I suspect I agree with your sentiment, "I don't trust government" doesn't really add much to the dialogue.

I think you miss my point. I am sorry I have to state it; perhaps you do not recognize all the stories. My point is that any attempt to store information, by any method, analog or digital, has weaknesses that humans can exploit, unless they cannot be exploited (and of course XKCD has the answer for those cases: drugs and a $5 wrench). That holds whether the exploiter is a state actor, as with the USA in Snowden's case and the Pentagon Papers (an OK Boomer reference here), or a non-state actor, as with the Panama Papers and Twitter. Perhaps those cases and the numerous others should be recited to those who make the decisions - Congress, not Barr. I'm a Boomer. I got rid of my absolute mistrust of government in the early 70s.

Your other post read differently. I appreciate the nuance you articulate here.

But doesn't that imply that Apple is a virtuous entity ? Why should a private company impose its choices on a government (even more, a democratically-elected gov). If we want to have a theoretical debate, let's have that one ?

No, if Apple believed they were a virtuous entity, they'd create a back door and protect the key themselves. Their argument is that end-to-end encryption is an available thing, and under a democratically elected government with a presumption of a right to privacy, end users should have the means to keep their information private - even from Apple. That's why Apple can't decrypt either.

Their argument is that if you need that information, you're going to have to work for it. Their argument is that the need for that information is rare, so that's not a problem. If the government wants to argue that need for that information should be common, they should have to explain that.

Wouldn't matter what Apple believes about being a virtuous entity. Apple knows that creating a back door and keeping a key creates a vulnerability that can and will be exploited by non-Apple entities.

Apple's commitment to security outweighs (so far) its concerns about being a virtuous entity or not. So no back doors.

I don't follow you. Apple maintains its commitment to security is part of being a virtuous entity.

What are you suggesting?

Apple maintains its commitment to security as a competitive advantage over other OEMs in the free market. If security was instead a competitive disadvantage (e.g. if the security implementation was a substantial inconvenience for users' daily operation, such that they migrate to Android in large numbers), Apple would likely not maintain its commitment to security.

That said, the devices out there, right now, are secure (absent Cellebrite). Apple hypothetically changing its stance for the iPhone 13 doesn't magically change how the Secure Enclave works on your iPhone 5/6/7/8/X/11/12/SE.

I was looking more for why the other poster was setting up an apparent dichotomy between "security" and "virtue." Thanks for the response--and I don't disagree with your perspective.

That said, the devices out there, right now, are secure (absent Cellebrite). Apple hypothetically changing its stance for the iPhone 13 doesn't magically change how the Secure Enclave works on your iPhone 5/6/7/8/X/11/12/SE.

That isn't really true. If they were compelled to, they could make a version of iOS which was insecure on all those devices, and they've demonstrated the ability to push an update to macOS, so they probably have the means to do it to iOS too.

Secure hardware can never really save you from insecure software.

My understanding is that, due to the way the Secure Enclave stores its portion of the key, it isn't currently possible to expose it via software, as the OS doesn't know it - the CPU offloads decryption to an entirely different chip, and the keys are never visible to the OS.

Please correct if I'm wrong.

That's correct, but it would be trivial for a new version of the OS to simply encrypt your PIN with a master key and stick that somewhere on the SSD without encrypting it with the UID.

It's possible that present phones are secure in as much as, if locked, they cannot be compromised by a new OS version. But that doesn't protect us from the nightmare scenario.

There's a compromise of sorts available. Apple can sign a version of the OS that:

- Removes the delay on repeat PIN entry.
- Prevents the device from wiping data on multiple failed PIN entries.
- Enables PINs to be entered by hardware via a port rather than via the touchscreen.

That's what the FBI asked for in 2016, and given that we're talking about PINs here it'd be more than enough to compromise the devices.
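To put rough numbers on why that 2016 request would be "more than enough": with the delay and wipe protections gone, a short numeric PIN falls to simple exhaustion. A back-of-the-envelope sketch in Python, assuming roughly 80 ms of hardware key-derivation cost per guess (an assumed figure for illustration, not a measurement of these phones):

```python
# Sketch only: worst-case time to exhaust a numeric PIN space once the
# retry delay and the 10-attempt auto-wipe have been disabled. The
# per-guess cost below is an assumption, not a measured value.

PER_GUESS_SECONDS = 0.08  # assumed hardware key-derivation cost per attempt

def worst_case_hours(pin_digits: int) -> float:
    """Hours needed to try every PIN of the given length, one after another."""
    return (10 ** pin_digits) * PER_GUESS_SECONDS / 3600

for digits in (4, 6):
    print(f"{digits}-digit PIN: {worst_case_hours(digits):.1f} h worst case")
```

Under that assumption a 4-digit PIN falls in well under an hour, and even a 6-digit PIN in about a day - which is why removing those two protections is effectively game over for PIN-locked devices.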

That security gap has been closed since the San Bernardino case. Flashing new firmware now requires the PIN in order to retain access to the encryption keys/data. So they could flash new firmware that removes that protection, but it would only be useful in the future, as access to the existing data would be lost.

The govt always has a way. Unless the NAND (memory) chip or Secure Enclave processor is destroyed (in which case there is nothing Apple could do anyway), the govt can always desolder both chips. They can read out the encrypted contents of the NAND. They can acid-etch away the top layers of the Secure Enclave and retrieve the device key by physical inspection. Using the device key, they can then brute-force the master key, which is produced using both the device key and the password-derived key. Since it is now just encryption keys and raw encrypted flash, there are no longer any restrictions on speed or maximum tries.

Now this process is expensive and hard and slow and requires physical access to the phone and can't be done covertly and after all that they only have the contents of a single phone but they CAN do it. It doesn't facilitate mass spying or bulk warrantless data collection. If the case actually merits that level of attack it is always an option. The fact that they aren't doing this says a lot about the relative merit of the phone's contents vs using this as a precedent to allow mass warrantless surveillance and other constitutionally prohibited bullshit the three letter agencies have been engaged in.
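The offline attack described above can be sketched roughly as follows. This is illustrative Python, not Apple's actual derivation: PBKDF2 stands in for the proprietary hardware KDF, and the `uid` value is assumed to have been recovered by the physical inspection the post describes. The point is only that, with the UID in hand, every PIN guess becomes pure local computation, free of the device's retry limits and delays:

```python
# Illustrative sketch of an offline PIN brute-force, assuming the device
# key (UID) has already been extracted from the chip. PBKDF2 here is a
# stand-in for the real (proprietary) hardware key derivation.
import hashlib
import os
from typing import Optional

uid = os.urandom(32)  # device key, assumed recovered by physical inspection
# Key the attacker is trying to reproduce (derived from the true PIN):
master_key = hashlib.pbkdf2_hmac("sha256", b"0483", uid, 1_000)

def offline_brute_force(uid: bytes, target_key: bytes) -> Optional[str]:
    """Derive a key from every 4-digit PIN until one matches the target."""
    for guess in range(10_000):
        pin = f"{guess:04d}"
        if hashlib.pbkdf2_hmac("sha256", pin.encode(), uid, 1_000) == target_key:
            return pin
    return None

print(offline_brute_force(uid, master_key))  # prints 0483
```

Note that the whole search runs on the attacker's own hardware, which is exactly why the post says there are no longer any restrictions on speed or maximum tries once the chips are extracted.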

They've also still got every other investigative tool they've always had since criminal investigations were a thing.

A whole physical world of evidence and people to talk to.

This is 100% about surveillance.

The second the terrorist is dead, IF police had an instant-access tool, they might be able to identify accomplices within minutes and possibly catch them before they escape the country or start on their next attack. There are real reasons this could be helpful for good, responsible law enforcement. No need to disparage them.

While I still come down on the side of privacy, I understand the law enforcement argument.

When you can guarantee that all law enforcement is "good, responsible law enforcement", then perhaps we can talk about giving them increased access to people's private communications. Until that point, if you wish to share your communications with the cops, go ahead, but I will resist every effort to force me to share mine.

People seem to forget that 9/11-style attacks are exceedingly rare. The last one with such destruction and death before 2001 was 60 years before on December 7, 1941. The one before that was at least 76 years before that during the Civil War. The "OMG gotta catch the terrorist before he strikes!" trope is from the TV show 24, not reality.

Now, if I ended my post there, someone might think they could play Gotcha! and say, "but but but EVERY TERRORIST VICTIM IS A TRAGEDY AND MUST BE STOPPED BY ANY MEANS NECESSARY!!!!!one"...

And, I'd like to point out that orders of magnitude more Americans are killed by guns than by a bunch of stupid religious nutjob terrorists. For example, this shows 150,000+ gun murder victims vs. 3,000 possible terrorist victims, and that article is from 5 years ago.

But, of course, the people arguing for crackdowns on "terrorism" (however nebulously defined) are usually the same people who have that "You can have my gun when you pry it from my cold, dead hands" bumper sticker from Red Dawn.

Not dogging your arguments either way, but you're forgetting Oklahoma City. While the death toll was lower, it absolutely qualifies as a large-scale terrorist event.

I'm interested in your observation that the NSA may have a zero-day for the Secure Enclave in new Apple phones. Now, Apple gave a lot of "marketing talk" when they made the sales pitch for the physical security inside the new phones, but, tellingly, nothing in their documentation makes any claim along the lines of, say, FIPS-140-2.

Now, if they had said, "The Secure Enclave is FIPS-140-2 Level 4 compliant", that would have been nice to have...

It is FIPS 140-2 Level 2 compliant - has been for a while. I don't think they mention it in the keynotes (too nerdy), but it's definitely documented on their certifications page.

The NSA and US intelligence agencies are the single biggest buyers of 0-day exploits for precisely this purpose. But the government makes very clear distinctions between espionage and evidence collection, and it's not going to burn its espionage tools on evidence collection, because the former doesn't need to be legally disclosed (foreign) while the latter does (domestic).

It's not that they've necessarily broken ECIES or even SHA-256; it's that they've tricked the phone into decrypting data believing it's been authenticated, due to some oversight on Apple's part. Keep in mind, FIPS really says you've designed it well; there's no real certification that you've implemented it well. And then the exploits increase with cleverness - "hey, we've figured out how to get your DDR4 RAM to give up data that we didn't address" - which requires the certifications to expand in scope, and so on, but sometimes leaves existing devices vulnerable because it's an exploit you can't patch.

Based on the way Apple has written the software and implemented security, I seriously doubt there’s much Apple could do anyway.

Yeah, in all these cases, it's not actually about accessing that one phone. They don't need it. The whole thing is a PR exercise to try to get the public on board with mandating weak security for all devices by law, so that the government can have access to everything all the time forever.

They know that this will mean more bad actors getting access, because of course there's really only two kinds of digital security, the kind no one can break, and the kind anyone can eventually break. They've just decided they want the public to only be allowed the second kind.

That said, the devices out there, right now, are secure (absent Cellebrite). Apple hypothetically changing its stance for the iPhone 13 doesn't magically change how the Secure Enclave works on your iPhone 5/6/7/8/X/11/12/SE.

That isn't really true. If they were compelled to, they could make a version of iOS which was insecure on all those devices, and they've demonstrated the ability to push an update to macOS, so they probably have the means to do it to iOS too.

Secure hardware can never really save you from insecure software.

iOS has fuckall to do with the iPhone security stack. All iOS can do is make a request of the hardware - it's all implemented below the OS. Its only role is to send a PIN. FaceID/TouchID are all implemented below iOS. I'm sure aspects of them can be flashed, but I'm willing to bet anything related to security cannot be. Remember, Apple is in the almost unique position of designing their own CPUs and support chips so they can move anything and everything into hardware, and by all accounts they are taking good advantage of that. Apple designs the FaceID/TouchID hardware so they don't even need the typical interfaces that would be required from a supplied component. They can skip every single one of those interfaces, which are where the exploits tend to focus.

The Encryption argument is another example of the cost of living in a free society.

Our criminal justice system...at least in theory...is biased towards the defendant precisely because of the power that the government can bring to bear against an individual. The "cost" is that in many cases the guilty go free because of that pesky "beyond a reasonable doubt" thing.

Publicly available strong encryption is no different. Do you think that the FBI would order a security system of their own and insist that it include a back door so that Congressional oversight committees can peek in when and wherever they want? All it takes is a subpoena from the House. Do you believe that the Pentagon or NSA or the White House should keep their systems deliberately weak and hack-able ?

People are imperfect and so are our institutions. We are all better off if WE THE PEOPLE have access to the same strongest available encryption that the government reserves for its own use. Does that make it harder for law enforcement? Absolutely. Does it mean that some bad guys will get away with things? Probably yes, but for the most part encryption won't be the deciding factor. Is it worth that cost? That depends on just how much you value your individual freedoms.

An Israeli company that specializes in helping law enforcement agencies unlock cellphones announced it has found a way to break into any iPhone ever made, as well as many Android phones.

The Petah Tikva-based Cellebrite was reportedly the company the FBI used in 2016 to hack into the iPhone of the San Bernardino shooter after Apple refused the US government’s request to build a backdoor into its famously secure operating system.

The announcement from Cellebrite came in the form of an update this week to its website promoting the iPhone-hacking technology, dubbed “UFED Premium,” as “the only on-premise solution for law enforcement agencies to unlock and extract crucial mobile phone evidence from all iOS and high-end Android devices.”

CASE CLOSED....they can get thru the encryption if they want...they just want to mess with Apple...

No, they don't actually care about Apple at all here. They just target the company because it's convenient, since Apple's devices are both extremely common (meaning shooters and other high-profile criminals are likely to own them) and have fairly solid security.

What they actually want is to set a clear, visible precedent saying that the US government can compel a private company to weaken or defeat their own security at their own cost. Data stored on personal devices probably isn't even the real target here, but it's much easier to argue the case when we're talking about unlocking a terrorist's device than when we're talking about fundamentally undermining an entire service's encryption standards. It's really the latter that's a barrier for them - services have started using encryption that they can't break themselves without some fundamental compromises - and that's why they want this precedent set.

Once the precedent's been set, they can use it to justify FISA warrants and that fundamentally undermines any trust for any data service hosted or operating within the US. So yeah, it looks like this is just some pointless petty bullshit on the surface, but the stakes on these cases are way higher.

That said, the devices out there, right now, are secure (absent Cellebrite). Apple hypothetically changing its stance for the iPhone 13 doesn't magically change how the Secure Enclave works on your iPhone 5/6/7/8/X/11/12/SE.

That isn't really true. If they were compelled to, they could make a version of iOS which was insecure on all those devices, and they've demonstrated the ability to push an update to macOS, so they probably have the means to do it to iOS too.

Secure hardware can never really save you from insecure software.

iOS has fuckall to do with the iPhone security stack.

I'd say that's the most ignorant thing I read today, but I'm on the Dark matter thread from a few months back and so you're up against stiff competition.

Quote:

All iOS can do is make a request of the hardware - it's all implemented below the OS.

That simply isn't true. The OS can't request the UID, but a lot is being done by the OS. It is the OS that encrypts the SSD using a PIN tangled with the UID. A suborned OS needn't handle the PIN securely, or the generated key, or anything else. It is the OS which wipes key data after 10 failed unlock attempts. If you think that the FaceID or TouchID sensor data reaches the Secure Enclave without passing through the OS, then I think you will be disappointed.

There's a reason why the hardware requires a cryptographically signed OS - and that's because Apple know full well that the OS is a vital part of the security stack.
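The point about the OS owning the wipe logic can be illustrated with a toy model. Nothing here is Apple's implementation; it only shows that when the failed-attempt counter lives in software, firmware that skips the check defeats the 10-try wipe even though the key material itself sits in separate hardware:

```python
# Toy model (not Apple's code) of OS-enforced passcode protections.
# An honest OS wipes key material after 10 failures; a suborned OS
# simply never runs that check, allowing unlimited guesses.

class ToyLockScreen:
    MAX_ATTEMPTS = 10

    def __init__(self, pin: str, enforce_wipe: bool = True):
        self._pin = pin
        self._failures = 0
        self.wiped = False
        self._enforce_wipe = enforce_wipe  # a suborned OS sets this False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._enforce_wipe and self._failures >= self.MAX_ATTEMPTS:
            self.wiped = True  # honest OS destroys key material here
        return False

# Honest firmware: 10 wrong guesses and the keys are gone.
honest = ToyLockScreen("1234")
for _ in range(10):
    honest.try_unlock("0000")
print(honest.wiped)  # True

# Suborned firmware: unlimited guesses, so brute force succeeds.
evil = ToyLockScreen("1234", enforce_wipe=False)
found = next(p for p in (f"{i:04d}" for i in range(10_000)) if evil.try_unlock(p))
print(found)  # 1234
```

Which is exactly why the hardware refuses to boot anything but a cryptographically signed OS: the signature is what stops a modified lock screen like the second one from ever running.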

The infuriating thing here isn't that the government is asking for these sorts of things... We all know that they would.

What is infuriating (to me, at least) is the apathy that the public at large has towards privacy concerns and law enforcement.

My partner, every time this subject is brought up, parrots the same old "if you don't have anything to hide, I don't see the problem with (talking to cops, letting cops search, giving cops information, etc.)" It's like bashing my head into a wall.

And this is the sentiment that FAR FAR FAR too many people have. We rail against the "police state" and its surveillance goals, but there's a huge throng of people in our society who would willingly give privacy up for the cost of asking and never give it a second thought... It's those people who scare me, because I can always (at least somewhat) rely on my own encryption and privacy-minded actions...

The whole sentiment of 'I have nothing to hide' brings this quotation to mind. "Show me the man and I’ll find you the crime."

But, of course, the people arguing for crackdowns on "terrorism" (however nebulously defined) are usually the same people who have that "You can have my gun when you pry it from my cold, dead hands" bumper sticker from Red Dawn.

To be fair, I don't think that's true. The Edward Snowden revelations got a lot more traction among the Tea Party crowd (which for simplicity I'll equate with 2A advocates, although that's not completely true, either) than on the progressive side.

Around here, the typical gun owner is just itching to use his penis substitute against an invading Mexican or a marauding black thug, and if Trump or his hand-picked AG say they need to turn over their encryption keys, they will happily do so because BROWN PEOPLE ARE DANGEROUS!!!!

Maybe there's just a lot of overlap in my locale between "non-techie" and "gun owner".

You make a critical, exceptionally important point...

If you look at the arguments put forth by the government on these divisive topics, what you see is not reasoned argument but a PR campaign designed to sway the opinions of any large, easily-manipulated demographic group, with the goal of binding that demographic to the government view so that the policy du jour gets accepted. Examples might include:

1. The technology argument, backed up by the "Four Horsemen of the Infocalypse" (Pedophiles, terrorists, drug dealers and organized crime)...

2. The taxation argument, in which citizens are given a modest, temporary tax cut whilst corporations are given a much larger, *permanent* tax cut... (and never mind that its the citizens that will end up paying the tax hangover costs down the road)

3. The xenophobic argument, in which refugees fleeing wars and rape gangs are branded "terrorists" and "bad hombres", to bind fascists and the alt-right to the cause...

In each case the rhetoric deployed is designed to use the most extreme, negative connotations with the specific intent of provoking a reaction of "something must be done!"

This technique has been pretty well studied in the past: it's known colloquially as the "Problem-Reaction-Solution" method. A government/administration wants to enact a change that they know in advance will be pretty unpopular.

First, get your (potentially unpleasant if suggested in isolation) idea ready and waiting in the wings. Next, create uproar over something. Then, when you get enough public opinion saying, "Something must be done!", simply roll out your pre-prepared solution.

Simples.

If you purchase a product or service from a company that pays corporate taxes... guess what, you paid that tax, too. Don't think for a minute that a corporation's expected tax liability isn't figured into the price of its merchandise/services.

1. If, as they claim, the FBI found a way into the San Bernardino shooter's phone without help from Apple, why are they asking for help now?

Well, two reasons:

1) The DOJ wants a backdoor and will press Apple at every turn to try and get it, even if the DOJ has already unlocked the phone.
2) This is a much newer phone. The San Bernardino phone was a 5C running iOS 8. That device didn't have nearly as many layers of protection as this phone does; plus, Apple patched every one of the exploits that allowed the San Bernardino phone to be unlocked.

The FBI never released which exploits were used. Apple tried to get their hands on the details in order to plug it, but they were not able to.

I'm sure Apple did have a decent guess about it, and plugged whatever they could, but whether that particular exploit really is plugged is not known.

What Apple did do was plug the hole the FBI demanded Apple use to break into that iPhone. They now require the PIN code before installing new firmware.

This is blowback 20 years in the making. Post-9/11, the Bush administration enacted criminal programs to spy on citizens, thus ending any chance of citizens ever trusting the Feds. It was a moronic administration that never considered the long-term consequences of its actions, and now the FBI is reduced to begging for help.

While I completely agree with you, these spying programs almost certainly predate 9/11; at best, that was a pretext to put a veneer of legality on them with the "Patriot Act". I believe some Snowden documents showed that the spying systems were already in use in February of that year, and there is a good chance it goes back much further, probably decades. Remember Echelon? J. Edgar Hoover?