from the law-enforcement's-access-hole-is-everyone-else's-security-problem dept

The FBI lost control of the "going dark" narrative. Part of it unraveled thanks to outside vendors. Two vendors -- Cellebrite and Grayshift -- announced they could crack any iPhone made. This shot holes in the FBI's theory that locked phones stayed locked forever and were thereafter useful only for hammering legislators over the head until they cranked out an anti-encryption law.

The second unraveling was the FBI's own unforced error. Supposedly it couldn't count phones without software and the software it had couldn't count phones. What the FBI and others claimed was 8,000 uncrackable threats to the safety of the American public was actually a little over 1,000 phones. As for the latent threat posed by these locked devices, that's still pure speculation until the FBI starts handing over some info on what criminal acts these phones are tied to.

Apple is closing the technological loophole that let authorities hack into iPhones, angering police and other officials and reigniting a debate over whether the government has a right to get into the personal devices that are at the center of modern life.

Apple said it was planning an iPhone software update that would effectively disable the phone’s charging and data port — the opening where users plug in headphones, power cables and adapters — an hour after the phone is locked. While a phone can still be charged, a person would first need to enter the phone’s password to transfer data to or from the device using the port.
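The behavior the article describes is essentially a small state machine: the data lines stay live for a fixed window after the phone locks, then the port degrades to charge-only until the passcode is entered. A minimal sketch in Python (all names and the exact logic here are hypothetical; Apple's actual implementation is not public):

```python
import time

LOCKOUT_SECONDS = 3600  # the reported one-hour window


class PortController:
    """Toy model of a charge-only USB lockout after a locked interval."""

    def __init__(self):
        self.locked_at = None  # None while the phone is unlocked

    def lock(self):
        # Record when the phone was locked; the countdown starts here.
        self.locked_at = time.monotonic()

    def unlock_with_passcode(self, ok: bool):
        # A correct passcode fully re-enables the data port.
        if ok:
            self.locked_at = None

    def data_allowed(self) -> bool:
        # Charging is always permitted; this gates only data transfer.
        if self.locked_at is None:
            return True
        return time.monotonic() - self.locked_at < LOCKOUT_SECONDS
```

Note the compromise baked into the design: within the hour, data transfer still works (so accessories keep functioning), which is exactly the window later commenters argue about.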

Law enforcement may be angered by this but private companies are not obligated to make law enforcement's job easier. Apple's official statement on the software update is probably meant to be placating, but is unlikely to change the mind of any law enforcement official who sees this reaction to phone cracking devices as another extended middle finger from tech companies. According to Apple spokesman Fred Sainz, this fix is being issued to fix a security hole, not "frustrate" law enforcement efforts.

But law enforcement efforts will be frustrated. The same goes for criminal efforts. Any device that can crack any iPhone exploits a flaw in the software or hardware. There's no such thing as a security hole that can only be exploited for good. Grayshift's GrayKey could end up in the hands of criminals, and it may well be that both vendors have already sold tech to law enforcement agencies in countries where civil liberties aren't as valued as they are in the United States.

The article quotes several law enforcement officials complaining about being locked out of iPhones again. And while the frustration is understandable, the fact is plenty of data and communications are stored in the cloud, untouched by device encryption. Generally speaking, companies like Apple and Google have been cooperative when approached directly by law enforcement, as long as the request doesn't involve breaking device encryption.

This isn't the end of the discussion. Nor should it touch off another skirmish in the Encryption War 2.0. This setback should be viewed as temporary. Holes will be found and exploits deployed, and these will be met with patches and firmware upgrades from the affected tech companies. This all can be traced back to earlier days, when it was only criminals looking for ways to defeat personal security measures. Law enforcement was late to the game, but its arrival shouldn't mean companies forgo protecting their customers to avoid inconveniencing the government.

Reader Comments

in countries where civil liberties aren't as valued as they are in the United States

I challenge you to name a country where its people do not value civil liberties as much as any other. To say the US government values civil liberties is disingenuous.

On topic:

Apple is closing the technological loophole that let authorities hack into iPhones

This is a little politically charged as the real purpose is to block a vulnerability that let bad actors get into locked phones. That it also stymies law enforcement is nothing more than a side effect. I'm glad for it either way.

Re: Re: Re:

I was not aware of any "hiding" from the US either.

The US professes to respect civil liberties when speaking abstractly and sometimes invents elaborate explanations for how some action is not technically a violation, even when a common understanding of the law says it clearly is a violation. This persists no matter how often the courts fail to follow through when presented with specific instances of civil liberty violations that could be addressed by the court.

The derided countries do not even pretend to respect civil liberties. The US has processes to accept and discard complaints about abuse by officials. The other countries simply don't accept the report in the first place.

It's more palatable for the US...

...for companies to push consumer protections in China than it is to push consumer protections for Americans. Then it becomes embarrassing if US phones are not as up-to-date as Chinese phones, so we get the same updates as an afterthought.

Re:

Here are 10 countries where people have a lot of other things to worry about. Human attention is a zero-sum game. When you're worried about starvation and barrel bombs from the sky, your concern for civil liberties tends to wane somewhat.

Re: Re:

Also disingenuous. All of those people care about their civil liberties just as much as anyone else. Yes, they have more important things to worry about, but that doesn't diminish their desire for civil liberties.

The real pity is we keep accepting narratives without demanding evidence.

They have no idea how many demon phones they have.
They have no list of investigations.
They have no list of bad guys walking away for lack of phone data.
They have no white knight story where they saved a kidnapped puppy solely because of data on a demon phone.
They have no list of terrorist plots stopped b/c they got a phone open.
They have bullshit & conjecture that really bad things(tm) are happening because these phones are locked.
They trot out the number of terror attacks they've stopped without cracking phones, but really don't want to own up to having created those plots to exploit the mentally challenged to get more funding.
They have managed to miss several 'terrorist' attacks b/c the actors weren't wearing the right hat & brown skin as called for in the movie rules.
Society is losing billions of actual dollars in ID theft, but it is much less important than stopping imaginary terror plots, b/c fscking soundbites control the budget & we only fund boogeymen hunting.
Stupid companies leak data left & right, billions in fake income tax returns get filed... and they chase the person who was ripped off, not the bad actors.
Price fixing is rampant in several categories, yet there is no time to enforce those rules.

Society has spent stupid amounts of money to get into some of the demon phones; don't we deserve to know the benefit gained by spending millions to unlock the phone of someone busted for having a joint?

Re:

--They have no list of investigations.
Yes we do: terrorist investigations! Terrorists are everywhere, just waiting to kill you!

--They have no list of bad guys walking away for lack of phone data.
Yes we do: all those terrorists hiding in the shadows, waiting to kill your granny with a bomb!

--They have no white knight story where they saved a kidnapped puppy solely because of data on a demon phone.
Exactly, because we can't get into the dang phones! Imagine all the people we could have saved if we could!

--They have no list of terrorist plots stopped b/c they got a phone open.
Again, that's true because we cannot get the phones opened!

--They have bullshit & conjecture that really bad things(tm) are happening because these phones are locked.
Look at all those terrorist attacks of the past. If we could get into the phones we could catch 'em all!

Re:

1. Their motto used to be "don't be evil". "Do no harm" is the Hippocratic oath. Either way, they changed that motto some time ago.

2. Google only makes a tiny fraction of Android phones. The rest can be, and often are, heavily modified beyond Google's control. As is the nature of an open source project, nobody can dictate that the code has to remain as Google supplies it. It's a completely different situation from Apple's proprietary system.

Throwing the village under the bus.

What about that first hour?

Apple said it was planning an iPhone software update that would effectively disable the phone’s charging and data port — the opening where users plug in headphones, power cables and adapters — an hour after the phone is locked.

This is a suspicious statement. Why should simply having the port enabled cause a security problem? If it does, why not fix that instead? And why not disallow it right away, instead of leaving a gaping hole open for a full hour (or forever if someone forgot to lock it)?

a person would first need to enter the phone’s password to transfer data to or from the device using the port.

So... the current design is that anyone can transfer data to and from a phone without any kind of authentication? That's a bad design. It's not like it would be hard to require the phone and the remote device to be paired, or require a password to be entered, before allowing it.

Re: What about that first hour?

The security problem is that the port allows access to the data on the phone which includes phone records, text history, apps installed, data stored for those apps, etc, etc, etc. By closing the port they fix the problem as none of that data should be available from a phone that is locked. That's kinda what "locked" means.

Re: Re: What about that first hour?

By closing the port they fix the problem as none of that data should be available from a phone that is locked.

Clear enough but I'd call that a workaround. If there's a port that allows full access without any authentication, disable the thing! And right away, not only after being locked for an hour! But then go fix it for real by requiring authentication.

Simply being enabled should not mean it's vulnerable. That's like the "firewall culture" that caused so much trouble on the internet. "Sure, WinNuke will crash Windows 95, but why not just add a firewall to block the SMB port?!"

Re: Re: Re: What about that first hour?

Note that even when unlocked, or shortly after locking, people are going to connect to public charging stations etc. A good design would prevent these from accessing the phone's data (without people having to use data-condoms).

Re: Re: Re: What about that first hour?

It's interesting that this isn't already part of the phone. Considering that when I plug my iPhone into my computer I can't transfer data unless I "approve" the connection on the phone, I have to wonder exactly how data is reportedly available without that step. Perhaps that's what they've done by "locking" this port: applied the same level of security to all data transfers.

Re: Re: Re: Re: What about that first hour?

I believe that the authentication step is iTunes handshaking with the phone. Software built to bypass that handshake would still be able to access data from an open port.

Put it this way - if your building has a security guard and legitimate visitors always have to sign in at the front desk, that doesn't mean someone won't be sneaking past when he's not looking if the door is always unlocked.

Re: Re: Re: Re: Re: What about that first hour?

Software built to bypass that handshake would still be able to access data from an open port.

Why? You don't see that as an enormous bug in the phone?

Put it this way - if your building has a security guard and legitimate visitors always have to sign in at the front desk, that doesn't mean someone won't be sneaking past when he's not looking if the door is always unlocked.

This analogy isn't very useful. Apple designs the iPhone SOCs and USB stack, and the software on the other end. They don't just get to say "yes" or "no" to each device, they could use cryptography. Anyone "sneaking" past should have improper encryption keys and therefore be rejected—IOW, treat the USB connection as you'd treat the internet connection. There are sometimes bugs, but if I can crash an iPhone over its internet connection Apple wouldn't just disable the connection an hour after the phone is locked—they'd fix it.
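The cryptographic check this commenter is suggesting can be illustrated with a simple HMAC challenge-response, where the phone only talks to hosts that hold a key established during pairing. This is a sketch only; the names and protocol shape here are invented for illustration, not Apple's actual pairing design:

```python
import hashlib
import hmac
import os


def issue_challenge() -> bytes:
    # The phone sends a fresh random nonce to whatever just plugged in.
    return os.urandom(16)


def respond(shared_key: bytes, challenge: bytes) -> bytes:
    # A previously paired host proves knowledge of the shared key
    # without ever transmitting the key itself.
    return hmac.new(shared_key, challenge, hashlib.sha256).digest()


def verify(shared_key: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(shared_key, challenge, hashlib.sha256).digest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, response)
```

A device that never completed pairing has no key, so any response it fabricates fails verification and the phone can refuse the data connection outright, the same way a server rejects an unauthenticated client on the internet.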

Re: Re: Re: Re: Re: Re: What about that first hour?

Not really, but there are always ways to bypass such handshake methods, as the protocols can be reverse engineered, faked, intercepted, etc.

"They don't just get to say "yes" or "no" to each device, they could use cryptography"

They probably do. But, again, such things can be reverse engineered, etc. "Cryptography" isn't a magic spell to protect things, it can be compromised. That's why added layers of security by design are always a good idea - such as an attack on the encrypted traffic being made much more difficult if they only have one hour to attempt it vs. infinity.

"treat the USB connection as you'd treat the internet connection"

You... do realise that internet security can also be compromised, right?

"There are sometimes bugs, but if I can crash an iPhone over its internet connection Apple wouldn't just disable the connection an hour after the phone is locked—they'd fix it."

Sigh. First of all, Apple IS fixing the known exploits that are being used right now. Don't act as if they're doing nothing. This is simply an extra layer of security to make future exploits more difficult. They simply have to make the compromise between total security (shutting off everything immediately) and usability (people very often continue doing things legitimately while the phone is locked).

Re: Re: Re: Re: Re: Re: Re: What about that first hour?

Sigh. First of all, Apple IS fixing the known exploits that are being used right now. Don't act as if they're doing nothing.

Excellent if true, and that was the point: fix any known bugs, don't just disable an interface to prevent their exploitation (disabling is still useful as an additional measure, especially if they don't know the exploit being used).

The story we're commenting on doesn't say anything about this. It says they're going to disable the port for data transfer after it's been locked for an hour, implying unauthenticated transfer will remain possible before then. I'd be interested to see a link that provides more technical detail.

Re: Re: Re: Re: Re: Re: Re: Re: What about that first hour?

"that was the point: fix any known bugs, don't just disable an interface to prevent their exploitation"

They're almost certainly doing both, which is better than just fixing the bugs and leaving open the wider window for further exploits to be found and used. What, exactly, is your problem with this other than your baseless assumption that Apple doesn't want to bother fixing security bugs?

"The story we're commenting on doesn't say anything about this."

Why would it? I'd assume that even Adobe and Oracle with their horrible track records would still be fixing bugs when they announce new security measures, why wouldn't Apple? The default assumption would be that they will continue to fix bugs, and their next set of release notes should tell you which ones.

"It says they're going to disable to port to transfer data after it's been locked for an hour, implying unauthenticated transfer will remain possible before then"

Yes, again, due to compromise. They cannot disable the port immediately and leave the phone in any kind of usable state. They also cannot completely guarantee that other security measures will not be bypassed in some manner, so they're putting in another security measure on top of those that already exist.

I see no problem with this, unless you're one of those people who has to pick fault with anything Apple-related just because.

Re: Re: Re: What about that first hour?

"disable the thing! And right away, not only after being locked for an hour!"

Given that the same port is used for headphones, that would be extremely annoying very quickly. In fact, I'm hoping it's implemented intelligently enough to understand that it shouldn't be disabling itself while music is playing and headphones are plugged in, though of course that could potentially be used as an exploit.

"Simply being enabled should not mean it's vulnerable"

Everything that's enabled is vulnerable. If you don't understand that, you have no business being in a discussion about security.

Re: Re: Re: Re: What about that first hour?

Given that the same port is used for headphones, that would be extremely annoying very quickly.

Headphones don't need to "transfer data", other than the very limited case of audio data. Anything that wants to access the phone's filesystem should be setting up an authenticated, encrypted link.

Everything that's enabled is vulnerable. If you don't understand that, you have no business being in a discussion about security.

In theory that's not true: a properly implemented system can be secure. Even when "locked" the port's still "enabled" for charging, and presumably not vulnerable (or less vulnerable). By your logic, the phone should also disable all network access while locked.

You're talking about defense in depth. It's a good idea but not a substitute for proper security. As you note, overzealous implementation can cause trouble (disabling headphones).

Re: Re: Re: Re: Re: What about that first hour?

"Headphones don't need to "transfer data", other than the very limited case of audio data"

So... they transfer data? If any transfer is possible, there remains the possibility of an exploit.

"In theory that's not true: a properly implemented system can be secure"

Define secure. If you mean "not likely to be hacked" or "hacking is so cumbersome that it's very unlikely anyone will bother", then sure. If you mean "not possible to be hacked", then no. It's an old adage that the only system that's truly unhackable via the internet is one that's not connected to it, but even air-gapped systems have ways to be exploited by a determined and motivated hacker.

"Even when "locked" the port's still "enabled" for charging, and presumably not vulnerable (or less vulnerable)"

I would presume that power and data are transmitted differently, but yes it's still potentially vulnerable, although less so than when only power is being utilised.

"By your logic, the phone should also disable all network access while locked."

No, that's your silly interpretation. I'm saying that you shouldn't pretend it's not hackable while it can be accessed by whatever means. An accessible system is an exploitable system, if the right methods are found. If you want to take idiotic steps to prevent that, that's on you.

Real security hole.

I think Apple should be held liable for such horrible security practices. After all, unlike other security holes, this one is easily seen by the human eye. And it has been purposely installed on every one of their devices. Shame on them. Remove the headphone port, but leave the large hole for "security".

Re: Real security hole.

If you owned an iPhone you'd know none of that is true. You already have to approve a data connection to your unlocked phone. Any out-of-band data transfer that may be possible (and apparently is) is unlikely to be intentional.

Well duh...

If there's a box that allows you to open a locked phone, it's only a matter of time before criminals have one too. (No policeman would sell one to the mob, would they?) Anything Apple does to discourage the market in stolen iPhones, anything that stops them from being turned back into usable iPhones, is a good idea.

Now all we need is a "jerk-off" app or iOS option. When your phone experiences sudden acceleration - such as when it is ripped out of your hand by someone running past - that "jerk" motion should trigger an "off" command so the phone locks instantly. (Or at least a demand to re-enter the security code).
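The snatch-detection idea above boils down to a threshold test on accelerometer magnitude: lock the instant the reading spikes. A toy sketch in Python (the threshold value and the sensor readings are invented for illustration; a real implementation would also need debouncing so drops and bumps don't constantly lock the phone):

```python
import math

# Hypothetical cutoff, in g, above which motion counts as a "snatch".
JERK_THRESHOLD_G = 3.0


def should_lock(accel_xyz: tuple) -> bool:
    """Lock the phone if the acceleration magnitude suggests it was grabbed."""
    x, y, z = accel_xyz
    magnitude = math.sqrt(x * x + y * y + z * z)
    return magnitude > JERK_THRESHOLD_G
```

Normal handling (roughly 1 g, mostly gravity) stays well under the cutoff, while a violent yank produces a spike that trips the lock.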

Security is not always about the police. They should get over themselves.

Re: Well duh...

Now all we need is a "jerk-off" app or iOS option. When your phone experiences sudden acceleration - such as when it is ripped out of your hand by someone running past - that "jerk" motion should trigger an "off" command so the phone locks instantly.

Wrong word, wrong tense

There's no such thing as a security hole that can only be exploited for good. Grayshift's GrayKey could end up in the hands of criminals and it may well be that both vendors have already sold tech to law enforcement agencies in countries where civil liberties aren't as valued as they are in the United States.

Not 'could', 'has'. For something that valuable, able to break into even secured iDevices, you can be absolutely sure that any number of criminal groups spent significant resources getting a copy for use.

The question at this point isn't 'Do they have it?', but 'How many of them have it?', because I can all but guarantee that that number is higher than zero.

Isn't that the One Ring rule?

We had so many talks of unicorn keys that only work for good guys (or against bad guys), but it started to smack of the Texas sharpshooter fallacy, where the easiest way to make bullets that only kill bad guys is to define a bad guy (as one subset of many within the set of bad guys) as someone who is hit by those bullets.

The thing is, anytime we make a super weapon like a universal backdoor key or the NSA mass surveillance program, or a nuclear arsenal, someone malicious will sooner or later get control of it and use it for personal gain.

Grayshift's new penetration

Excellent. By announcing victory, they're reporting the vulnerability, which means Apple will need to address and fix the vulnerability itself rather than merely making the vulnerability harder to exploit.

By announcing their victory as a counter-electronics service, they're actually doing a service as a white hat.

Now, Apple simply needs to respond to it as if it were a reported vulnerability.

Re: Re: Grayshift's new penetration

Well, I was thinking Apple would respond as appropriate to a company that actually made operating systems and fix the vulnerability and even thank Grayshift for reporting it.

Microsoft pays bounties. I don't know what Apple's policies are, though considering they sell OSes with an air of superiority over Windows, I'd think they'd do the same. Maybe even offer better bounties to show they're even more concerned about end-user security.

Usually it's third parties who use OSes that like to sue white-hats for exposing vulnerabilities.

'Thanks for the head's up, pity about your business.'

Well, I was thinking Apple would respond as appropriate to a company that actually made operating systems and fix the vulnerability and even thank Grayshift for reporting it.

My comment was mostly in jest, but I would actually love to see them do this: if they can manage to fix the problem for good, then Grayshift will have basically put itself out of business with its boasting, and Apple thanking them for pointing it out would be hilarious.

Cultural Clusterfuck Stockholm Syndrome

Historically, we didn't invent building safety codes until being crushed or burned alive by our own homes became a common, reasonable, undeniably "legitimate" FEAR. Consider: if the aristocracy could have leveraged the shoddy construction of the average home for EVEN A FRACTION of the advantage that modern computers could theoretically provide their masters, would we ever have progressed to the 'idea' of building safety standards? Or would they have quietly eliminated the possibility of safe homes as a threat to their sovereignty?

If you don't understand the connection I'm making, try finding a modern device without ring -3 hardware...

The war on general-purpose computing might be going a lot better if people could figure out which side is OURS...