
Cellebrite—the Israel-based forensics company that has been a key source for law enforcement in efforts to crack the security of mobile devices to recover evidence—has reportedly found a way to unlock Apple devices running any version of the iOS operating system up to 11.2.6, the most recent update Apple has pushed out to customers. The capability is part of Cellebrite's Advanced Unlocking and Extraction Services, a lab-based service the company provides to law enforcement agencies—not a software product.


But security experts are dubious of any claim that Cellebrite can defeat the encryption iOS uses to protect the contents of Apple devices. Rather, they suggest Cellebrite's "Advanced Unlocking Services" may have found a way to bypass the limits iOS enforces on PIN or password entry by interfering with the code that counts the number of failed attempts—allowing the company's lab to launch a brute-force attack to discover the passcode without fear of the device erasing its cryptographic key and rendering the phone unreadable. Even so, with a sufficiently secure password, it would be nearly impossible for the technique to recover the contents of the device.

Forbes' Thomas Fox-Brewster reports that a Cellebrite spokesperson confirmed the claim, first found in leaked Cellebrite marketing material, stating that "Cellebrite can retrieve (without needing to root or jailbreak the device) the full file system to recover downloaded emails, third-party application data, geolocation data, and system logs. Agencies can either provide the device already unlocked, furnish the known passcode, or use Cellebrite's Advanced Unlocking Services to unlock the device."

Previous methods for disabling the limits on PIN or password attempts have involved manipulation of the iPhone's hardware. In 2016, Cambridge University computer scientist Sergei Skorobogatov demonstrated that, by removing and mirroring the NAND (flash) memory chip of any iPhone up to the iPhone 6 Plus, he could put in place a replacement memory chip that allowed him to reset the counter for passcode tries. However, hardware changes in the iPhone 5s (with the A7 chipset) and later devices made this sort of attack much more difficult, if not impossible, because of the Secure Enclave Processor (SEP), a dedicated security processor that runs its own operating system and manages the PIN verification. The SEP encrypts the PIN using its unique UID.
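The counter-reset idea behind NAND mirroring can be illustrated with a toy model. Everything here is hypothetical (the class and function names are illustrative, and a real attack manipulates flash chips, not Python objects): the attacker snapshots the storage holding the failed-attempt counter and re-flashes it before the wipe threshold is ever reached.

```python
class SimulatedPhone:
    """Toy model of a passcode lock with a wipe-after-N-failures policy.
    The attempt counter lives in `nand`, standing in for flash storage."""
    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.nand = {"failed_attempts": 0, "wiped": False}

    def try_passcode(self, guess: str) -> bool:
        if self.nand["wiped"]:
            raise RuntimeError("device wiped")
        if guess == self._passcode:
            return True
        self.nand["failed_attempts"] += 1
        if self.nand["failed_attempts"] >= self.MAX_ATTEMPTS:
            self.nand["wiped"] = True
        return False

def mirror_attack(phone: SimulatedPhone) -> str:
    """Brute-force a 4-digit PIN by restoring a snapshot of the NAND
    state before the wipe threshold, so the counter never reaches it."""
    snapshot = dict(phone.nand)  # the 'mirrored' chip contents
    for pin in range(10_000):
        guess = f"{pin:04d}"
        if phone.try_passcode(guess):
            return guess
        if phone.nand["failed_attempts"] >= phone.MAX_ATTEMPTS - 1:
            phone.nand = dict(snapshot)  # swap the mirrored copy back in
    raise ValueError("passcode not found")
```

The Secure Enclave defeats this kind of attack by keeping the counter (and the key material) inside the processor, where there is no external chip to mirror.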


Cellebrite is not revealing the nature of the Advanced Unlocking Services' approach. However, it is likely software based, according to Dan Guido, CEO of the security firm Trail of Bits. Guido told Ars that he had heard Cellebrite's attack method may be blocked by an upcoming iOS update, 11.3.

"That leads me to believe [Cellebrite] have a power/timing attack that lets them bypass arbitrary delays and avoid device lockouts," Guido wrote in a message to Ars. "That method would rely on specific characteristics of the software, which explains how Apple could patch what appears to be a hardware issue."

Regardless of the approach, Cellebrite's method almost certainly is dependent on a brute-force attack to discover the PIN. And the easiest way to protect against that is to use a longer, alphanumeric password—something Apple has been attempting to encourage with Touch ID and Face ID, since the biometric security methods reduce the number of times an iPhone owner has to enter a password.

"The long and short of it is that your passcode is required to unlock your phone," Guido said. "Cellebrite cannot magically discover your passcode. They can bypass all the counters and lockouts, but, at the end of the day, they need to brute force your passcode. It can be easy, if you don't have one set or it is only four digits, or it can be difficult, if you set a complex passcode with letters and numbers. As long as your passcode is a sufficient length, then Cellebrite will spend forever trying to brute force it without success."
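Guido's point can be made concrete with a little arithmetic. Apple's published security documentation has said the passcode-entangled key derivation is tuned to take roughly 80 ms per attempt on the device; taking that figure as an assumption (the function name and constant here are illustrative, not from the article), passcode length dominates everything else:

```python
def brute_force_time(alphabet_size: int, length: int,
                     seconds_per_guess: float = 0.08) -> float:
    """Worst-case seconds to try every passcode of the given length,
    assuming ~80 ms per on-device guess and no per-guess lockouts."""
    return (alphabet_size ** length) * seconds_per_guess

# Numeric PINs fall quickly once counters and lockouts are bypassed...
assert brute_force_time(10, 4) / 60 < 14      # 4 digits: under 14 minutes
assert brute_force_time(10, 6) / 3600 < 23    # 6 digits: under a day
# ...while a 10-character lowercase+digit passcode does not:
years = brute_force_time(36, 10) / (3600 * 24 * 365)
# on the order of nine million years at this guess rate
```

The numbers shift with the assumed per-guess cost, but the exponential gap between a short numeric PIN and a modest alphanumeric passcode does not.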


Sean Gallagher
Sean is Ars Technica's IT and National Security Editor. A former Navy officer, systems administrator, and network systems integrator with 20 years of IT journalism experience, he lives and works in Baltimore, Maryland. Email: sean.gallagher@arstechnica.com // Twitter: @thepacketrat

144 Reader Comments

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

"Apple uses several keys that are burned into the silicon of the Encryption Engine in the Secure Enclave Processor. What makes the encryption difficult to break is that the encryption KEY is constructed from four separate pieces.

- The user's passcode is used to generate a one-way hash, which is then used as a portion of the key.
- A device Group ID (GID) that remains the same for each unit in that product model, burned into the silicon. This is the only part that is known.
- A Random Unique ID (UID) that is burned into the silicon when the chip is made and not recorded anywhere.
- A truly entropic random number, environmentally sampled from the device's microphone, cameras, GPS sensors, position sensors, barometer, accelerometer, etc., at the time the user first inputs his/her passcode and stored in the Secure Enclave's EPROM.

All four of these are entangled by one of the algorithms in a way that can be recreated each time the passcode is entered. This results in a KEY of at least 144 characters, which can include every character in the Apple 223-character set.

This all happens inside a sealed, read-only area of the Secure Enclave that has access only to the limited things it is allowed to access. The main processor cannot access anything inside the Secure Enclave... there is no hardware read/write access from the main processor to that area, so it cannot read those hidden pieces of data to provide them to anything outside, and it can only receive data that is deliberately sent to it by the encryption engine. Even a hardware testing device cannot get access to read what's in it.

There are four inter-registered ICs in each Apple device that, once removed from the device, MUST be re-registered with each other before the device can reboot and the Secure Enclave operate correctly. This is designed to prevent anyone from dismounting the chipsets and attempting to insert the system into a supercomputer to do a fast workaround of the lockout protocols, or to attempt a shaving technique. Essentially, any break-in has to be done on the iPhone itself."
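The "entanglement" the quoted text describes can be loosely sketched with chained HMACs. This is emphatically not Apple's construction (the real entangling happens in silicon inside the SEP, with an algorithm Apple has not published); it only illustrates the idea that the derived key depends on every component, so an attacker missing even one piece gets nothing:

```python
import hashlib
import hmac

def entangle_key(passcode: str, gid: bytes, uid: bytes,
                 entropy: bytes, length: int = 32) -> bytes:
    """Illustrative sketch: derive one key from several secrets by
    chaining HMAC-SHA256 over each component in turn."""
    # One-way hash of the passcode, as the quoted text describes.
    key = hashlib.sha256(passcode.encode()).digest()
    # Fold in the device GID, the unrecorded UID, and the sampled entropy.
    for component in (gid, uid, entropy):
        key = hmac.new(key, component, hashlib.sha256).digest()
    return key[:length]
```

Because the derivation is deterministic, the same passcode plus the same silicon secrets recreates the same key on every unlock, while changing any single input yields an unrelated key.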

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

Which is why I turn my phone off every time I get pulled over, and set my failed attempts wipe to 3.

iCloud can always restore my phone.

I have literally nothing a law enforcement person would want on my phone, but I’m not going to make it easy for them.

Edit - Just realized I can’t set the attempts value before wipe. I swear this was an option in previous versions of iOS.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is why you hit your power button 5 times in rapid succession as soon as cops are about.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

Also, if the perp just looks away, it will fail. Face ID will only unlock if your eyes are looking at it. After a few failed attempts, Face ID is shut down and the password is required to re-enable it.

"Apple uses several keys that are burned into the silicon of the Encryption Engine in the Secure Enclave Processor. What makes the encryption difficult to break is that the encryption KEY is constructed from four separate pieces.

- The user's passcode is used to generate a one-way hash, which is then used as a portion of the key.
- A device Group ID (GID) that remains the same for each unit in that product model, burned into the silicon. This is the only part that is known.
- A Random Unique ID (UID) that is burned into the silicon when the chip is made and not recorded anywhere.
- A truly entropic random number, environmentally sampled from the device's microphone, cameras, GPS sensors, position sensors, barometer, accelerometer, etc., at the time the user first inputs his/her passcode and stored in the Secure Enclave's EPROM.

All four of these are entangled by one of the algorithms in a way that can be recreated each time the passcode is entered. This results in a KEY of at least 144 characters, which can include every character in the Apple 223-character set.

This all happens inside a sealed, read-only area of the Secure Enclave that has access only to the limited things it is allowed to access. The main processor cannot access anything inside the Secure Enclave... there is no hardware read/write access from the main processor to that area, so it cannot read those hidden pieces of data to provide them to anything outside, and it can only receive data that is deliberately sent to it by the encryption engine. Even a hardware testing device cannot get access to read what's in it.

There are four inter-registered ICs in each Apple device that, once removed from the device, MUST be re-registered with each other before the device can reboot and the Secure Enclave operate correctly. This is designed to prevent anyone from dismounting the chipsets and attempting to insert the system into a supercomputer to do a fast workaround of the lockout protocols, or to attempt a shaving technique. Essentially, any break-in has to be done on the iPhone itself."

If this is true, these Israeli guys must be pretty damn clever!

Thanks for this. This is extremely interesting.

I’m impressed at Apple’s engineering here. It would be interesting to see an article focusing on the details of how all of this works.

I actually feel better about my iPhone purchase switch from Android a few years ago after reading this.

They COULD be inhibiting the attempts count, but they could also be extracting the hashed password and salt from the SEP and brute-forcing a solution without involving the phone at all.

As far as I know, the hash is not stored anywhere, and the salt is useless - it's not considered a secret under any security model I've ever heard of.

Your password is verified when it successfully decrypts the data. There is no hash of your password.

Quote:

But, that's mostly beside the point. If they can get the device into a state that they CAN guess solutions quickly there isn't a reasonable password length that will stand up to brute-force guessing for very long.

They can't guess solutions quickly. It has to be done by the chip on the phone, which has unreadable memory that is part of the decryption key, and that chip is deliberately underpowered - it can only make a certain number of guesses per second.

If a user has a six-digit passcode, then even with the slow chip a brute force won't take too long. But if they have a strong alphanumeric password, then forget about it.

Six-digit passcodes are supposed to be protected by the chip getting even slower after repeated failures. Instead of just being hampered by a weak processor, it has an onboard clock that has to reach a certain time before it will allow another guess. Speculation is that this system has been bypassed - it is a known vulnerability since it's done in software rather than hardware.
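Apple's iOS security documentation has described this escalating schedule: the first four failures cost nothing, then delays of 1, 5, 15, and 15 minutes, then an hour from the ninth failure on. A sketch of that schedule as a hypothetical helper (the function name is illustrative); if this bookkeeping really is enforced in software, as the comment speculates, bypassing it removes the only meaningful rate limit:

```python
def lockout_delay(failed_attempts: int) -> int:
    """Minutes to wait before the next passcode attempt, following the
    escalating schedule Apple has published for iOS: four free tries,
    then delays growing to an hour per attempt."""
    schedule = {5: 1, 6: 5, 7: 15, 8: 15}
    if failed_attempts < 5:
        return 0
    return schedule.get(failed_attempts, 60)
```

At one guess per hour, even a four-digit PIN would take over a year to exhaust, which is why defeating the delay logic (rather than the encryption itself) is the plausible attack.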

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

Which is why I turn my phone off every time I get pulled over, and set my failed attempts wipe to 3.

iCloud can always restore my phone.

I have literally nothing a law enforcement person would want on my phone, but I’m not going to make it easy for them.

Edit - Just realized I can’t set the attempts value before wipe. I swear this was an option in previous versions of iOS.

If they were determined enough they could just ask Apple for your iCloud backup.

You mean they could get a warrant with which Apple could not comply even if they wanted to.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

Which is why I turn my phone off every time I get pulled over, and set my failed attempts wipe to 3.

iCloud can always restore my phone.

I have literally nothing a law enforcement person would want on my phone, but I’m not going to make it easy for them.

Edit - Just realized I can’t set the attempts value before wipe. I swear this was an option in previous versions of iOS.

If they were determined enough they could just ask Apple for your iCloud backup.

You mean they could get a warrant with which Apple could not comply even if they wanted to.

Encryption is nice like that.

iCloud backups are not encrypted* and Apple routinely does comply with warrants to access the data.

You can't encrypt any mass-market backup solution. If somebody forgot their password, they'd never be able to restore their data... it's a feature only niche power-user backup solutions offer.

(*) technically they are encrypted, but Apple has the key - which is stored in a secure location that will only ever be accessed in response to a warrant.
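The escrow arrangement that footnote describes can be sketched as encryption where the service, not the user, holds the key. Everything below is illustrative (the class name is invented, and the SHA-256 counter-mode stream cipher is a toy, not safe for real use); the point is only that a provider holding the key can both comply with a warrant and rescue a user who forgot their password:

```python
import hashlib
import secrets

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher (SHA-256 in counter mode). Illustration only;
    use a vetted AEAD cipher for anything real."""
    out = bytearray()
    for offset in range(0, len(data), 32):
        pad = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        chunk = data[offset:offset + 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

class EscrowedBackupService:
    """Provider-held-key model: backups are encrypted in storage, but
    the service keeps the key, so it can decrypt under a warrant."""
    def __init__(self):
        self._master_key = secrets.token_bytes(32)  # never leaves provider

    def store(self, plaintext: bytes) -> bytes:
        return _keystream_xor(self._master_key, plaintext)

    def decrypt_for_warrant(self, ciphertext: bytes) -> bytes:
        # Possible precisely because the provider, not the user, holds the key.
        return _keystream_xor(self._master_key, ciphertext)
```

Contrast this with a key derived from a user passphrase the provider never sees: there, a forgotten password means permanently lost data, which is the trade-off the comment is pointing at.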

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is why you hit your power button 5 times in rapid succession as soon as cops are about.

(disables biometrics)

That's the old way. In iOS 11 you hold power and volume up or down for 2 seconds.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is why you hit your power button 5 times in rapid succession as soon as cops are about.

(disables biometrics)

That's the old way. In iOS 11 you hold power and volume up or down for 2 seconds.

Both work with the same result. I just tested on my iPhone 8 running 11.2.6.

Adding to Abhi Beckert's posts above regarding the unlocking method, all signs are that this is still a brute-force method involving making copies of the memory chips. It seems like the same method that was previously used, with tweaks.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is why you hit your power button 5 times in rapid succession as soon as cops are about.

(disables biometrics)

That's the old way. In iOS 11 you hold power and volume up or down for 2 seconds.

Power button 5 times (which I can do in 1-2 seconds) still works in iOS 11. I did it just now to confirm before posting.

I just tried power+volume up (and power+volume down for good measure) on an iPhone 7 (11.2.6) and it doesn't work. Biometrics still work.

Spamming power button is faster anyway.

edit: as per below, probably an iPhone 8/X vs. 7-or-prior thing.

Either way, point being: simple measures to disable biometrics exist. Learn how to use them on your device of choice.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is why you hit your power button 5 times in rapid succession as soon as cops are about.

(disables biometrics)

That's the old way. In iOS 11 you hold power and volume up or down for 2 seconds.

Both work with the same result. I just tested on my iPhone 8 running 11.2.6.

The iPhone 8 and iPhone X have the new way (the 5 taps is now for Emergency SOS).

The hold power+any volume is nice because you get a vibe indicating it worked.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

Which is why I turn my phone off every time I get pulled over, and set my failed attempts wipe to 3.

iCloud can always restore my phone.

I have literally nothing a law enforcement person would want on my phone, but I’m not going to make it easy for them.

Edit - Just realized I can’t set the attempts value before wipe. I swear this was an option in previous versions of iOS.

If they were determined enough they could just ask Apple for your iCloud backup.

You mean they could get a warrant with which Apple could not comply even if they wanted to.

Encryption is nice like that.

iCloud backups are not encrypted* and Apple routinely does comply with warrants to access the data.

You can't encrypt any mass-market backup solution. If somebody forgot their password, they'd never be able to restore their data... it's a feature only niche power-user backup solutions offer.

(*) technically they are encrypted, but Apple has the key - which is stored in a secure location that will only ever be accessed in response to a warrant.

There’s a difference between being legally compelled to do something and being technically capable.

A warrant has to specifically instruct Apple to decrypt the iCloud backup; requesting information about an iCloud account or Apple user doesn't legally compel them to comply by decrypting an iCloud backup.

That being said, if the police or FBI want my pics and contact info, my random Notes and to see my highest Sudoku score, they can jump through the hoops and be sorely disappointed all they want.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

If you click the on/off switch five times, you must use the passcode to unlock the phone. If your phone is powered off, you must use the passcode. Neither Face ID nor Touch ID will work.

Also with Face ID, you must look purposefully at your phone. Side glances don’t do it. Close your eyes, and it won’t unlock. It’s not too easy to get someone to unlock their phone if they don’t want to.

Before Touch ID, I had a four-digit passcode. With Touch ID, I use a short alphanumeric passcode, because I don't have to enter it every time I use my phone. However, I still find myself having to enter my passcode about two or three times per day, because if your hands aren't perfectly dry, Touch ID won't work.

If I had Face ID, I’d probably switch to a long alphanumeric passcode since Face ID works almost 100% of the time.

Now, you tell me which is more secure? A system that encourages users to use a four-digit passcode (and maybe none at all) or one that encourages users to have a long random alphanumeric passcode?

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

Which is why I turn my phone off every time I get pulled over, and set my failed attempts wipe to 3.

iCloud can always restore my phone.

I have literally nothing a law enforcement person would want on my phone, but I’m not going to make it easy for them.

Edit - Just realized I can’t set the attempts value before wipe. I swear this was an option in previous versions of iOS.

If they were determined enough they could just ask Apple for your iCloud backup.

You mean they could get a warrant with which Apple could not comply even if they wanted to.

Encryption is nice like that.

Apple holds the encryption keys to all iCloud backups, and has given this to law enforcement many times in the past. Apple has been pretty open about this, not sure where you got the idea that it’s encrypted with a key not known by Apple. Encryption only works if the other party is not the one holding the encryption key. It’s good to know these things before blindly trusting companies with your personal data

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

Actually, the Supreme Court has ruled that warrant-less searches of a suspect's phone are illegal.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is wrong on all counts. Except in a few cases, a search warrant is required, at least in the USA. Without one, the search (unlocking the phone with your biometrics without your explicit consent) would be in violation of your civil rights, and any evidence obtained would be inadmissible. So, no law enforcement officer with half a brain* would try that, at least not these days.

* Yes, I understand many of them have less than half a brain, but there are still good ones out there.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

Honestly, I’ve found that simply letting the phone out of my possession, it being handled by others, is usually enough to disable FaceID due to failed attempts on other faces. Basically what happened in the demo at release where FaceID “failed”.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

Also, if the phone has not been unlocked with biometrics in the past eight hours, biometrics will be automatically disabled until you enter the passcode.

Others have already mentioned immediately disabling biometrics by pressing the sleep/wake button and either of the volume buttons on the other side of the phone simultaneously.

A Random Unique ID (UID) that is burned into the silicon when the chip is made and not recorded anywhere.

I just wanted to point out that, as of iPhones with the A9 and above, the UID for the Secure Enclave is now generated by the SE on the first boot of the device and is wholly unique to that SE, mitigating supply-chain attacks that attempt to obtain the UID of the Secure Enclave and its secrets at the time of manufacture.

Data security on iOS devices is on an entirely different level from any other consumer device, and it's constantly improving; the A11 introduced more anti-replay countermeasures in the SE.

I'm pretty sure if somebody has an infinite budget to get into the phone, there's ALWAYS the option of decapping the ICs and trying to read them directly. Error-prone and insanely costly, though.

As far as I know decapping an IC has an unacceptably high risk of destroying part of the data that is necessary to determine the decryption key.

It's an absolute last resort which is unlikely to succeed and will make the data permanently unreadable when it fails.

... yeah? That's what I said. I don't know if I'd go so far as "unlikely," but it's certainly not something you do to every phone.

Ars commenters leave me confused a lot of the time. What on earth made my version of this offensive, but his brilliant? Was it because I'm sad about RIM? We're allowed to be sad about failed tech companies, people.

Except a law enforcement officer that wishes to unlock an iPhone X need only hold it up to a hand-cuffed perp and see if it unlocks. Nothing biometric is considered "secret" and actually accepting ownership of the device or not is rendered moot if your mug is the one that unlocks it.

This is true if the device is powered on, and it's also true for devices with Touch ID under the same circumstances. But if the device has been powered off, the passcode is required again before initial boot.

Which is why I turn my phone off every time I get pulled over, and set my failed attempts wipe to 3.

iCloud can always restore my phone.

I have literally nothing a law enforcement person would want on my phone, but I’m not going to make it easy for them.

Edit - Just realized I can’t set the attempts value before wipe. I swear this was an option in previous versions of iOS.

If they were determined enough they could just ask Apple for your iCloud backup.

You mean they could get a warrant with which Apple could not comply even if they wanted to.

Encryption is nice like that.

Apple holds the encryption keys to all iCloud backups, and has given this to law enforcement many times in the past. Apple has been pretty open about this, not sure where you got the idea that it’s encrypted with a key not known by Apple. Encryption only works if the other party is not the one holding the encryption key. It’s good to know these things before blindly trusting companies with your personal data

If they were determined enough they could just ask Apple for your iCloud backup.

You mean they could get a warrant with which Apple could not comply even if they wanted to.

Encryption is nice like that.

Apple holds the encryption keys to all iCloud backups, and has given this to law enforcement many times in the past. Apple has been pretty open about this, not sure where you got the idea that it’s encrypted with a key not known by Apple. Encryption only works if the other party is not the one holding the encryption key. It’s good to know these things before blindly trusting companies with your personal data

quoting an article from Feb. 25th 2016 is cute and all but I believe your information is out of date.

I know for a fact that Mac backups created on an encrypted hard drive are impossible for Apple to open. You have to create a special emergency key in case you get locked out. I don't know how much less secure iPhone backups are, but you have the option to never back up into the cloud. It is not difficult, and you can still have local backups on your encrypted computer.

It's not out of date, I follow these things closely. If you want to make unsubstantiated claims, you should back them up. If you want something more recent, here's a link to Apple's documents:

If they were determined enough they could just ask Apple for your iCloud backup.

You mean they could get a warrant with which Apple could not comply even if they wanted to.

Encryption is nice like that.

Apple holds the encryption keys to all iCloud backups, and has given this to law enforcement many times in the past. Apple has been pretty open about this, not sure where you got the idea that it’s encrypted with a key not known by Apple. Encryption only works if the other party is not the one holding the encryption key. It’s good to know these things before blindly trusting companies with your personal data

1) Encryption someone else can unlock might be inherently insecure, but is still technically encryption. There isn’t even a way to argue this - it’s still encrypted.

2) I don’t store anything anywhere of any importance that Apple or anyone else could ever access.

3) I’m okay with only Apple having the key to my iCloud encryption because I’m aware of it and the things stored there are highly unlikely to be compromised, and if they are pose no other risk to me.

4) To provide iCloud details Apple has to be specifically warrant-requested to do so. I have zero situations in which this might happen and if it did - again, there’s nothing to be found.

Feds and cops can have my music history, contacts, notes and docs - it’s fundamentally worthless to anyone else.

For your first point, I never claimed it's not encryption, just that it's useless in this context since Apple has the keys. For the rest of your points, that's fine for you, but you and other people should be aware of it. Other people care much more about their privacy, especially when mobile devices often contain more personal data than our computers do.

Why would anyone with sensitive or incriminating data back up to iCloud? Are hard disks too expensive these days?

Not just sensitive information, generally anyone who cares about their privacy should back up their device locally. Backups for iOS devices don't even require much disk space since the OS and apps aren't included in the backups, only the data is.