In a move sure to complicate the ongoing Apple vs. FBI court case, Apple is reportedly developing stronger iCloud encryption methods that would prevent even it from accessing and extracting user data protected by a passcode.

Racks of Apple's iCloud servers in Maiden, NC

Citing sources familiar with Apple's plans, The Wall Street Journal on Tuesday reported that while preparations for a more secure iCloud are underway, executives are still trying to find a workable balance between strong encryption and customer convenience.

Currently, iCloud can be configured to store daily device backups, messages, photos, notes and other data, much of which is accessible by Apple. But the purported plan is to encrypt that data and restrict access to holders of user-created passkeys. Apple's supposed encryption plans were first reported by the Financial Times in late February.

If Apple does enact stronger iCloud security measures, particularly those that would render warrants for data access moot, it could exacerbate an already tenuous situation. The company is embroiled in a heated court battle over the unlocking of an iPhone used by San Bernardino shooter Syed Rizwan Farook. Apple was compelled by a federal magistrate judge to help in FBI efforts to break into the device, but the company has so far resisted, sparking a contentious debate over privacy rights and national security.

Currently, iCloud backups are Apple's go-to, non-destructive option for law enforcement requests. And for agencies like the FBI, iCloud has quickly become the only way to access data as part of a criminal investigation.

Apple introduced strong on-device encryption with iOS 8, making it nearly impossible to extract usable intel from hardware running the latest OS. Certain information, however, is sent up to the cloud and can potentially be accessed by Apple on behalf of the government. That all ends if Apple puts encryption keys wholly in the hands of its customers.

A version of this all-or-nothing strategy is already up and running in iCloud Keychain. The feature lets users store especially sensitive data like passwords and credit card information that can be accessed remotely, synced and transferred to and from other devices. Apple is unable to decrypt and read the data, but at the same time it can't restore or retrieve the information if a user loses or forgets their password.

It remains unclear when Apple intends to implement the iCloud changes, if at all. However, given the company's intractable stance on strong encryption, consumers could see enhancements roll out sooner rather than later.

We can cheer Apple on, but if the government wins and decides to fine Apple per iPhone for not being able to circumvent encryption, this could cause Apple to price the iPhone out of most people's reach.

The biggest setback would be other countries not trusting Apple anymore.

It isn't hard to do this. Just set it to be encrypted based on a passphrase you invent.

Problem is....

People choose bad passphrases, which can easily be brute-forced offline since Apple will still hand over the (encrypted) data. People forget them. And if they forget, they lose everything.

Of course, one could forget the password but still have their iPhone. Remake the password and resync. If you lose your device, though, you're screwed if you've forgotten your "key."

Might blow some people's minds here, but Google already does this with Chrome Sync, and Mozilla with Firefox Sync. They can't read the browser history you sync with them if you pick a passphrase. Even if that passphrase is the same as your main account password, using it still doesn't grant them access.
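The offline brute-force risk mentioned above is easy to demonstrate. Here is a minimal Python sketch, assuming (hypothetically) that the backup key is derived from the passphrase via PBKDF2 and that an attacker has obtained the encrypted data, the salt, and a way to test candidate keys; the salt, iteration count, and wordlist are invented for illustration:

```python
import hashlib

# Hypothetical parameters an attacker would learn alongside the data.
SALT = b"per-user-salt"
ITERATIONS = 100_000

def derive_key(passphrase: str) -> bytes:
    """Derive an encryption key from a passphrase via PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), SALT, ITERATIONS)

# The user's (weak) choice produces this key.
target_key = derive_key("letmein")

# An offline dictionary attack: try common passphrases until one matches.
wordlist = ["password", "123456", "qwerty", "letmein", "iloveyou"]
cracked = next((w for w in wordlist if derive_key(w) == target_key), None)
print(cracked)  # -> letmein
```

A slow KDF raises the per-guess cost, but no iteration count saves a passphrase that sits in the first few thousand entries of a wordlist.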

They'll have to move iCloud servers outside the US if they want real security with strong encryption.

Yes, I see your point, BUT which country will protect one's digital data? Will that country eventually require Apple to surrender the user data? Very tricky.

Why not just essentially torrent the crap out of it, meaning bit-slice it across 10+ countries? Only the client knows where the files really are. Each file could even have its own set of countries. Make it a jurisdictional nightmare to recover, not just an encryption nightmare.

Since you have to have access to the phone to know where the files are, that really ups the ante on recovery :-). Obviously, if you lose your phone you're really, really screwed, unless you can back up the recovery directory in a locally encrypted cache with a long passcode (not the same as your phone's, hopefully). No one knows where the files are but you :-).
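The bit-slicing idea above could be sketched roughly as follows. Everything here is invented for illustration (region codes, shard size, naming scheme), and a real design would need redundancy and a protected manifest; the point is only that the client-held manifest is what maps shards back together:

```python
import secrets

# Hypothetical storage "regions"; in the commenter's scheme each would be a
# separate jurisdiction. Shard size is tiny just for the demo.
REGIONS = ["ch", "is", "nz", "jp", "br"]
SHARD_SIZE = 4  # bytes

def scatter(ciphertext: bytes):
    """Split an already-encrypted blob into shards spread across regions.
    The manifest (region, shard-name pairs) stays on the client only."""
    shards = [ciphertext[i:i + SHARD_SIZE]
              for i in range(0, len(ciphertext), SHARD_SIZE)]
    stores = {r: {} for r in REGIONS}
    manifest = []
    for idx, shard in enumerate(shards):
        region = REGIONS[idx % len(REGIONS)]
        name = secrets.token_hex(8)  # opaque name, no link to the user
        stores[region][name] = shard
        manifest.append((region, name))
    return stores, manifest

def gather(stores, manifest) -> bytes:
    """Reassemble the blob; impossible without the client-held manifest."""
    return b"".join(stores[r][n] for r, n in manifest)

blob = b"already-encrypted-backup-bytes"
stores, manifest = scatter(blob)
assert gather(stores, manifest) == blob
```

Without the manifest, each region holds only anonymous fragments of ciphertext, which is the "jurisdictional nightmare" the comment describes.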

It isn't hard to do this. Just set it to be encrypted based off a passphrase you invent.

Yep... it all comes down to the people using this stuff.

LastPass says in big letters "Choose a long master password. Do not lose your password. We do not know your password. We cannot give it to you if you forget it..."

I'm sure Apple could do something similar.

But yeah... the burden is still on the users to be responsible... and that doesn't always happen.

They'll have to move iCloud servers outside the US if they want real security with strong encryption.

Why? Let's say Apple gives you the option of using a pass phrase that isn't stored anywhere on Apple's servers or even on your iPhone (well, a hash of it IS stored on your iPhone, but not the original pass phrase).

Whenever your iPhone connects to iCloud to back up data, the data first gets encrypted locally on your iPhone (using your pass phrase) and then gets sent for backup. The iCloud servers would be storing already-encrypted data. If someone got hold of it, it would be useless without the pass phrase (they'd have to brute-force it, which could take a long time depending on the pass phrase you chose).
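The encrypt-before-upload flow described above can be sketched as follows. This is a toy illustration, not Apple's actual scheme: a real client would use an authenticated cipher such as AES-GCM, but the HMAC-based keystream below is a stdlib-only stand-in that keeps the example self-contained:

```python
import hashlib
import hmac
import secrets

def derive_key(passphrase: str, salt: bytes) -> bytes:
    """Stretch the user's pass phrase into a 32-byte key."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

def keystream_xor(key: bytes, nonce: bytes, data: bytes) -> bytes:
    """XOR data against an HMAC-derived keystream (toy CTR-style cipher).
    Applying it twice with the same key and nonce decrypts."""
    out = bytearray()
    for block in range((len(data) + 31) // 32):
        pad = hmac.new(key, nonce + block.to_bytes(4, "big"),
                       hashlib.sha256).digest()
        chunk = data[block * 32:(block + 1) * 32]
        out.extend(b ^ p for b, p in zip(chunk, pad))
    return bytes(out)

salt, nonce = secrets.token_bytes(16), secrets.token_bytes(16)
key = derive_key("correct horse battery staple", salt)

backup = b"messages, photos, notes ..."
uploaded = keystream_xor(key, nonce, backup)      # what the server stores
assert keystream_xor(key, nonce, uploaded) == backup  # only the key holder can reverse it
```

The server only ever sees `uploaded`; without the pass phrase there is nothing on Apple's side to hand over but ciphertext.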

Apple could even take this further. The "username" the data gets stored under isn't your actual Apple ID, but a token that's an encrypted representation of your Apple ID hashed with your iCloud pass phrase. So even Apple employees themselves wouldn't be able to link a set of iCloud data to a specific user or device (like they can do now, presumably because there's an identifier linked to your Apple ID and device).
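The opaque-token idea could look something like this minimal sketch; the derivation label, iteration count, and function name are invented, not anything Apple actually does:

```python
import hashlib

def storage_token(apple_id: str, passphrase: str) -> str:
    """Derive an opaque storage identifier from the Apple ID and the user's
    iCloud pass phrase, so server-side records carry no direct account link."""
    material = (apple_id + "\x00" + passphrase).encode()
    return hashlib.pbkdf2_hmac(
        "sha256", material, b"icloud-token-v1", 100_000
    ).hex()

token = storage_token("user@example.com", "correct horse battery staple")
# Same inputs always yield the same token, so the device can find its data;
# the server sees only the token, never the Apple ID behind it.
```

Since the pass phrase goes into the derivation, even an employee with the full dataset and a list of Apple IDs couldn't compute which token belongs to which account.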

Hell, let's forget pass phrases altogether. We know that Apple takes your fingerprint and stores a mathematical representation (a hash) of it in the Secure Enclave, in a format that makes it impossible to reverse back to the original print. What makes it so impressive is that you can use a PORTION of your finger at different angles or orientations on the fingerprint sensor and Touch ID still somehow manages to match this against the stored representation. So why not use your fingerprint as the pass phrase that encrypts your iCloud data? Now you can't forget it, and because of the complexity of your fingerprint you'd have the equivalent of a very long pass phrase (practically impossible to brute-force).

I don't think security rides so much on WHERE the servers are, but on HOW the data is stored. They could be treated as "dumb" servers that simply store information sent to them, without actually knowing anything about the data itself or who it belongs to.

And for agencies like the FBI, iCloud has quickly become the only way to access data as part of a criminal investigation.

So why did the FBI order San Bernardino County's admins to change the shooter's iCloud password? To prevent the iPhone (which the County issued to the shooter) from backing up to iCloud. So they could cherry-pick this highly emotional case to use as leverage against iPhone security and privacy. Blatantly obvious. Shameful. And they think they can get away with it.

I don't think security rides so much on WHERE the servers are, but on HOW the data is stored.

But they can still trace the data packets, so they could find your data even if they can't decrypt it (if you're using that crazy-ass encryption method). By blasting the file locations apart (encrypt those too) across jurisdictions, you can make even the encrypted data unrecoverable unless you have access to the device. That's crazy-ass paranoia, but hey, sometimes that may be needed. It should be on demand, since such backups and syncs would potentially be slower. Like I said, if you do that, there is truly no hope for even the NSA (which in films can do anything).

PS: Of course, this slicing is done through Tor or something like that to up the level of crazy even more :-).

Hell, let's forget pass phrases altogether. ... So why not make a fingerprint as the pass phrase that encrypts your iCloud data?

I think one of the easiest exercises for a government is to get your fingerprint. Unless you remove your fingerprint (acid?) or die and get brutally mutilated, the government will get access to your data. To my mind, the only passkey that can give some security has to be in your mind only. As soon as you rely on a physical object (your finger), you are hackable by organizations with enough money to spend.

One thing I don't understand in this whole debate is why Apple has been so keen to stress how much it has tried to help the FBI get at the iCloud data, suggesting ways to access it. So on the one hand they are saying privacy is so important and it's a slippery slope to create a system to bypass it, and on the other hand they are suggesting ways to (indirectly) bypass it. Is that just because it is technically possible, so they would have no defence in not handing the data over?

Ah, Apple, the company that all major terrorist organizations around the world recommend for their members' use...

And therein lies the rub.

How would Apple protect 500 million iPhone users' privacy... while simultaneously providing law enforcement access to 50,000 terrorists' iPhones?

Imagine what sort of information you could find on people's phones these days: their home address, pictures of their children, their children's school, schedules, emails, access to door locks, garage door openers, health data, etc. Do you really want that stuff to be easily accessible to any common criminal?

I certainly don't. It should be as secure as it can possibly be.

But by keeping that information secure... it also prevents law enforcement from getting into criminals' phones too.

There's no way to selectively make some phones secure while making other phones easy to open.

So what does this mean for users being able to set up a device from an iCloud backup? Will that still be possible, or are you SOL if you forget your password?

That's the issue that they're wrestling with - how to make it secure yet still usable. You'll still be able to set up a device, say a new replacement iPhone, from a backup but ONLY if you have the "access code" or whatever. Maybe like the 2FA "Recovery Key", or perhaps something different.
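A 2FA-style recovery key is straightforward to generate client-side. This sketch uses an invented format (Apple's real Recovery Key format may differ); the alphabet drops visually ambiguous characters like 0/O and 1/I:

```python
import secrets

# 32-symbol alphabet with no ambiguous characters (assumed format).
ALPHABET = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"

def make_recovery_key(groups: int = 4, group_len: int = 4) -> str:
    """Generate a high-entropy recovery key, e.g. 'K7QX-29MN-PTZ4-H8WC'.
    Shown to the user once; the service keeps no recoverable copy."""
    return "-".join(
        "".join(secrets.choice(ALPHABET) for _ in range(group_len))
        for _ in range(groups)
    )

print(make_recovery_key())
```

Sixteen symbols from a 32-character alphabet give 80 bits of entropy, far beyond what a typical memorized password provides, which is exactly why losing it is unrecoverable.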

What happens now if someone using 2FA forgets their password and doesn't have the recovery key? Could they not restore from iCloud backup?