The thing is, if Apple or Google have a way of bypassing the security to remove the data, then it's only a matter of time before third parties find a way to do so as well, thus defeating the security. Basically there should not be any backdoors, or the device isn't secure in the first place.

I really am not comfortable with the regularization of this process and others like it. One-off requests for accommodation are OK with me philosophically, but I really don't like how law enforcement is farming out the creation of an all-encompassing dragnet to private parties it believes capable of doing the work for it, especially when it is done to circumvent constitutional restrictions.

I guess there are no constitutional issues with this, unlike the warrantless wiretaps, etc., but it doesn't pass the sniff test for me.

The longer the wait, the better. I hope companies like Apple and Google are assigning the absolute bare minimum of resources to these questionable practices, so it can reach a point where law enforcement officials learn to respect proper chain of custody with all digital evidence.

Devil's advocate: What's to stop an Apple employee from tampering with the 'evidence'?

Apple should start including a thermite option in one of its iPhone models.

I shall marvel at the design of the charge, shaped out of a single, solid piece of thermite. As it burns through my hand, I note how the rounded corners recede perfectly uniformly: engineering of the highest order! As I stare at the bloody pulp at the end of my arm, I see the iHeat has cut a perfect cylindrical hole through the flash storage. It just works.

The thing is, if Apple or Google have a way of bypassing the security to remove the data, then it's only a matter of time before third parties find a way to do so as well, thus defeating the security. Basically there should not be any backdoors, or the device isn't secure in the first place.

I don't know the specifics of the iPhone crypto. Generally speaking, there are usually (at least) two private keys for decrypting mass-distributed file encryption such as that found in the popular OSes. One is a private key which belongs to the local user; that's how you can log into the device and access all the files seamlessly. The other is a recovery agent, which typically belongs to an IT department, the local administrator, or the device manufacturer, and which can be used to decrypt everyone's files.

That's probably what's going on. Apple has a recovery agent key which can decrypt every iPhone out there. Depending on how strict their internal security policies are (we don't know; don't confuse their epic product-roadmap secrecy with proper computer security), that key could fall into the wrong hands, and then every iPhone would be vulnerable from that point on. Apple would need to issue a new crypto chain and decrypt and then re-encrypt every iPhone with a new key pair. This would likely be distributed as an iOS update or maybe through iTunes sync.
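If that guess is right, the general pattern is envelope encryption: one random file key, stored wrapped once under the user's secret and once under an escrow ("recovery agent") secret. Here's a toy Python sketch of the idea; the passphrases, XOR wrapping, and KDF settings are all invented for illustration and have nothing to do with Apple's actual design:

```python
import hashlib
import secrets

def derive_key(passphrase, salt):
    # Derive a 32-byte wrapping key from a passphrase (toy KDF settings).
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 10_000)

def xor_bytes(a, b):
    # Toy "wrap" operation; real systems use an authenticated cipher.
    return bytes(x ^ y for x, y in zip(a, b))

# One random file key protects the data...
file_key = secrets.token_bytes(32)
salt = secrets.token_bytes(16)

# ...but it is stored wrapped twice: once under the user's passcode,
# and once under the vendor's escrow secret.
user_wrapped = xor_bytes(file_key, derive_key(b"user-pin-1234", salt))
escrow_wrapped = xor_bytes(file_key, derive_key(b"vendor-escrow-secret", salt))

# Either party can recover the same file key independently:
assert xor_bytes(user_wrapped, derive_key(b"user-pin-1234", salt)) == file_key
assert xor_bytes(escrow_wrapped, derive_key(b"vendor-escrow-secret", salt)) == file_key
```

The point of the sketch is that the holder of the escrow secret can recover every file key without ever knowing the user's passcode, which is exactly why such a key is so dangerous if it leaks.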

Apple should start including a thermite option in one of its iPhone models.

I shall marvel at the design of the charge, shaped out of a single, solid piece of thermite. As it burns through my hand, I note how the rounded corners recede perfectly uniformly: engineering of the highest order! As I stare at the bloody pulp at the end of my arm, I see the iHeat has cut a perfect cylindrical hole through the flash storage. It just works.

Assuming the guy is in prison and can't log into the site to wipe it, even if he has one of those shady movie lawyers that will "pass along a message" to someone that will remote wipe it for him... it won't work if the cops just turned the phone off when they arrested him.

The thing is, if Apple or Google have a way of bypassing the security to remove the data, then it's only a matter of time before third parties find a way to do so as well, thus defeating the security. Basically there should not be any backdoors, or the device isn't secure in the first place.

That is completely true on open systems and always true in theory, but with the sort of closed hardware security Apple implements, there might in practice be something of an exception, just like with hardware DRM itself. Like Dilbert said, if Apple's backdoor uses a secondary crypto key, that key could still be pretty hard to acquire in general. We don't have enough information, though, on what exactly Apple/Google can do. Are they really using their own key to decrypt anything on demand from law enforcement, or do they just have better tools to do something else? It's still very poor practice either way, and it's why proprietary encryption solutions really cannot ever be fully trusted. Most people probably won't care, but it's certainly something anyone who cares about security will need to take into consideration.

To remotely wipe a device, you need to have wireless access. This is trivial to block. Furthermore, you would be immediately nailed for utterly undeniable destruction of evidence. So no, remote wipe doesn't apply here at all.

The thing is, if Apple or Google have a way of bypassing the security to remove the data, then it's only a matter of time before third parties find a way to do so as well, thus defeating the security. Basically there should not be any backdoors, or the device isn't secure in the first place.

I don't know the specifics of the iPhone crypto. Generally speaking, there are usually (at least) two private keys for decrypting mass-distributed file encryption such as that found in the popular OSes. One is a private key which belongs to the local user; that's how you can log into the device and access all the files seamlessly. The other is a recovery agent, which typically belongs to an IT department, the local administrator, or the device manufacturer, and which can be used to decrypt everyone's files.

That's probably what's going on. Apple has a recovery agent key which can decrypt every iPhone out there. Depending on how strict their internal security policies are (we don't know; don't confuse their epic product-roadmap secrecy with proper computer security), that key could fall into the wrong hands, and then every iPhone would be vulnerable from that point on. Apple would need to issue a new crypto chain and decrypt and then re-encrypt every iPhone with a new key pair. This would likely be distributed as an iOS update or maybe through iTunes sync.

Or depending on the level of security, a third party reverse engineers the key and it is exposed that way. In the end, including such a mechanism leaves you very vulnerable. At a minimum it should be up to the customer to choose whether that key is used, for instance to make it possible to pay Apple or Google to retrieve your data should you lose your key, or not used should you wish for a more secure environment. Of course, this is all just speculation. It'd be nice to know the details of how it works, and if there is a way to disable it.

The thing is, if Apple or Google have a way of bypassing the security to remove the data, then it's only a matter of time before third parties find a way to do so as well, thus defeating the security. Basically there should not be any backdoors, or the device isn't secure in the first place.

I don't know the specifics of the iPhone crypto. Generally speaking, there are usually (at least) two private keys for decrypting mass-distributed file encryption such as that found in the popular OSes. One is a private key which belongs to the local user; that's how you can log into the device and access all the files seamlessly. The other is a recovery agent, which typically belongs to an IT department, the local administrator, or the device manufacturer, and which can be used to decrypt everyone's files.

That's probably what's going on. Apple has a recovery agent key which can decrypt every iPhone out there. Depending on how strict their internal security policies are (we don't know; don't confuse their epic product-roadmap secrecy with proper computer security), that key could fall into the wrong hands, and then every iPhone would be vulnerable from that point on. Apple would need to issue a new crypto chain and decrypt and then re-encrypt every iPhone with a new key pair. This would likely be distributed as an iOS update or maybe through iTunes sync.

Or depending on the level of security, a third party reverse engineers the key and it is exposed that way. In the end, including such a mechanism leaves you very vulnerable. At a minimum it should be up to the customer to choose whether that key is used, for instance to make it possible to pay Apple or Google to retrieve your data should you lose your key, or not used should you wish for a more secure environment. Of course, this is all just speculation. It'd be nice to know the details of how it works, and if there is a way to disable it.

Tell me, how does one "reverse-engineer" an encryption key? It seems to me that the more valid concern is that Apple's ethics are dubious, given that this is a service intended to add value for their customers and is instead being used against them with no notice of the possibility having been given at any point in time.

The thing is, if Apple or Google have a way of bypassing the security to remove the data, then it's only a matter of time before third parties find a way to do so as well, thus defeating the security. Basically there should not be any backdoors, or the device isn't secure in the first place.

I don't know the specifics of the iPhone crypto. Generally speaking, there are usually (at least) two private keys for decrypting mass-distributed file encryption such as that found in the popular OSes. One is a private key which belongs to the local user; that's how you can log into the device and access all the files seamlessly. The other is a recovery agent, which typically belongs to an IT department, the local administrator, or the device manufacturer, and which can be used to decrypt everyone's files.

That's probably what's going on. Apple has a recovery agent key which can decrypt every iPhone out there. Depending on how strict their internal security policies are (we don't know; don't confuse their epic product-roadmap secrecy with proper computer security), that key could fall into the wrong hands, and then every iPhone would be vulnerable from that point on. Apple would need to issue a new crypto chain and decrypt and then re-encrypt every iPhone with a new key pair. This would likely be distributed as an iOS update or maybe through iTunes sync.

Or depending on the level of security, a third party reverse engineers the key and it is exposed that way. In the end, including such a mechanism leaves you very vulnerable. At a minimum it should be up to the customer to choose whether that key is used, for instance to make it possible to pay Apple or Google to retrieve your data should you lose your key, or not used should you wish for a more secure environment. Of course, this is all just speculation. It'd be nice to know the details of how it works, and if there is a way to disable it.

No, that's not possible at this time. If it were possible, someone could just reverse engineer your own private key; they wouldn't need to touch Apple's recovery agent (if that's what they've done... I don't know). That would make public/private key pair encryption completely worthless.

Tell me, how does one "reverse-engineer" an encryption key? It seems to me that the more valid concern is that Apple's ethics are dubious, given that this is a service intended to add value for their customers and is instead being used against them with no notice of the possibility having been given at any point in time.

Sorry, "reverse engineer" is not the right description of what I was thinking; cracking the key they used is closer. Of course that would depend on the type and complexity of the key and the encryption used, and if this is what they have done, it's most likely fairly robust. But there might be other means of circumvention that they did not intend. We have seen this in just about every form of DRM to date: the keys almost always manage to get into the wild.

You should just make sure you have someone who can remote wipe the phone.

That only works if the phone has a data connection. If it were me, I'd put the phone in a metal case (a Faraday cage) first thing, as soon as I got hold of it, prior to performing any forensics work, just to prevent a remote wipe.

Obviously different law enforcement agencies will have different levels of tech savvy.

A rural sheriff may only go as far as pressuring the suspect into revealing the PIN and then examining the phone through the GUI.

The FBI may go all out with a Faraday cage, decryption, and dumping the flash data through the dock connector onto their own storage, then going to town examining every raw byte with special forensics tools.

Tell me, how does one "reverse-engineer" an encryption key? It seems to me that the more valid concern is that Apple's ethics are dubious, given that this is a service intended to add value for their customers and is instead being used against them with no notice of the possibility having been given at any point in time.

Unless somewhere in Apple's advertising you can produce quotes of the company promising that its encryption services would be impervious to law enforcement, I don't think you have much of an argument.

Rather, everyone who uses these devices should take note that they are not secure and will not serve as bastions of privacy and protection should you be using them to plan or plot illegal activities. I.e., don't expect your cell phone to cover for you if you go off the deep end. That's the healthiest attitude to have on the subject, imo.

This article makes me wonder about iPhone users' security practices. It is my understanding that a 4-digit numeric PIN on an iPhone can be brute forced in about 40 minutes, in spite of the iPhone's rather well-implemented encryption scheme, which uses PBKDF2 and requires that the brute-force attack be run on the phone itself rather than on an "offline" data image.

I always assumed that the vast majority of iPhone users had only a very weak PIN. Am I wrong, or is the ATF just unaware they can use brute-force attacks to get this data much sooner than 7 weeks?

The waiting list is probably there to avoid abuse of the decryption service. I would guess that Apple does not use the GOD MODE key more than twice a day, to allow for record keeping. Maybe they require a court document and for someone to present the phone for decryption (meaning two appointments a day)... and make them wait like a busy doctor's or dentist's office. LOL

Would this not run afoul of the anti-circumvention rules in the DMCA, much like the exemption the public has been asking for, which is currently being considered?

It's Apple's encryption, they just decrypt using their key or a backdoor they have installed. No circumvention required.

If you're referring to law enforcement, the DMCA won't apply. If they have suspicion that your phone contains evidence of illegal activities, with a warrant they can attempt to get the data off the device in whatever way they can.

The issue here is the perception that Apple is making it easy for them. Traditionally they'd have to attempt to break the encryption using their own tools, potentially at great cost and time. Now they just go to Apple/Google and say "Here's our warrant, give us the data from this." No real effort required.

This article makes me wonder about iPhone users' security practices. It is my understanding that a 4-digit numeric PIN on an iPhone can be brute forced in about 40 minutes, in spite of the iPhone's rather well-implemented encryption scheme, which uses PBKDF2 and requires that the brute-force attack be run on the phone itself rather than on an "offline" data image.

I always assumed that the vast majority of iPhone users had only a very weak PIN. Am I wrong, or is the ATF just unaware they can use brute-force attacks to get this data much sooner than 7 weeks?

The way Data Protection works on the iPhone is that data on your phone is encrypted with a randomly-assigned key of significant length. If you have this feature enabled, those keys are then protected using your passcode, which as you mention, can be as few as four numeric digits.

Things get complicated because if you've enabled Data Protection, I believe the default is that the contents of the device are securely erased after 8 unsuccessful attempts at the passcode.

So police are in a predicament:
* They can't just brute force the passcode without risking wiping the device.
* They can't just open the device up and access the flash directly, because the data is encrypted with a strong key, and that key can only be retrieved if you know the device's passcode.
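That predicament can be modeled in a few lines: a random data key wrapped under the passcode, with the wrapped copy destroyed after too many failures. This is a toy model under invented assumptions (XOR wrapping, a SHA-256 check value, 1,000 PBKDF2 iterations), not Apple's implementation:

```python
import hashlib
import secrets

class ToyDataProtection:
    # Toy model (not Apple's implementation): a random data key is wrapped
    # under the passcode, and too many failed unlocks destroy the wrapped key.
    MAX_ATTEMPTS = 8  # mirrors the erase-after-8 figure mentioned above

    def __init__(self, passcode):
        self._salt = secrets.token_bytes(16)
        data_key = secrets.token_bytes(32)               # strong random key
        self._check = hashlib.sha256(data_key).digest()  # lets us verify an unwrap
        self._wrapped = bytes(a ^ b for a, b in zip(data_key, self._kdf(passcode)))
        self._failures = 0

    def _kdf(self, passcode):
        # Invented KDF settings for the demo.
        return hashlib.pbkdf2_hmac("sha256", passcode, self._salt, 1_000)

    def unlock(self, passcode):
        if self._wrapped is None:
            raise RuntimeError("device wiped: wrapped key destroyed")
        candidate = bytes(a ^ b for a, b in zip(self._wrapped, self._kdf(passcode)))
        if hashlib.sha256(candidate).digest() == self._check:
            self._failures = 0
            return candidate                             # the real data key
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._wrapped = None                         # "secure erase"
        return None
```

Once the wrapped copy is gone, even the correct passcode recovers nothing, which is why guessing carelessly is so risky for investigators.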

Why is everyone talking about Faraday cages when any phone thief knows to pop the battery / SIM immediately? Once the phone is off and outside of known WiFi, you don't have to worry about outside interference.

What Apple is probably doing is taking the encryption key off the flash and bypassing the exponential delay on wrong passcode entries. I'm not sure what the iPhone caps a passcode at these days but offline attack of that keyspace x10,000 PBKDF2 iterations is not cheap.
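To put "not cheap" in rough numbers: both figures below are made-up assumptions (10,000 PBKDF2 iterations per guess, a rig doing a million raw hash computations per second), but they show how iteration count and passcode length multiply:

```python
# Made-up assumptions: 10,000 PBKDF2 iterations per guess, and a cracking
# rig doing 1 million raw hash computations per second.
ITERATIONS = 10_000
HASHES_PER_SEC = 1_000_000
guesses_per_sec = HASHES_PER_SEC / ITERATIONS  # 100 passcode guesses/sec

def worst_case_days(keyspace):
    # Time to try every candidate in the keyspace, in days.
    return keyspace / guesses_per_sec / 86_400

print(worst_case_days(10 ** 4))   # 4-digit PIN: a fraction of a day
print(worst_case_days(62 ** 6))   # 6-char alphanumeric: thousands of days
```

Under these assumptions a 4-digit PIN falls in minutes even through the KDF, while a modest alphanumeric passcode pushes the worst case into years; that's the whole argument for longer passcodes.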

This article makes me wonder about iPhone users' security practices. It is my understanding that a 4-digit numeric PIN on an iPhone can be brute forced in about 40 minutes, in spite of the iPhone's rather well-implemented encryption scheme, which uses PBKDF2 and requires that the brute-force attack be run on the phone itself rather than on an "offline" data image.

I always assumed that the vast majority of iPhone users had only a very weak PIN. Am I wrong, or is the ATF just unaware they can use brute-force attacks to get this data much sooner than 7 weeks?

These 4-digit passcode hacks only work when you have an iPhone 3GS/4 with the insecure bootloader. You dump the NAND and then work on cracking it on a computer, so you can blow through all the combinations rather quickly. An iPhone 4S or iPhone 5 running the latest version is completely secure at the moment. Putting in more than 10 wrong passcodes will wipe the device.
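For the insecure-bootloader case, once the NAND dump is on a computer the search space for a 4-digit PIN is only 10,000 candidates. A hedged Python sketch: the salt, target PIN, and iteration count are all invented, and the iteration count is kept low so the demo runs quickly:

```python
import hashlib

# Invented parameters for illustration; a real image would supply the salt
# and the stored key-check value, and iOS uses far more KDF work than this.
SALT = b"example-device-salt"
ITERATIONS = 1_000  # reduced so the demo finishes fast
target = hashlib.pbkdf2_hmac("sha256", b"0042", SALT, ITERATIONS)

def brute_force_pin(target_key):
    # A 4-digit numeric PIN has only 10,000 possible values.
    for n in range(10_000):
        pin = f"{n:04d}".encode()
        if hashlib.pbkdf2_hmac("sha256", pin, SALT, ITERATIONS) == target_key:
            return pin.decode()
    return None

print(brute_force_pin(target))  # prints 0042
```

The whole point of on-device rate limiting and the wipe-after-10 policy is to deny attackers exactly this kind of offline, full-speed search.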

Assuming the guy is in prison and can't log into the site to wipe it, even if he has one of those shady movie lawyers that will "pass along a message" to someone that will remote wipe it for him... it won't work if the cops just turned the phone off when they arrested him.

And it will either be stored in an area where signal isn't strong enough (basement), or in a faraday bag (like these http://www.faradaybag.com/) to avoid communication with the outside world once they confiscate it.

Why is everyone talking about Faraday cages when any phone thief knows to pop the battery / SIM immediately? Once the phone is off and outside of known WiFi, you don't have to worry about outside interference.

What Apple is probably doing is taking the encryption key off the flash and bypassing the exponential delay on wrong passcode entries. I'm not sure what the iPhone caps a passcode at these days but offline attack of that keyspace x10,000 PBKDF2 iterations is not cheap.

It's not thieves we're talking about; it's police. Ideally they don't want to make changes to the original phone, since that's tantamount to tampering with evidence. Dump the contents and examine the dump. Leave the phone in its original condition.

That's probably what's going on. Apple has a recovery agent key which can decrypt every iPhone out there...

Exactly. Chris Soghoian explained Apple's and Google's policies with regard to encryption at DEFCON 20. Soghoian says that after badgering Apple over their policy, they acknowledged they have a master encryption key. The time delay for law enforcement to get access to the data on a device is a result of Apple's requirement that the police send the actual device to Cupertino. Apple then clones the phone's data and provides the decrypted data from the clone to whomever requested it, via DVD. Apple never actually decrypts the device itself, and the police/law enforcement/etc. never have access to anything but the phone's data as provided by Apple. Soghoian also points out that Apple allegedly requires a warrant to do this.

All this talk of Faraday cages to prevent a remote wipe is silly, at least for most iPhones with a SIM. Here's how it's done:
Step 1) Turn the phone off once you seize it. It's immune to wiping while it's off.
Step 2) Remove the SIM (assuming it isn't CDMA).
Step 3) When you're ready to do forensics, make sure there are no public WiFi networks available.
Step 4) Power up the phone. Remote wipes don't work if it isn't connected to the internet.

Apple should start including a thermite option in one of its iPhone models.

I shall marvel at the design of the charge, shaped out of a single, solid piece of thermite. As it burns through my hand, I note how the rounded corners recede perfectly uniformly: engineering of the highest order! As I stare at the bloody pulp at the end of my arm, I see the iHeat has cut a perfect cylindrical hole through the flash storage. It just works.

All this talk of Faraday cages to prevent a remote wipe is silly, at least for most iPhones with a SIM. Here's how it's done:
Step 1) Turn the phone off once you seize it. It's immune to wiping while it's off.
Step 2) Remove the SIM (assuming it isn't CDMA).
Step 3) When you're ready to do forensics, make sure there are no public WiFi networks available.
Step 4) Power up the phone. Remote wipes don't work if it isn't connected to the internet.

It really isn't that hard to prevent a remote wipe.

Is that how it's done by authorities? Do you have a first hand account of this? Or are you assuming that's how it's done?

I would not remove the SIM because that's altering evidence.

I would not assume the phone would not at some point get within range of an unsecured wifi access point. That assumption could throw the entire case out the window. Why take the chance?

Turning off the phone at the moment of seizure, and then dumping the contents in a lab in a bunker/basement/faraday cage, that's feasible.

All this talk of Faraday cages to prevent a remote wipe is silly, at least for most iPhones with a SIM. Here's how it's done:
Step 1) Turn the phone off once you seize it. It's immune to wiping while it's off.
Step 2) Remove the SIM (assuming it isn't CDMA).
Step 3) When you're ready to do forensics, make sure there are no public WiFi networks available.
Step 4) Power up the phone. Remote wipes don't work if it isn't connected to the internet.