Posted
by
timothy on Thursday May 08, 2014 @12:11PM
from the as-a-public-service dept.

Trailrunner7 (1100399) writes "If law enforcement gets hold of your locked iPhone and has some interest in its contents, Apple can pull all kinds of content from the device, including texts, contacts, photos and videos, call history and audio recordings. In a new document that provides guidance for law enforcement agencies on the kinds of information Apple can provide and what methods can be used to obtain it, the company said that if served with a search warrant, it will help law enforcement agents extract specific application data from a locked iOS device. However, that data appears to be limited to information related to Apple apps, such as iMessage, the contacts and the camera. Email contents and calendar data can't be extracted, the company said in the guidelines."

All the things listed are synced to iCloud. Sounds to me like they are not accessing the phone but the contents of the cloud server, which has push/pull access to selected apps. Wonder if this is true if you disable cloud access or simply don't sign into it.

TFA says that the data can only be accessed at the company HQ, so no, it seems that they are referring to local data that is unencrypted. It also states that they can access some data in the iCloud, too.

Apparently not. It sounds like they're limited to whatever applications are currently running though:

Upon receipt of a valid search warrant, Apple can extract certain categories of active data from passcode locked iOS devices. Specifically, the user generated active files on an iOS device that are contained in Apple’s native apps and for which the data is not encrypted using the passcode (“user generated active files”), can be extracted and provided to law enforcement on external media. Apple can perform this data extraction process on iOS devices running iOS 4 or more recent versions of iOS. Please note the only categories of user generated active files that can be provided to law enforcement, pursuant to a valid search warrant, are: SMS, photos, videos, contacts, audio recording, and call history. Apple cannot provide: email, calendar entries, or any third-party App data.

So what exactly constitutes a "user generated active file"? Some kind of temp file kept open as long as an app is "open"? And what does "open" mean, really? Shows up when you double-click the home button? Many of those apps aren't really running, if you switch to them most seem to revert to cold-start behavior.

It makes me wonder if there's a paranoia step a person could take before entering a known security zone, like force-quitting the native apps in question, or whether powering the device off does the same thing.

Even if I had the source code, it wouldn't do me personally any good as I couldn't grok what it did just from reading it. It would do me as much good as it did 99.99% of OpenSSL users.

Gag letters prohibit what they can say, they don't require them to make false statements of fact. You might make the argument that they could in fact be strong-armed through some extralegal method of making false statements of fact to engender false confidence in potential targets of spying, but that's getting a little into tinfoil hat territory.

In fact, I think an Apple statement of what little they can extract is pretty good and serves as a kind of interesting statement on what they believe is recoverable. It doesn't include third-party techniques or equipment that you might find in an NSA laboratory, but I don't know that Apple makes that kind of penetration test of their own devices.

They mean functional. If you break the screen, they may still do it. Drop it in some water, though, and it may be hosed enough for them to not bother. (Really, the "in good working condition" statement is there for one purpose: it says that they won't go to any extreme measures to make it work. They have a process in place for doing this, and if it's successful, they'll give you the data; if it's not, they're not doing experimental forensics for you.)

From the document: "Please note the only categories of user generated active files that can be provided to law enforcement, pursuant to a valid search warrant, are: SMS, photos, videos, contacts, audio recording, and call history. Apple cannot provide: email, calendar entries, or any third-party App data."

If you read Apple's document [apple.com], they make it pretty clear in Section I that they're talking about extracting data from an iOS 4 or later iOS device that is passcode locked and in good working order. Besides which, not all of that data goes through iCloud (e.g. call history, audio recordings (unless you're backing them up), etc.).

Moreover, they've detailed the security of their iCloud offerings before, and what I noticed immediately is that while SMS texts can be extracted according to this document, iMessages are not listed, suggesting this isn't just an iCloud backdoor. Likewise, if they were able to access your iCloud stuff, they'd have access to a whole lot more, such as calendar events, e-mails, and any third-party data you had backed up using iCloud Backup.

Likewise, if they were able to access your iCloud stuff, they'd have access to a whole lot more, such as calendar events, e-mails, and any third-party data you had backed up using iCloud Backup.

From the source you linked:

iii. Email Content
iCloud only stores the email a user has elected to maintain in the account while the customer’s account remains active. Apple is unable to produce deleted content. Apple will produce customer content, as it exists in the customer’s mailbox in response to a search warrant.

iv. Other iCloud Content. PhotoStream, Docs, Contacts, Calendars, Bookmarks, iOS Device Backups
iCloud only stores the content for these services that the customer has elected to maintain in the account while the customer’s account remains active. Apple does not retain deleted content once it is cleared from Apple’s servers. Apple will produce customer content in these categories only in response to a valid search warrant.

How about Google, Hotmail, Facebook etc. passwords from Safari's settings? That's what law enforcement always looks for. That is cop gold right there. Who gives a crap about the data in the calendar app? That's all hosted on Apple's cloud anyway.

Wouldn't law enforcement just require the account usernames and then get the data from the respective service providers with a warrant? Sounds a bit unprofessional that they would go logging in to the accounts by themselves.

Wouldn't law enforcement just require the account usernames and then get the data from the respective service providers with a warrant? Sounds a bit unprofessional that they would go logging in to the accounts by themselves.

You've never been in court have you?

The primary legal argument in most cases in this country is: "Well, we're the police, we can do that. Constitution? Sure, you could appeal this, but the fine's $500 and your legal fees on appeal would be at least $5000... tell you what, pay the fine and we expunge the charges in 6 months!"

Yes, this has happened to me. I even got a ticket once for "unlawful use of horn" when I honked at a guy that almost hit me. But he was the cop's uncle (the cop told me this). He then proceeded to tell me, "Sure, this would get thrown out of court, but I get paid to go to court. You don't. I can give you a ticket every day you drive through here. How long would you keep your job? Now how about you stop being a jerk and honking at old people?" I called the police station later and spoke with the guy's boss, who laughed at me and said his officer told him, "Some jerk will be calling you..."

The police only follow proper procedure and what-not when they think the case is big enough that it'll matter, i.e. you're going to jail and they know you'll fight tooth and nail. Otherwise they just search illegally, bully and batter people, contaminate evidence (if they even bother to collect any) and then slap a fine on you. If the fines aren't over a couple of thousand and there's no jail involved, it's almost always in your financial best interest to just roll over and take it. In the few cases where the person doesn't? They don't care; 100 other people got arrested on the same day.

And here is the question: is it accessing the phone, in which case a remote wipe can protect the citizen from a warrant, or is it accessing the 'cloud', in which case the courts have ruled that because you have shared the information with a third party, i.e. your service provider, the privacy of the data is much more limited?

I don't have as much issue with this kind of police state antics as some other things, because these kinds of communications just don't seem to have as much expectation of privacy. Lik

They require the phone to be shipped to Cupertino, in good working order. So I'm guessing that if you execute a remote wipe (which, on an encrypted iPhone constitutes the disk controller basically forgetting the encryption key), that law enforcement is fucked. And, because we're not talking about a magnetic medium, there's very little forensic recovery possible.
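That "forgetting the encryption key" mechanic is usually called crypto-erase, and it can be sketched in a few lines. This is a toy model, not Apple's implementation: the SHA-256 keystream below stands in for the hardware AES engine, and the key value is made up.

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    # Toy CTR-style keystream built from SHA-256. Illustration only;
    # a real device uses AES in a dedicated hardware engine.
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    # XOR with the keystream; encryption and decryption are the same op.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = b"\x13" * 32                      # the effaceable storage key
plaintext = b"texts, contacts, photos"
ciphertext = xor_crypt(key, plaintext)  # what actually sits in flash

assert xor_crypt(key, ciphertext) == plaintext  # key present: readable

# "Remote wipe": destroy the key. The flash contents are untouched,
# but without the key the ciphertext is just noise.
key = None
```

The point is that wiping doesn't need to touch the data at all; destroying a 32-byte key is instant and leaves nothing for forensics to recover.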

How about Google, Hotmail, Facebook etc. passwords from Safari's settings? That's what law enforcement always looks for. That is cop gold right there.

No, that is prosecutor cyanide. Cops do not generally log in with the user's credentials, because it poisons the evidence gained from that site. Any competent defense attorney could get the subsequent evidence found that way thrown out almost immediately ("So, officer, you logged in as the user and acted on his behalf in the website? How do we know that you and your cohorts didn't plant the evidence yourself? Tainted evidence, yerhonor!")

Easier to get a warrant and have the provider give you the data. That way you can have a valid chain of custody, proof that there was no impersonation by cops or prosecutor, and absolutely no chance of any valid claim questioning the veracity and integrity of the evidence found. Hell, even in those few cases where a user/pass is used, both prosecution and defense attorneys are present during its use (and depending on locale, a clerk of the court) - the defense (and clerk) are there to keep 'em honest.

How much is Threatpost paying timothy to drive up their traffic with these half-assed stories?

The summary fails to mention that the phone must be in their possession, and that both the phone and the search warrant must be delivered to Apple's headquarters, which is the only place Apple will perform the extraction.

If anything, I applaud Apple for both publicly disclosing their policy for dealing with law enforcement and requiring a search warrant with more detail than "suspect's phone". They require the model number, phone number, serial or IMEI number, and FCC ID number.

"The device’s unique ID (UID) and a device group ID (GID) are AES 256-bit keys fused into the application processor during manufacturing. No software or firmware can read them directly; they can see only the results of encryption or decryption operations performed using them. The UID is unique to each device and is not recorded by Apple or any of its suppliers. The GID is common to all processors in a class of devices (for example, all devi

Going a few sentences up: "Every iOS device has a dedicated AES 256 crypto engine built into the DMA path
between the flash storage and main system memory." So all one needs to do to bypass that encryption layer is access the data through the DMA path (which I'd imagine to be a set of copper lines on the PCB). So no specialized equipment that interacts physically with the silicon necessary. And note that the user's passcode does not come into play at this level. More info here [anthonyvance.com].

But you'd have to prod that data into passing across that line, though. That's probably why they can only access the data that is basically already visible through the locked front screen (messages, photos).

Either way, if my phone were confiscated for whatever reason, the first thing that would happen is a remote wipe - basically a deadman's switch, if my phone doesn't check into a server every 12 hours, it wipes. Backups are also encrypted and can be restored in less than 5 minutes (there is no data locally that isn't

"iMessage" is a message transport. The app is "Messages". The document from Apple specifically says "SMS": it does not mention either Messages or iMessage. While it's possible that Apple leaves iMessages unencrypted on the device, it would be surprising given how much trouble they go through to protect them in transit. So while this document doesn't explicitly say iMessages are safe, it also doesn't say they're vulnerable.

- Apple cannot monitor FaceTime or iMessage conversations since they are end-to-end encrypted

- Apple cannot provide third-party app data that is encrypted since the files are encrypted with the user's passcode.

- It appears if the user does a remote wipe before law enforcement can get a warrant and ship the phone to Apple (or fly it there), then there is nothing that can be done. I wonder if they power up the device in a Faraday cage so it can't receive the remote wipe signal? I would guess no, because most people aren't smart enough to do an immediate wipe.

- We already knew the only trick they have as far as encrypted files goes is a custom firmware that bypasses the max attempt auto-erase and rate limit feature, so it can attempt to brute-force passcodes quickly. However it requires the attempt be made on-device, since the keys are stored in the secure storage with no facility to get them off-device. So even a moderately complex passcode is effectively unbreakable, let alone a good strong password.

Questionable:

- user generated active files (this is what SMS/call logs/photos/etc are listed under). Normally if a device is powered off and rebooted, I was under the impression that these things were not available because the files are encrypted. It seems that iMessage is at least encrypted here, but I would be curious to find out what the situation is. Everything except photos, videos, and recordings is a moot point because you can get stuff like SMS history and call logs from the carrier anyway so those are the only ones I'd be concerned about.

There are some definite good points here - Apple has chosen not to build themselves backdoors or workarounds, presumably because they can't be ordered to disclose information they don't have access to... same reason they built iMessage the way they did. A court would have to order them to refactor their software before it could order them to intercept messages, and at least in the US there is no precedent or law that can compel them to do so.

However I would expect the "user generated active files" to be encrypted after a device reboot until the passcode is entered. If that is not the case, Apple should fix it pronto.

I would also expect Apple to refactor the storage of those things to be segmented, given the NSA revelations and increasingly authoritarian behavior of law enforcement; for example, photos pending background upload could be kept unencrypted, but once uploaded they should be rewritten as encrypted so they require the passcode to access. They already have the ephemeral key tech and per-file key support so you can generate a key for the unencrypted file while the device is unlocked, then toss the passcode key when the device locks and only hold onto the file key until the upload is finished, then toss it. Thus no risk to the main key but you can still encrypt the file in the background.
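The segmentation proposed above amounts to per-file keys wrapped by a passcode-derived class key, which is essentially what iOS Data Protection's per-file key support already provides. A toy sketch, where the XOR "wrap" stands in for a real AES key wrap (RFC 3394) and all key values are invented:

```python
import os
import hashlib

def wrap(kek: bytes, key: bytes) -> bytes:
    # Toy key wrap: XOR against a hash of the wrapping key.
    # Real Data Protection uses AES key wrap, not this.
    pad = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(key, pad))

unwrap = wrap  # XOR is its own inverse

class_key = hashlib.sha256(b"derived from passcode + UID").digest()
file_key = os.urandom(32)            # fresh per-file key
wrapped = wrap(class_key, file_key)  # stored in the file's metadata

# Device locks: the passcode-derived class_key is discarded, so the
# wrapped blob can no longer be opened. A background uploader could
# keep the raw file_key until the transfer finishes, then drop it too.
assert unwrap(class_key, wrapped) == file_key
```

That matches the comment's scheme: the main (passcode) key can be tossed at lock time while a single file key survives only as long as the upload needs it.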

I won't bother discussing Android phones - they are almost all trivial to break and access all the user's data, when people like Samsung aren't coding back doors directly into the firmware.

I do not claim to know details... but as you mentioned, a remote wipe won't work on a phone that is powered off, and there is a thing known as a Faraday cage that should block signals once the time comes to power on the device and take evidence off it.

If at this point people are still surprised that this is possible then they are just naïve. Privacy in public forums (internet being the biggest forum of all) is not possible in this current age. Other than my personal information I don't care what people know or get from me. Some people have a dark past and don't want information to leak but I honestly have nothing to hide so I don't care.

Think of it this way: We are all Truman in the Truman show. The public is watching and so are the officials. Crook

No, think of it this way: You don't understand what is and is not a "public forum". The "texts, contacts, photos and videos, call history and audio recordings" stored on your personal phone are not accessible in a "public forum", and Apple is somehow (allegedly) pulling these things from your device remotely (heaven knows why the security model even allows this to happen) at the behest of law enforcement.

Other than my personal information I don't care what people know or get from me.

What if people knew you were an idiot? Congrats, you just displayed it in a public forum.

No, think of it this way: You don't understand what is and is not a "public forum". The "texts, contacts, photos and videos, call history and audio recordings" stored on your personal phone are not accessible in a "public forum", and Apple is somehow (allegedly) pulling these things from your device remotely (heaven knows why the security model even allows this to happen) at the behest of law enforcement.

The whole thing and how it works has been well-documented for a long time.

First, an iOS device's flash storage is always encrypted. The encryption is basically unbreakable. But obviously, the iPhone can still read it. That's because you enter your passcode, and that passcode is used to unlock the data.

The bit of code where you enter your passcode is written and signed by Apple. Only code that is cryptographically signed by Apple is capable of checking a passcode and with the right passcode giving acce

what hope do you have against the government with all their money and resources?

Given that the app you mention and Apple's list of what they can extract amount to the same thing, it's probable the government can also access the same things. Basically, anything that's not encrypted on the device or backup can be accessed by anyone (with physical access). Things that are encrypted can't be, even by people working for scary three-letter acronyms.

I had someone give me an iPhone 4 last year where a child playing with the phone had accidentally deleted all the pictures. My task was to recover all the deleted pictures. It took me a few hours, mainly because I had never done anything with an iPhone before. The process that worked involved booting the phone with a different bootloader and breaking the encryption key. Most of the information and software to accomplish this can be found with a few minutes of searching.

I posted this elsewhere in the thread, but this describes the iOS security mechanisms in excruciating detail, including the full-disk encryption, etc. etc. Note that it does vary by hardware platform (3GS, 4, 4S, 5, 5S) and iOS version, so this is the "new hotness". There's a lot of incorrect information in the comments.

So, let me understand your point better. You're saying that you believe what Apple publishes on its own security mechanism?

Don't you?

Remember, we are Apple's customers. We are the people paying Apple. How much money do you think does Apple make by supporting law enforcement? I'd say $0 if they are lucky, but quite possibly a loss. What interest does Apple have in reading your data or making it available to someone? Apple's biggest source of profit is selling phones, followed by selling tablets, followed by selling computers. Just like Google, Apple's interested in keeping their customers happy so they keep paying money. Unli

Sorry, I was too brief. Apple doesn't include a file manager because they want to try to control the experience. (Bad enough.)
MS doesn't include a file manager because they can't do it without totally destroying security on the device. At least that's their official story. I think the real answer is much worse.

If passcode-protected whole phone encryption is enabled, no one should be able to access that without the key. I guess they know how it works more than I do. They've even redefined encryption. It's "encrypted" just like everything else these days. I guess it's still technically encrypted even if everyone has a key.

My understanding has been that they are capable of bypassing the OS restriction on unsuccessful login attempts before the phone's data is wiped. Since most people just use a 4-digit pin, it wouldn't take very long to brute force even if they don't know what the salt is.
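The back-of-envelope numbers bear that out, assuming the roughly 80 ms per on-device guess that Apple has cited for its passcode key derivation (the constant here is an assumption, and real devices add escalating delays and the auto-erase counter on top unless those are bypassed):

```python
PER_GUESS_S = 0.08  # ~80 ms per attempt, forced by on-device key derivation

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to exhaust the full keyspace at one guess per 80 ms,
    assuming the delay/auto-erase counters have been bypassed."""
    return alphabet_size ** length * PER_GUESS_S

print(f"4-digit PIN:        {worst_case_seconds(10, 4) / 60:.1f} minutes")
print(f"6-digit PIN:        {worst_case_seconds(10, 6) / 3600:.1f} hours")
print(f"6 chars, a-z + 0-9: {worst_case_seconds(36, 6) / 86400:.1f} days")
```

A 4-digit PIN falls in under a quarter of an hour even at that throttled rate, which is why the thread's point about "even a moderately complex passcode" matters so much.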

there's no back door. Apple's iCloud syncs some information across all devices. For example, if I take a photo with my iPhone it automatically syncs with my iPad and my MacBook. Obviously the photo must be uploaded from the phone and live on an Apple server somewhere, so it's vulnerable to subpoena.

in other news, Apple will begin notifying users of subpoena requests LINK [macrumors.com]

Technically, there *is* a backdoor in the sense that Apple signs the ramdisk with their private key. As such, should they build and sign a "data recovery ramdisk" with their private key and supply said software to Law Enforcement (such as when subpoenaed), then one can boot to DFU, load the "data recovery ramdisk", mount the phone as read-only flash that the agency can copy data from it.

Any entity with the private keys controls what happens to the data on the device.

They don't supply shit to law enforcement - their policy [apple.com] says that the device has to be shipped to Cupertino in good working order, where they will do the data extraction only with a proper search warrant or court order. The data is then provided on external media:

Specifically, the user generated active files on an iOS device that are contained in Apple’s native apps and for which the data is not encrypted using the passcode (“user generated active files”), can be extracted and provided to law enforcement on external media. Apple can perform this data extraction process on iOS devices running iOS 4 or more recent versions of iOS. Please note the only categories of user generated active files that can be provided to law enforcement, pursuant to a valid search warrant, are: SMS, photos, videos, contacts, audio recording, and call history. Apple cannot provide: email, calendar entries, or any third-party App data.

See section I of the linked document, entitled "Extracting Data from Passcode Locked iOS Devices".

All phones probably use the same salt, so it's a backdoor. It also means that someone out there will find that backdoor.

The lack of thought... What happens when the passcode screen comes up and you type in your passcode 1234? The software takes your passcode, 1234, and no other input that isn't directly available to the passcode software, and unlocks your phone. A police officer taking your locked phone takes five seconds to type a passcode, and your phone gets erased after ten attempts, because that's what Apple's passcode software does.

Apple can replace the passcode software. (Nobody else can, because only software cryptographically signed by Apple will run in its place.)
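The erase-after-ten-failures bookkeeping described above lives in software, which is exactly why signed replacement software could skip it. A toy model of that lock-screen policy (class name, delay values, and the threshold default are all illustrative, not Apple's actual numbers):

```python
class PasscodeGate:
    """Toy model of the lock screen's failed-attempt policy:
    escalating delays, then key destruction ('Erase Data') after
    ten failures."""

    DELAYS = {5: 60, 6: 300, 7: 300, 8: 900, 9: 3600}  # seconds, illustrative

    def __init__(self, passcode: str, erase_after: int = 10):
        self._passcode = passcode
        self.failures = 0
        self.erase_after = erase_after
        self.wiped = False

    def attempt(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("keys destroyed; nothing left to unlock")
        if guess == self._passcode:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.erase_after:
            self.wiped = True  # discard the encryption keys
        return False

    def delay_before_next_attempt(self) -> int:
        return self.DELAYS.get(self.failures, 0)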

If passcode-protected whole phone encryption is enabled, no one should be able to access that without the key. I guess they know how it works more than I do. They've even redefined encryption. It's "encrypted" just like everything else these days. I guess it's still technically encrypted even if everyone has a key.

Not everything is encrypted. According to the guidelines:

Specifically, the user generated active files on an iOS device that are contained in Apple’s native apps and for which the data is not encrypted using the passcode (“user generated active files”), can be extracted and provided to law enforcement on external media.

So, data can only be extracted if it is not encrypted. Sounds reasonable. Of course it would be better if everything was encrypted.

Every iOS device has a dedicated AES 256-bit crypto engine built in that is used to encrypt all data on the device at all times. In addition, the iOS Cryptographic Modules have been granted FIPS 140-2 compliance by the U.S. federal government on devices running iOS 6.

MS, on the other hand, really doesn't know how to build a file manager for their phone, so they gave up.

I'm honestly surprised when someone on MSDN knows the precise reason something works or doesn't; their own code probably looks like muck to them, too. Keep going through these exercises of "try this..."

OT - I'm not surprised. Is anyone surprised? Apple is the private-sector equivalent of the NSA.

The news is that Apple has received enough LEA requests for information that they've put together guidelines as a pre-emptive measure against being bothered about things they can't do.

I suppose we could be heartened that it specifically states upon receiving a warrant thus-and-such are available? Until a three-letter agency gives them a Sekrit Not-A-Warrant Order requiring the information. And that, Government, is the whirlwind you reap when you play fast and loose with the Constitution - there should be no trust left.

The page states that they can only access information which is not encrypted, and is "active", whatever that means. Reading between the lines, it seems they can get at information that's currently in RAM.

The story here is that Apple can unlock and access the files on an unencrypted iPhone. That shouldn't come as a surprise to anyone. You can do that without Apple's help, and you can do it to unencrypted Android phones, too.

I haven't actually disassembled an iPhone to see if it has an exposed JTAG header. I've connected to a lot of other consumer devices with JTAG, though. It's extremely common to disable JTAG entirely on the devices that are sold to consumers (though the header and traces are still there, they just don't do anything). Most devices where it does work only talk on JTAG if the device powers up with something connected to the header -- which eliminates using it for RAM access for forensic purposes. Lots of densel

If there were a master key, they would be able to get the whole system, and also it would be trivial for someone to find/leak that key and every single device would be at risk. Also, having encryption with 2 simultaneous private keys is impossible if you don't have access to both keys at time of encryption (and hardcoding a key in software would defeat the purpose of the encryption altogether). The device self-destructs when attempting brute forces, so that's not it either. I'd say they can access 'some' d

The AC nailed it; this is an utter non-story. Last time I checked, locking an iPhone does not enable full-disk encryption. Raise your hand if you thought the iPhone contains some magical Steve Jobs fart that would prevent someone with hardware access (leave alone Apple with hardware access!) from ripping the unencrypted data (which, in a default setup, is essentially everything except [luxsci.com] your e-mail [zdziarski.com]) from the flash chips. And yes, hardware access is necessary even if it isn't explicitly stated in the summary. Anyhow, those that did raise their hands earlier, please hand in your geek card and don't let the door hit you in the ass on the way out.

I was thinking about the FROST attack against Android devices. Sounds like something similar here - lower the temperature enough to get the phone to reveal its encryption key in RAM, then just read the key off the RAM chips. Now you have the key to decrypt all of that lovely cloud data yon LEO has been after.

Yeah, and if you skim through that document for a few minutes, it becomes clear that the encryption is not applied to the whole disk, and that only apps that use the Apple Data Encryption API benefit from it. The only app that does this in a stock configuration is the e-mail one, so I stand by what I wrote. More about this here [slashdot.org] and in the 2 links in my previous post.

Okay, I had a closer look at p. 8-9, and together with information from yet another source [anthonyvance.com], I have to concede it's slightly more complicated than stuff not being encrypted at all and "ripping directly from the flash chips" being possible. Yes, the whole "disk" is behind one layer of encryption, but all a person with hardware access has to do to get around this is access the flash chips through the regular DMA data path (which in many scenarios may be simpler than reading directly from the flash chips anyway).

Raise your hand if you thought the iPhone contains some magical Steve Jobs fart that would prevent someone with hardware access (leave alone Apple with hardware access!) from ripping the unencrypted data (which, in a default setup, is essentially everything except your e-mail) from the flash chips.

*RAISES HAND*

From iOS 4 onwards, all disk data is encrypted if you have set a passcode. Hardware access to the flash chips won't help you.

And the only people that don't set a passcode are people that don't care about security. Without a passcode, law enforcement doesn't need Apple's help. They just open the app and read the data.

And yes, hardware access is necessary even if it isn't explicitly stated in the summary. Anyhow, those that did raise their hands earlier, please hand in your geek card and don't let the door hit you in the ass on the way out.

Either you are badly misstating what you believe, or you already lost your geek card.

Yeah, that's what you'd think if you were to skip the fine print. In truth, the "disk encryption" in iOS 4 is not full disk encryption. An app has to specifically request for its data to be encrypted through the Apple Data Encryption API, and of the default apps, only the e-mail one does that. More details in the two links in my last post (which date from the first half of 2013 and are specifically talking about iOS 4). I assume they did this for performance and battery life reasons.

Hmm. We're both right in a way. This is why I said about maybe you were misstating what you believe.

All files are encrypted. You know what a remote disk wipe does on iOS? It deletes the encryption key(s), nothing more. It doesn't delete the data. It doesn't have to because without keys, the data might as well be random bits.

What's causing you to be mistaken is there are different categories of file protection on different files. One is called "No Protection", but it isn't no encryption.

I finally got to the bottom of it [slashdot.org]. We were indeed both right in a way. There are two layers of encryption, one that is always on, and a second one that is only engaged through the Apple Data Encryption API. However, for the one that is always on, the decryption is also always on (without the user needing to enter their passcode), so it might just as well not be there (except for the remote disk wipe feature). There's nothing a hardware hacker needs to do to bypass the always-on decryption, so from that point of view it might as well not be there.

Blackberry's BBM message facility is the most secure in the business, which is why Blackberrys are criminals' first choice of phone. I'm not just saying that: the London looting "riots" of a few years ago were organised by criminal gangs, and they used BBM to do it.

Apple's pretty secure though. If you want to see a real sham, look to Android - remove the flash storage from most Androids and you have all the user data right there, unencrypted. Users have to take active steps to encrypt stuff. And how many do that?

Blackberry... wasn't that the company that sends all your mail and everything you ever communicate through their servers?

You don't understand how blackberries work.

Yes, they send your data through their servers, in the same way that your data goes through your cell phone company.

BUT, with a blackberry enterprise server, Blackberry does NOT have the decryption keys. That is the relevant point - even if Blackberry wants to hand over information to law enforcement, Blackberry isn't able to decrypt the data.

Wasn't there a story a couple of years ago that Blackberry DID have backdoors to both BES and their own system, and shared them with not just the US but also Indian and other governments around the world?

Neither Blackberry nor any other corporation is to be trusted. As long as your security is closed source, or you have no control over it, it is to be seen as compromised. Use open source security on your OWN systems; that's the only way to be halfway sure that there are no immediate backdoors.

It's been known for a while that their "FileVault" has a corporate key (allegedly for employees, but wouldn't it work for anyone?) to unlock it.

Oh my god. When you turn FileVault on, it displays a 20 digit hex string which you can write on a piece of paper, hide in your cupboard, and use to decrypt the hard drive if you forgot the password. Alternatively, in an enterprise setting, where your Mac is under company control, that same 20 digit hex string can be sent to your company, so they can decrypt your drive if you unexpectedly leave the company. And third alternative, you can enter three security questions + answers, and the same 20 digit hex string is stored with Apple, recoverable by answering them.

It's been known for a while that any enterprise-grade encryption software worth talking about can do that. It's called key escrow, and it's necessary to recover company data should the user leave / get fired / forget their password / etc.
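Key escrow boils down to wrapping the one key that actually encrypts the disk under more than one secret. A minimal sketch, with a toy hash-based KDF and XOR wrap standing in for the real PBKDF2/AES machinery; the password and recovery string are made up:

```python
import os
import hashlib

def kdf(secret: bytes) -> bytes:
    # Toy KDF; real implementations use PBKDF2/scrypt with a salt.
    return hashlib.sha256(secret).digest()

def wrap(kek: bytes, key: bytes) -> bytes:
    # Toy XOR wrap (self-inverse); FileVault-class products use AES.
    pad = hashlib.sha256(kek).digest()
    return bytes(a ^ b for a, b in zip(key, pad))

volume_key = os.urandom(32)  # the key that actually encrypts the disk

# The same volume key is wrapped twice: once under the user's password,
# once under an escrowed recovery key held by the company.
user_blob = wrap(kdf(b"hunter2"), volume_key)
recovery_blob = wrap(kdf(b"0C1D-2E3F-4A5B-6C7D-8E9F"), volume_key)

assert wrap(kdf(b"hunter2"), user_blob) == volume_key
assert wrap(kdf(b"0C1D-2E3F-4A5B-6C7D-8E9F"), recovery_blob) == volume_key
```

Either secret unwraps the same volume key independently, which is why escrow recovers data after a forgotten password without weakening the cipher itself.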

It takes more than that to wipe the data. Look at the specs for military secure communication equipment - you need a mechanism to actively destroy all data present in the event of any detected tampering. It'd be nice if that mechanism were proof against accidental activation, yet robust enough to prevent intentional intrusion.

Military grade technology will cost military grade bucks and will not be made generally available to the public. It will certainly not be made available for import/export on an

All my iDevices always automatically get any new app that I get on any one device. Means I can get an app in iTunes and my wife's phone will get it automatically, so I don't have to send her searching for it.

Since it's optional, they certainly have the ability to do so; it's up for debate whether they can override the choice you set on the device.

Well, as their actual policy states that the law enforcement agency must deliver the actual phone, in good working condition, with a search warrant or court order specifically stating the device's IMEI and FCC ID, to Cupertino in order to get data extracted, I'm guessing that they can't simply toss a data dumper on it.