Posted by samzenpus on Wednesday May 25, 2011 @04:21PM from the what-took-so-long? dept.

adeelarshad82 writes "Russian company ElcomSoft is claiming to have cracked the 256-bit hardware encryption Apple uses to protect the data on iOS 4 devices, and is offering software that allows anyone to do it. ElcomSoft can now gain full access to what is stored on a gadget such as the iPhone 4. This includes historical information such as geolocation data, browsing history, call history, text messages and emails, usernames, and passwords."

My pattern includes 6 dots. From each dot I have at least 5 choices (hint: you don't have to use the nearest dot; you can go over a dot to skip it).

So there are at least 15,625 (5^6) six-element pattern combinations. Actually there are more, because from some dots you have more than 5 choices (like the center dot, where you have 8). It turns out that this guy already did the math:
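The lower bound is easy to check; a quick sketch (the 5-choices-per-dot figure is the parent poster's assumption, not an exact model of Android's adjacency rules):

```python
# Parent's lower bound: at least 5 reachable dots from each position,
# 6 dots in the pattern => 5^6 combinations. The true count is higher,
# since some dots (e.g. the center) offer up to 8 legal next moves.
choices_per_dot = 5
pattern_length = 6
lower_bound = choices_per_dot ** pattern_length
print(lower_bound)  # 15625
```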

Which is exactly why you try it first. Apparently you aren't aware of the Gizmodo password frequency analysis, whose results surely repeat almost everywhere that doesn't absolutely require strong passwords to enable service.

Has anybody done a study on pattern unlocks and how frequently we humans pick the same patterns? I don't have a Droid, but just from watching the video posted above, my guess is that along with the "left to right G shape" there will probably be a lot of people doing something like a cross, a 4-dot box, etc. We humans tend to like certain shapes and go for those, as with tic-tac-toe.

So if anybody has a link comparing pattern guessing to password guessing, my bet is it would be interesting to see which...

I'd doubt the data is available to do such an analysis, and until touchscreen entry becomes common to access centralized software services it's likely to continue to be unavailable. With passwords, it's much easier to do as a result of the high number of compromised accounts when a large service like Gizmodo gets hit.

That said, I'd be inclined to agree with your guess that certain swipe patterns likely account for a very large percentage of devices that use that particular method of unlocking.

Dictionary attacks on passwords tend not to use traditional dictionaries. Rather, they use dictionaries of passwords that have been exposed via phishing attacks and then publicized.

All that has to happen is for someone using the same password as you to fall for a phishing attack and you will be vulnerable to dictionary attacks, even if your password looks something like: XHdHNP4S.

If that password has been exposed and is in the attackers password dictionary, you are vulnerable.

Yes! So it's a device design flaw, the encryption itself is pretty secure if used properly. I see their software also lists Blackberry. Better change my 4-digit password too! gulp.

Here is a great analogy of how strong the encryption is, if a secure password is used:

Imagine a computer the size of a grain of sand that can test keys against some encrypted data, and that can test a key in the time it takes light to cross it. Now consider a cluster of these computers, so many that they would cover the whole planet to a height of 1 meter. That cluster would crack a 128-bit key, on average, in 1,000 years.

If you want to brute-force a key, it literally takes a planetful of computers. And of course, there are always 256-bit keys, if you worry about the possibility that the government has a spare planet it wants to devote to key-cracking.

It seems like this would work on any phone, in principle. If you're using a 4-digit numeric password to protect your phone, any kind of phone, yeah, somebody's eventually going to crack it in a non-end-of-the-universe timeframe, if they get unattended access to it, and you don't remote-wipe it.

Use an alphanumeric password to protect your phone. It's got a ton of your stuff on it, so never leave it unattended for extended periods of time and never give it to people you don't trust. A cellphone is a very personal frob, and no amount of engineering is going to make it safe from hacking, modulo the sensitivity of the data contained therein. Even if you pick a 20-character, completely random password, nefarious folk can still dust the screen for fingerprints, or surreptitiously videotape you unlocking your phone...
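The gap between a 4-digit PIN and an alphanumeric password is just counting; a rough sketch (pure keyspace math, ignoring how people actually pick passwords):

```python
# Keyspace sizes behind the advice above (counting only, no timing claims).
numeric_4 = 10 ** 4        # 4-digit PIN: 10,000 possibilities
alnum_8 = 62 ** 8          # 8 chars of [a-zA-Z0-9]: ~2.2e14 possibilities

# An attacker who can exhaust a 4-digit PIN in minutes would need roughly
# 2e10 times as long to exhaust the 8-character alphanumeric space.
print(alnum_8 // numeric_4)
```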

If you're using a 4-digit numeric password to protect your phone, any kind of phone, yeah, somebody's eventually going to crack it in a non-end-of-the-universe timeframe, if they get unattended access to it, and you don't remote-wipe it.

Unless you limit the number of failed attempts (and then brick/erase the device), or have an increasing delay after each failed attempt.
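A sketch of what such an escalating-delay policy could look like (the schedule here is made up for illustration; it is not Apple's actual one):

```python
def lockout_seconds(failed_attempts: int) -> int:
    """Hypothetical policy: free retries at first, then doubling delays."""
    if failed_attempts < 5:
        return 0
    return 60 * 2 ** (failed_attempts - 5)  # 1 min, 2 min, 4 min, ...

# Ten failures in a row already cost about an hour of forced waiting,
# which makes brute-forcing 10,000 PINs through the UI impractical.
total_wait = sum(lockout_seconds(n) for n in range(1, 11))
print(total_wait // 60)  # total minutes of delay after 10 failures
```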

It seems like this would work on any phone, in principle. If you're using a 4-digit numeric password to protect your phone, any kind of phone, yeah, somebody's eventually going to crack it in a non-end-of-the-universe timeframe, if they get unattended access to it, and you don't remote-wipe it.

Well, on most phones (like Android ones) you don't need to go that far. The password is just for protecting against someone using the phone, but since the file system isn't encrypted at all on most phones, you can just dump the data and be done with it.

The application is called the ElcomSoft Phone Password Breaker and costs around $320 for the Professional edition.

So this is not going to be another way to get your own apps onto the iPhone without jailbreaking. Rather, it reduces your plausible deniability that someone has planted something on, or used, your iPhone for nefarious purposes without your knowledge to a $320 barrier plus a sufficient period of your not having possession of it (modulo the weakness of your passcode).

Remember, the answer to the question "Has this item ever left your sight?" is always "Of course it has." The question is to establish...

You can bet that US and other law enforcement have probably been given the keys already. After all, how else would those [unconstitutional] mobile phone searches of US citizens used during US border crossings be able to work so easily and efficiently?

From my reading of their FAQ, it seems that this tool can be used to decrypt the encrypted backup images that iTunes takes when syncing the phone, not the phones themselves.

Am I wrong? If it's the backup images, then I see the potential attack vector as slightly less serious as an iPhone is usually a lot easier to lose / have stolen from you than the machine you sync it with.

...security is already compromised. We've known this forever. This new method requires 40 minutes of physical access to the phone. Either your phone has already been stolen, in which case they have all the time in the world to try number codes until it opens up for them, or it's been taken by the police, in which case you can probably be compelled to provide the codes necessary to access the device. Either way, this doesn't change too much. And if either of those concerns you as being too risky, why were you using a mobile phone to keep sensitive information in the first place, instead of something designed specifically to hold confidential information?

Unless you encrypt your backups and forget your password, or your backups are stolen, it's pretty much pointless.

I really don't see the point in encrypting my backups because, well, if someone can get to my backups, they'd be far better off just taking the source data off my laptop.

Seriously, by the time someone can get to your backups, they have a larger, more important device at their fingertips... you know, the device that the iPhone got the data from in the first place. Just use the source.

I do off-site backups by rsyncing to a TB disk in the basement of a co-worker (and he does his off-site backups by rsyncing to a disk in my basement).

This gives us both reasonable security against possibilities like flooding, fire, burglars, or lightning strikes that could potentially destroy both my laptop and all in-house backups at the same time.

By using encfs for the backup, I preserve the property of only needing to sync changed files, but at the same time keep the backed-up data encrypted.

Looks like TFA didn't read TFA, or misunderstood it big time. All of the comments are also about their OLD TOOLS, which are related to brute-forcing and analysing the BACKUPS, and have nothing to do with this hardware encryption getting cracked. If you read the blog post, they say there is some data that's not included in the backup which you can access with the hardware encryption keys. They also say they don't want this ending up in the "wrong hands" and will only offer it to governments and such.

I have an iPhone, and several other phones (BlackBerry, Android, etc.)... And correct me if I'm wrong, but when I power on the devices they boot up, and then automatically start talking to the network and retrieving email, etc...

Surely then, even if the data stored on the phone's built-in flash is encrypted, the key to that encryption must also be on the phone somewhere in order for it to boot on its own; otherwise it would require the key to be entered in order to boot at all.

Apple should just offer the means to view your stuff on your iPhone in a regular format, in order to do easy backups: drag and drop from your device into a Windows folder, for more control over the file system. Heck, the only reason I would consider this tool is to make sure my backups are properly made; iTunes has to be the worst piece of crap software I have ever used. This whole thing with trying to manage your allowed devices vs. trying to limit who will replicate the data (if at...

Ahh, I love it when people with no clue repeat crap they found on the Internet.

Show me something that doesn't generate keys using an algorithm... I won't be holding my breath. Any good security system uses an algorithm for key generation, with a RANDOM mutator. Not all keys are created equal; some are known to be weak, and throwing those out is paramount. Users simply aren't worth shit at generating random keys, so you use an algorithm known to generate strong keys with a random mutator.

Okay, lemme see.... I have a password "hunter2". I also have supersecret porn I'd like to encrypt with AES. I'd like to use my human-rememberable password for the encrypted AES data.

Now, I challenge you to make a 256-bit key from "hunter2" without using an algorithm to generate it... And yes, I'd like to be able to decrypt it with the same password. And no, "hunter3" should *not* unlock it.

If you can do that, then I will admit that you *DO* know something us normal hobbyists don't.
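What the parent is describing is a key-derivation function. A minimal sketch using Python's standard library (PBKDF2 is one common choice; the salt and iteration count here are illustrative, not what any particular product uses):

```python
import hashlib

def derive_key(password: str, salt: bytes, iterations: int = 100_000) -> bytes:
    # Stretch a human-memorable password into a 256-bit symmetric key.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations, dklen=32)

salt = b"demo-salt"  # in practice: random per user, stored with the ciphertext
k1 = derive_key("hunter2", salt)
assert len(k1) == 32                        # 256 bits, suitable for AES-256
assert k1 == derive_key("hunter2", salt)    # same password -> same key
assert k1 != derive_key("hunter3", salt)    # "hunter3" does *not* unlock it
```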

So why doesn't the fantastic mathematically complex encryption ever work? Why should I trust https? Or any other encrypted transmission?

Encryption does work: the flaw is normally in the key handling.

There's a fundamental incompatibility between security and convenience: people encrypt the data on their phone with 256-bit AES using a password of 'password' and are surprised that it can be broken. Or they rely on the phone to encrypt their data with a key that is... stored on the phone.

Good encryption requires a good "key". Forget password, think passphrase.
Encryption is great when it's somebody intercepting your messages or data, but not so useful when they have access to an endpoint.
The effectiveness of a good lock is severely reduced if you can't remove the keys from it. Most hardware like this has a copy that can be gotten at by the diligent. It's how Blu-ray ended up losing its DRM.

They are liars. This tool just does a brute-force attack against a backup of the device; once the key is found, it can be used against the actual device. If you have a simple password this might work; if not, too bad for them.

The only lessons here are that you should always use long passwords, and that "security" companies are often one shade off of scammers. Even a simple phrase like "And its fleece was white as snow" makes a decent passphrase due to length; changing it to "And) its( fleece* was6 white5 as4 snow3" makes it even better and still easy to remember.
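The length argument can be made concrete with a naive character-level count (an upper bound, since real attackers model passphrases word by word; the character-set sizes are assumptions):

```python
import math

def naive_bits(charset_size: int, length: int) -> float:
    # Upper-bound entropy if every character were chosen uniformly.
    return length * math.log2(charset_size)

phrase = "And its fleece was white as snow"
print(round(naive_bits(95, 8)))            # 8 printable-ASCII chars: ~53 bits
print(round(naive_bits(27, len(phrase))))  # lowercase+space model: ~152 bits
```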

Even simple phrases like "And its fleece was white as snow" makes a decent passphrase due to length, changing it to "And) its( fleece* was6 white5 as4 snow3" makes it even better and still easy to remember.

And you're going to type that in every time you use your phone?

Coming up with a good passphrase is much easier than convincing people to go to the trouble of using one.

Well, the flaws are always implementation details. Implementation details are usually botched in mobile devices, for the convenience of the designer and (perhaps) because of hardware limitations, and in web applications, for the sake of interoperability and usability. And stupidity. Don't forget the stupid.

But, if you use a known good implementation (as much as it can be known, but pretty good with some FOSS) yourself (not implemented by a web service, but by you on your machine), then it's much less likely to be vulnerable, because the convenient and intentional weaknesses tend to be eliminated.

The encryption itself is solid. What fails most of the time is the specific implementation. Say, for example, I made the choice to encrypt my hard drive but didn't use an already-baked system like Ubuntu's home-directory encryption. Instead I decided to do it by hand and code my own pre-boot initramfs to automatically handle decryption by hashing some hardware-specific identifier from the BIOS. Except that, since I'm not a security expert, I made some foolish coding error which allowed the hash to be intercepted or...

Rest assured that if HTTPS's implementation of encryption were cracked, it'd be news, and you'd know (I assume).

SSL 2 has been cracked. Weak ciphers used in SSL 3 and later have been cracked. SSL renegotiation has been cracked. Root certificates owned by governments whose interests are not aligned with those of the United States and western Europe have been included in major web browsers' default repositories. And yes, they were all news.

Renegotiation was not "cracked". Renegotiation worked as intended - it is the software that used renegotiation that failed to view the two streams as separate connections, as it should!

Except that renegotiation was developed by the very same people at Netscape and for the same specific purpose that it got used for: changing crypto parameters and client certificate authentication after the HTTP request had been made.

Folks have a hell of a time understanding the difference between security and cryptography, and the misleading sensationalist headlines don't help.

Cryptography is merely the study of hiding and unhiding information. It doesn't secure information. Security is about securing information from unauthorized access. These guys attacked the security of the device, probably through the protocol or through insecure hardware.

If the crypto itself (probably AES-256) had been broken, the NSA would have had some...

Also, nobody speaks of 256-bit RSA in this century; the recommended key size for use with a 128-bit block cipher was 3072 bits when I last checked.

You only need a key size that big if you're doing asymmetric keys -- see Schneier on ridiculous key lengths [schneier.com]. The encryption on these phones is symmetric, and the reason it's so easy to crack is that the 256-bit keys are in fact selected from a very restricted space: they just take four numeric digits from the phone entry and then maybe hash them to get better bit coverage.

Read the part of my post that you quoted, and you'll see that I did not say anything that contradicts what you or Schneier said. When using a 128 bit block cipher, the recommended size for your asymmetric keys is 3072 bits for non-ECC algorithms (e.g. RSA). That is not a ridiculously long key size, given the state of the art attacks on the RSA problem, nor does it exclude the smaller symmetric key size. I was responding to a statement about "256 bit RSA," which is ridiculously short.
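For reference, the comparable-strength figures these posts are quoting come from NIST SP 800-57 (symmetric security level mapped to RSA modulus size):

```python
# NIST SP 800-57 comparable strengths: symmetric bits -> RSA modulus bits.
comparable_rsa_bits = {112: 2048, 128: 3072, 192: 7680, 256: 15360}

# The "3072 bits" figure above is the RSA size matching 128-bit symmetric
# security; "256-bit RSA" would be trivially factorable today.
print(comparable_rsa_bits[128])  # 3072
```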

So why doesn't the fantastic mathematically complex encryption ever work? Why should I trust https? Or any other encrypted transmission?

Because encryption cannot get around someone having physical access to the device. Even being on the same network (subnet) makes things measurably easier, as most OSes don't do anything about a brute-force attack.

Once you've got physical access, you can easily use brute force to crack encryption; your only limitation is time. Reading the article, they have physical access to the devices they are cracking. Considering that ElcomSoft makes tools for forensics, not attackers, it makes sense that you'd have physical access.

You don't do drive encryption with asymmetric encryption, not if you actually want to use your data at any reasonable rate.

You generate a large key for symmetric encryption, then encrypt that key using asymmetric encryption.

Browsers, for instance, only use RSA for the initial key exchange, and then fall back to AES or whatever is supported by both ends. Your HTTPS sessions use RSA for about 80 bytes of data exchange before the web server actually starts communicating with the client; everything after that, starting with your GET / request, goes over the symmetric cipher.
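The split the parent describes (asymmetric crypto only to wrap a small symmetric key) can be sketched with textbook RSA on toy numbers. This is for illustration only; it is not secure, and real TLS uses padded RSA or ECDHE:

```python
# Toy RSA keypair (tiny primes, demonstration only).
p, q = 61, 53
n, e = p * q, 17
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (Python 3.8+ modular inverse)

session_key = 42                     # stands in for a random AES session key
wrapped = pow(session_key, e, n)     # the small, slow asymmetric step
unwrapped = pow(wrapped, d, n)       # receiver recovers the session key
assert unwrapped == session_key
# From here on, both ends would use the fast symmetric cipher (e.g. AES)
# for the bulk of the traffic, including the GET / request.
```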

It's a case of "damned if they do and damned if they don't" for Apple currently.

This is precisely what happens when you turn yourself into an "evil" company, like Sony did and Apple is a long way through the process of doing: you attract the hacker community, and there will be thousands of people simultaneously trying to shame that company.

It's "infinite monkeys & infinite typewriters" syndrome: the majority of hackers will have no success breaking into the systems or devices, but because there are *THAT MANY* doing it *ALL OF THE TIME*, eventually some will be successful.

As someone who works in security, I can tell you honestly that no company reveals successful or failed hack attempts on their systems unless they really have to - in the case of the Sony credit cards, they *HAD* to because of the potential fraud on those cards that could take place.

So you can pretty much guarantee that Sony, Apple and other "Evilcorps" are being pounded & hacked all of the time, but they hush it all up as best they can.

Shut up with the Evil Company scare tactics already. They are a company; they are trying to make money, serve customers AND protect their brand. Put those all in the same bowl, mix well, and then tell me: aren't some compromises necessary?

And I would also assert that not only "Evilcorps" get hacked: charities, squeaky-clean companies, and little saintly grey-haired grandmas get hacked too. Apple/Sony/etc. aren't hacked because they are evil; they are hacked because they exist at all.

Maybe not. But they were summoned to the US Senate [ethicalinvestigator.com] to answer questions on privacy concerns over what they track & why they track it unencrypted.

Google, who is responsible for Android, was also called to those hearings. Apple sent a vice-president in charge of software development. Google sent a lobbyist. Apple voluntarily has already taken steps, and has promised to take further steps [securityweek.com], to reduce both the amount of "tracking data", and to encrypt what data the user's phone does store. What has Google done/promised (I honestly don't know on that one)? But don't let facts available for nearly two months stop your rant.

Apple doesn't actively prohibit "rooting" of their devices.

I think you need to read the last 2 lines about possibly denying service on this page [apple.com].

Just because you select a list of reasons why Apple are not evil does not mean they are not evil in other ways.

He's still right. Apple is very good at protecting its business, and has a very clear vision of how things should work and an unbending will to see it through, but I can't see anything genuinely evil here. In fact, lots of things Apple does are rather considerate and cautious. iTunes has DRM, but you can still install the apps and music you bought on all the iPads and iPhones and iPods you may own. Same with...

Partially true. Apple did say this, and a federal court disagreed. Apple, however, didn't appeal the decision, and unlike many Android device manufacturers, has not done an end-run around that decision by putting "fuses" in their microcontrollers, signed bootloaders, etc.

So, it seems that Apple had one opinion, and the Feds had another, but in the end, Apple respected the process. It sure seems like those other manufacturers are simply taking disingenuous advantage of the fact that the lawsuit didn't name them specifically, and that Android users (and curiously, the EFF) seem to be uninterested in pursuing the issue. Wonder why? Could it be that the EFF has an anti-Apple bias? Nah, couldn't be!

Apple doesn't infest its products with an OS (Windows 7) that has DRM from the driver-level up.

Wow! Old story much?!? How long did you have to search for that one!?!

If you look at the article, you will see that that referred to the DEVELOPER PREVIEW PLATFORMS when Apple did the Intel switch. The TPM protection did NOT make it into the actual RELEASE CODE. Obviously, Apple had a pretty strong interest in keeping their very restricted beta OS protected. What did that actually end up being in the RELEASE code? A simple deletable file and a deletable kernel extension that says "Please Don't Steal OS X". Wow. Some DRM! This article [osxbook.com] refers to TPM on OS X as "The Myth That Won't Die." And of course, the very existence of Hackintoshes kinda belies strong TPM protection, doesn't it?

As I said: DISinformative. But his post is modded +5 Informative, and mine will be punish-downmodded, of course.

OS X is locked using DRM to prevent it from running in a virtual environment (which really sucks for developers)

It is? You mean the single file "/Volumes/Mac OS X Install DVD/System/Library/CoreServices/ServerVersion.plist" that has to exist before Parallels or VMware will consent to installing Mac OS X in a VM? The file that VirtualBox, last I checked, didn't care about at all? While Apple actively declares in their EULA that Mac OS X (the client version; the server version's EULA contains no such specific requirement) cannot be installed in a VM, the actual prevention is being done purely on the VM side, in my experience.

Apple tried very hard to prosecute people who develop and perform jailbreaks, but were shot down by the courts. They also issue DMCA takedown notices to any hacker community that has the balls to inform people how to install or virtualize OS X on a PC (which is a 100% pure DRM-style lockdown, as a modern Mac IS a high-spec PC), regardless of whether they want to buy the software.

First, Apple had one opinion, the EFF had another. The Feds sided with the EFF. However, since then, Apple hasn't tried to do an end-run around that decision, like many Android Device manufacturers. No "fuses" in microcontrollers. No encrypted bootloaders. In short, no REAL effort to stop Jailbreaking. In the end, Apple respected the adversarial process. Doesn't make them evil. At all. In fact, quite the opposite.

As far as their prohibition against virtualizing OS X: as Apple has stated many, many, many times, they are a HARDWARE company. That is unabashedly how they claim to make their money, not from the sale of OS X. So their prohibition against virtualizing OS X on non-Apple hardware is exactly in concert with their prohibition against installing it directly on non-Apple hardware. Their OS. Their rules. Doesn't make them evil, though. Just protecting their primary revenue stream, which is the sale of HARDWARE.

Besides, as pointed out in this article [redmondpie.com], it is quite possible to install OS X on, for example, VMware running under Windows 7, just as it is quite simple to install OS X on any number of hardware-compatible non-Apple computers. Apple says "Please". It does NOT run around like the Artist Now Again Known as Prince (or the widow of Frank Zappa!), filing DMCA takedown notices against Hackintosh websites, or against articles like the one above about installing OS X (illegally) on VMware Server on Windows 7, let alone prosecuting anyone who attempts to do so. Illegally.

> Apple doesn't embrace DRM every day, and in every way

OS X is locked using DRM to prevent it from running in a virtual environment (which really sucks for developers),

No it isn't. See above.

and the iPod is most certainly an attempt at a locked-in device that uses both DRM and proprietary formats to fend off competing MP3 players. Only competition forced them to abandon this strategy.

Anyone can CLAIM anything without proof. But I DO know that NOBODY forces Steve Jobs to do ANYTHING. And least of all, write an Open Letter decrying DRM, like this [engadget.com].

> Apple doesn't infest its products with an OS (Windows 7) that has DRM from the driver-level up.

Ehh... What do you mean? And how does that compare to Sony anyway???

> Now, let's compare the above to Sony.......

How does it compare to Sony? Sony COULD install Linux on its machines (Apple doesn't count, because they have created their own OS). But instead, they have embraced Vista, and then Windows 7. I can't find the article now, but both have so much DRM that, even after Vista shipped (which was LONG after there was a "driver stable" version available for developers), ATI couldn't even write a damned video card driver! I guess...

Are you actually DEFENDING Sony's rootkits HERE, on Slashdot?!? Wow! No wonder you posted AC!!!

No, he was just saying that you can't congratulate Apple on not doing something that they couldn't do anyway, in the same way that you couldn't sensibly praise Google for not using WMDs on Martian babies (yet).

Not really. They tried to create their own modern OS in the late '90s. Finally, after spending many millions on the project, they gave up and allowed themselves to be taken over by NeXT instead. Then they slapped their GUI-paint layer on top of UNIX like some fat chick going to the disco slaps on pasty makeup.

Since NeXTStep was already a GUI-based BSD/Mach "UNIX", what you REALLY mean is that Apple applied some cold-cream, wiped off the NeXT makeup, and THEN slapped on Mac makeup, LOL!

While I admit that that was the original plan, things didn't exactly work out that way [roughlydrafted.com]...
At least Apple was willing to accept that they couldn't realize their overly ambitious Rhapsody/Copland "Red Box, Blue Box, Yellow Box" OS. But even then, they were able to back-port much of that development into Mac OS 8 and 9, and even...

Uh... there is? At least on my iPhone there is. After the 5th attempt it makes you wait increasingly long between each further attempt. By the time you're up to 8 or 9 attempts you're waiting hours. On the 10th or 11th, it wipes the phone completely.