I'm a technology, privacy, and information security reporter and most recently the author of the book This Machine Kills Secrets, a chronicle of the history and future of information leaks, from the Pentagon Papers to WikiLeaks and beyond.
I've covered the hacker beat for Forbes since 2007, with frequent detours into digital miscellanea like switches, servers, supercomputers, search, e-books, online censorship, robots, and China. My favorite stories are the ones where non-fiction resembles science fiction. My favorite sources usually have the word "research" in their titles.
Since I joined Forbes, this job has taken me from an autonomous car race in the California desert all the way to Beijing, where I wrote the first English-language cover story on the Chinese search billionaire Robin Li for Forbes Asia. Black hats, white hats, cyborgs, cyberspies, idiot savants and even CEOs are welcome to email me at agreenberg (at) forbes.com. My PGP public key can be found here.

iPhone Security Bug Lets Innocent-Looking Apps Go Bad

Apple’s iPhones and iPads have remained malware-free thanks mostly to the company’s puritanical attitude toward its App Store: Nothing even vaguely sinful gets in, and nothing from outside the App Store gets downloaded to an iOS gadget. Now serial Mac hacker Charlie Miller has found a way to sneak a fully-evil app onto your phone or tablet, right under Apple’s nose.

At the SyScan conference in Taiwan next week, Miller plans to present a method that exploits a flaw in Apple’s restrictions on code signing on iOS devices, the security measure that allows only Apple-approved commands to run in an iPhone or iPad’s memory. Using his method (Miller has already planted a sleeper app in Apple’s App Store to demonstrate the trick), an app can phone home to a remote computer, download new unapproved commands onto the device, and execute them at will: stealing the user’s photos, reading contacts, making the phone vibrate or play sounds, or otherwise repurposing normal iOS app functions for malicious ends.
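The pattern described here, a benign-looking app polling a server and dispatching whatever commands come back, can be sketched generically. The sketch below is purely illustrative: the command names and handlers are hypothetical stand-ins for normal app capabilities, not Miller's actual code, and the "phone home" response is simulated with a local list.

```python
# Hypothetical handlers standing in for ordinary app capabilities that a
# remote operator could repurpose. Names are illustrative, not Miller's code.
def vibrate():
    return "vibrating device"

def read_contacts():
    return "reading contacts"

def list_photos():
    return "listing photos"

HANDLERS = {
    "vibrate": vibrate,
    "read_contacts": read_contacts,
    "list_photos": list_photos,
}

def dispatch(command):
    """Run one command string as if it had just arrived from the server."""
    handler = HANDLERS.get(command)
    return handler() if handler else f"unknown command: {command}"

# Simulated "phone home" response from the operator's server.
incoming = ["vibrate", "read_contacts"]
results = [dispatch(c) for c in incoming]
print(results)
```

The point of the sketch is that none of the individual capabilities look malicious in review; what Apple's approval process cannot see is that the choice and timing of which ones run is deferred to a server the reviewer never inspects.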

“Now you could have a program in the App Store like Angry Birds that can run new code on your phone that Apple never had a chance to check,” says Miller. “With this bug, you can’t be assured of anything you download from the App Store behaving nicely.”

Miller, a former NSA analyst who now works as a researcher with consultancy Accuvant, created a proof-of-concept app called Instastock to show the vulnerability. The simple program appears to merely list stock tickers, but also communicates with a server in Miller’s house in St. Louis, pulling down and executing whatever new commands he wants. In the video above, he demonstrates it reading an iPhone’s files and making the phone vibrate. Miller applied for Instastock’s inclusion in the App Store and Apple approved the booby-trapped app. (Perhaps the company ought to have been more suspicious of an application in Miller’s name, given that he has hacked practically every device Apple has made since 2007 or so.)

Update: A reader points out that Miller’s application has now been removed from the App Store.

I’ve reached out to Apple for comment but haven’t yet heard from the company. Given how seriously this exploit could affect the company’s crown jewels, expect a patch very soon.

Miller became suspicious of a possible flaw in the code signing of Apple’s mobile devices with the release of iOS 4.3 earlier this year. To increase the speed of the phone’s browser, Miller noticed, Apple allowed JavaScript code from the Web to run on a much deeper level in the device’s memory than it had in previous versions of the operating system. In fact, he realized, the browser’s speed increase had forced Apple to create an exception allowing the browser to run unapproved code in a region of the device’s memory, something that until then had been impossible. (Apple uses other security restrictions to prevent untrusted websites from using that exception to take control of the phone.)
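The exception Miller noticed concerns exactly this kind of memory permission. On a desktop OS the usual policy is W^X: a page of memory may be writable or executable, but not both at once, so code is staged in a writable page and then flipped to executable. iOS's code signing goes further, normally refusing to make unsigned bytes executable at all, except in the browser's JIT region. The following is a rough, purely illustrative sketch of that permission flip on a Linux or macOS desktop (this is not Miller's exploit, and the single placeholder byte stands in for downloaded code that is never actually run):

```python
import ctypes
import mmap

PAGE = mmap.PAGESIZE

# 1. Map an anonymous page read+write, but not executable: fine under W^X.
page = mmap.mmap(-1, PAGE, prot=mmap.PROT_READ | mmap.PROT_WRITE)
page.write(b"\xc3")  # placeholder byte standing in for downloaded code

# 2. Flip the page to read+execute via mprotect(). A desktop W^X policy
#    permits this transition; iOS code signing normally refuses to make
#    unsigned bytes executable at all, outside the browser's JIT region.
libc = ctypes.CDLL(None, use_errno=True)
addr = ctypes.addressof(ctypes.c_char.from_buffer(page))
ret = libc.mprotect(ctypes.c_void_p(addr), ctypes.c_size_t(PAGE),
                    mmap.PROT_READ | mmap.PROT_EXEC)
print("mprotect to r-x returned", ret)  # 0 on success
```

Seen this way, the JIT exception is a deliberate, narrowly scoped hole in that refusal; Miller's bug let an ordinary App Store app step through the same hole.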

The researcher soon dug up a bug that allowed him to expand that code-running exception to any application he’d like. “Apple runs all these checks to make sure only the browser can use the exception,” he says. “But in this one weird little corner case, it’s possible. And then you don’t have to worry about code-signing any more at all.”


Comments

This type of journalism and way of thinking both bother me. Let me explain.

These smartphones are computing devices with high speed processors and large amounts of memory and storage space. They’re equipped with network adapters that allow them to communicate with various wireless networks, much like the computers you have at your homes and offices. Simply put, they are computers. Even the operating systems are rooted in desktop and server operating systems.

At least since the beginning of the PC era, if not before, the owner of a computer has had the freedom to install software from anywhere he wanted. This is a trait of ownership – the freedom to do with a thing that which one wants. Most people don’t give this much thought, because the smartphone software market is a relatively new development, and many of the market’s users are too young to have experienced anything different. But, if the owner of a computer needs a piece of software, he can buy it at any local or online retailer that he wants. Not only does he have the freedom to purchase from any retailer he wants, he can also install any software he wants without getting permission from the manufacturer of the computer. What if HP required you to purchase software only from them, and censored the software you were allowed to own?

To put this in perspective for those who don’t really get it, imagine if each car manufacturer said you could only buy a certain brand of tire, and only from them. Would you settle for that restriction on your freedom? Of course not, so why do you accept it with your smartphone?

Now, some would say that many of these restrictions are there to please the network providers, e.g. Verizon, Sprint, et al. Let’s assume they’re right and that’s one of the reasons. That’s akin to your internet provider having the final say over what you can or cannot install on your home computer.

Others say it’s good because it reduces the prevalence of spyware and other malware. Aside from the fact that the capabilities of, and permissions granted to, smartphone apps exceed those of most PC spyware, the act of choosing this purported security over freedom is bad for everyone.

Simply put, this way of thinking is the opposite of freedom and this type of journalism helps manufacture consent for the removal of freedoms for computing device owners by framing the context in which people think about the subject. Once people accept the removal of freedom in one area, they’re much more likely to accept it in other areas, and are often even unaware that they’re doing so.

We should all do the right thing and stand up for all of our freedoms, and journalists should be aware and appreciative of this more than most.

Thanks for this comment, mindctrl. I agree that Apple’s restrictions are about control as well as security, and I’m glad that it’s possible to jailbreak an iPhone even though it opens the device to more security vulnerabilities.

I’m not sure what you’re criticizing about this story, though. The kind of exploit Miller is demonstrating, which pulls down new code that neither Apple nor the user can control, isn’t good for either freedom or security.

In fact, Apple users have the right to trade off freedom on their device for protection from malware. Miller is just showing that the tradeoff isn’t so clear. You can lose freedom and still be vulnerable. That’s important.

By the way, for anyone who really wants to retain their freedom instead of complete security, there’s always Android.

My criticism is that these scare tactics, for lack of a better phrase, are being used fairly frequently in the media to build consent for the removal of owner freedoms.

You even say that in the last sentence of your comment by saying, “for anyone who really wants to retain their freedom instead of complete security, there’s always Android.”

Your article refutes this claim of “complete security” by showing how the Apple product, with its controlled experience, isn’t “complete security.” It’s a false construct. Jailbreaking isn’t an answer either for those who want freedom, since Apple basically attempts to reclaim ownership of your experience with each software update. It’s all these attempts to frame the thinking around this market so that users will accept this removal of freedom that bother me.

I don’t like the direction mobile is taking the computing world, and the access most apps have to our information, location, data, etc. is tantamount to spyware. It’s worse than existing desktop computer software, yet it’s billed as more secure.

But I’m just one of the few dissenters who are even aware of all this…

I think it’s more about how you automatically dismiss an open system, like the one implemented in Android, as insecure.

Having an open system isn’t insecure. On Android you can always just use Amazon’s Appstore, which is filtered in the same way as iOS. So in an open environment you can have both. Android also has more advanced security features in place, like permissions for apps, so that you can be absolutely sure certain apps can’t access certain data.

Really, iOS is the only OS in history that can run third-party apps yet is locked down to one store. OS X is open, but does that automatically make it insecure? Is every other OS in the world insecure? Of course not, because a locked-down environment doesn’t improve security. Who knows what malware is already present in the App Store? There have been multiple accounts of apps in the iTunes App Store with hidden features that Apple has missed, including the flashlight app that doubled as a free tethering app as a secret feature.

The reason Apple locks iOS to one store has zero to do with security. It’s incorrect to assume Apple checks every app, every line of code; they only check that it does what it says and doesn’t compete with them. Assuming a locked-down OS is safer is incorrect, and in fact opening up the OS would allow the development of more security-focused apps to investigate these issues, resulting in better overall security.

So it’s very short-sighted and incorrect to suggest that iOS being locked down improves security; it’s unrelated because, as your article suggests, Apple can’t really prevent malware from entering the App Store. This means there is no security trade-off to an open system; in fact an open system may be safer, as there is no false premise of security.

I too am concerned every time the tides of computing seem to ebb toward, perhaps, despotic standards. And I’m not the first person to note the irony inherent in Apple’s use of communism (in the ’80s) as the enemy of computing. But I didn’t feel that the article implied that all OSes should be as despotic as Apple’s (or their goal, anyway). Yes, Andy replied that you can turn to Android if you value freedom over security, but, simply stated, he is correct. For the average user iOS really is more secure (much as I hate to say so). As an infosec professional I’ve had to concede that my beloved Android device cannot be safely included in our list of allowed devices, for now, at work (though this isn’t necessarily the case for all organizations).

Ultimately, the article seems only to note that iOS isn’t necessarily as locked down as some users might think. And why not note it? Isn’t the price of freedom eternal vigilance?

I personally have no issue with Apple embracing a more restrictive paradigm. Obviously many are happy with it as is, so what is there to say about it? But if we’re going to chastise the OS vendors, why not start with Google, for behaving the Apple way (i.e. during those updates you mentioned) but on a platform that really does purport to be about freedom?

In any case, for those who want more stability and standardization there’s iOS; for those who crave more freedom there are the alternatives. And vive la liberté.

Personally, I think Miller stepped over the line when he made the application available to end users on the Apple App Store. It is one thing to do research and notify the vendor that there is an issue. It is an entirely different thing to create a program that essentially has botnet or Trojan-like capabilities and intentionally release it to the public to prove a point. This activity may very well have violated United States Code Title 18, Section 1030. If so, he may face prosecution.

This is quite revealing about one aspect of Apple’s app store security model: Apple doesn’t assess and probably isn’t concerned about the large amount of information leaked out through its approved apps. Apple will fix the code signing bug that Miller exploited, but I very much doubt that it will fix the underlying problems that come from apps that phone home.

In this new app store world, the only way to counter this dual privacy and security threat is for individuals to install their own whitelisting firewall and manage it, akin to NoScript. This can be managed on the iPhone with Firewall iP (jailbreak required) and on OS X with Little Snitch.

Every app should state up front — both to Apple and the users — which connections it will make and why. Apple could facilitate this by providing these details in the apps specs, and inspecting the claim as part of their approval process.
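The proposal above amounts to a declared-connection manifest checked against observed behavior. A minimal sketch of the idea, with entirely hypothetical hostnames and a simulated firewall log (the format is invented for illustration; no such Apple mechanism exists):

```python
# Hypothetical per-app network manifest of the kind proposed above: the app
# declares every endpoint it will contact and why, and a reviewer (or an
# on-device firewall) flags anything outside the declared list.
declared = {
    "api.stockdata.example.com": "fetch stock tickers",
    "crashlogs.example.com": "upload anonymized crash reports",
}

# Simulated log of outgoing connections, as a tool like Firewall iP might record.
observed = ["api.stockdata.example.com", "203.0.113.7"]

# Any destination the app never disclosed is grounds for rejection or an alert.
undeclared = [host for host in observed if host not in declared]
print(undeclared)
```

Under such a scheme the Instastock-style phone-home server would either have to be declared up front, inviting scrutiny, or show up as an undeclared connection.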

Based on my own experience watching all attempted outgoing connections from my iPhone via Firewall iP, I know that no one except me is looking after my privacy and security on the device, whether it’s iOS or Android.