Origin and source of mysterious "unflod" app remain unknown.

Security researchers have uncovered an active malware campaign in the wild that steals Apple ID credentials from jailbroken iPhones and iPads.

News of the malware, dubbed "unflod" after the name of a library installed on infected devices, first surfaced late last week in a pair of reddit threads here and here. In the posts, readers reported that their jailbroken iOS devices had recently started crashing repeatedly, often after installing jailbreak-specific customizations known as "tweaks" that were not a part of the official Cydia market, which acts as an alternative to Apple's App Store.

Since then, security researcher Stefan Esser has performed a static analysis on the binary code that the reddit users isolated from compromised devices. In a blog post reporting the results, he said unflod hooks into the SSLWrite function of an infected device's security framework and scans outgoing buffers for strings that accompany the Apple ID and password as they're transmitted to Apple's servers. When credentials are found, they're sent on to attacker-controlled servers.
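What Esser describes is classic function interposition: wrap the write routine, inspect each plaintext buffer before it goes out, and copy anything that looks like credentials. Here is a minimal conceptual sketch in Python rather than the malware's actual ARM code; the marker strings, function names, and payload shape are illustrative assumptions, not details recovered from the binary.

```python
# Conceptual sketch of the hooking technique described above.
# Marker strings and payload format are invented for illustration.

captured = []  # stands in for data sent to the attacker's server

def real_ssl_write(data: bytes) -> int:
    """Stand-in for the device's real SSLWrite: just 'sends' the bytes."""
    return len(data)

def hooked_ssl_write(data: bytes) -> int:
    # Inspect the plaintext buffer before it would be encrypted and sent.
    if b"appleId" in data and b"password" in data:
        captured.append(data)       # the exfiltration step
    return real_ssl_write(data)     # normal traffic is unaffected

# Ordinary traffic passes through untouched...
hooked_ssl_write(b"GET /index.html HTTP/1.1")
# ...but a buffer carrying credentials is copied before being forwarded.
hooked_ssl_write(b'{"appleId": "user@example.com", "password": "hunter2"}')
print(len(captured))  # 1
```

The real hook runs in-process, but the flow is the same: the wrapped function sees every outgoing buffer before encryption, which is why hooking SSLWrite defeats TLS entirely.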

In an e-mail to Ars, Esser said the malicious code works only on 32-bit versions of jailbroken iOS devices. "There is no ARM 64-bit version of the code in the copy of the library we got," he wrote. "This means the malware should never be successful on [the] iPhone 5S/iPad Air or iPad mini 2G."

reddit readers said unflod infections can be detected by opening an SSH/terminal session and checking the folder /Library/MobileSubstrate/DynamicLibraries for the presence of the Unflod.dylib file. Compromised devices may be disinfected by deleting the dynamic library, but since no one has yet figured out how the malicious file is installed in the first place, there's no guarantee it won't subsequently reappear.
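The check the reddit readers describe can be scripted. A short sketch in Python; the directory and filename are the ones named in the reports, and running it assumes a Python interpreter is available on the jailbroken device (e.g. installed via Cydia) or that the folder has been copied off over SSH.

```python
import os

def unflod_present(lib_dir: str = "/Library/MobileSubstrate/DynamicLibraries") -> bool:
    """Return True if the Unflod.dylib file reported on reddit is present."""
    return os.path.exists(os.path.join(lib_dir, "Unflod.dylib"))

if __name__ == "__main__":
    if unflod_present():
        print("WARNING: Unflod.dylib found - change your Apple ID password now.")
    else:
        print("Unflod.dylib not found.")
```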

"That is why we recommend to restore the device," Esser told Ars. "However, that means people will lose their jailbreak until a new one is released, and the majority of jailbreak users will not do that."

Of course, whichever course of disinfection users of infected devices choose, they should also change their Apple ID password as soon as possible.

"I will also again take this moment to point out to anyone concerned that the probability of this coming from a default [Cydia] repository is fairly low," Cydia developer Jay Freeman, aka Saurik, wrote in one reddit comment. "I don't recommend people go adding random URLs to Cydia and downloading random software from untrusted people any more than I recommend opening the .exe files you receive by e-mail on your desktop computer."

Promoted Comments

Now that iPhones are so easy to connect to any network you want without having to jailbreak, what is the motivation to jailbreak? Is it really that expensive to buy the real version of any iOS software? Is the risk to security really worth any of the little novelty apps that you can't find on the App Store? I really am curious.

I have a feeling it is just about tinkering more than any actual great need not being met.

Let's see, on my jailbroken iPhone I have (among other things):

* A commercial caller ID app that I paid for that will do lookups of incoming phone numbers, so that when a number calling me isn't found in my contact list I still have a good chance of seeing who is trying to call me.

* A wifi analyzer to help me troubleshoot wireless networks at work, home, and for friends/family.

* Another commercial app that I paid for that lets me back up all my applications & packages, including those installed through Cydia.

NONE of these would be allowed in Apple's App Store, yet they provide a nice level of productivity for me and others. Is it really so hard to comprehend that there are useful apps out there that Apple simply refuses to allow in their App Store for one reason or another?

I jailbreak to do things that Apple doesn't allow, specifically I use the fingerprint sensor globally, not just for unlocking the device. I observe sensible security practices AND I know what I'm doing. I also root my Android devices, for the same reasons.

The problem is, no OS is perfect. Even today, malformed data can be forced to execute as code. If the OS only runs signed code, you get a bit (bad cow pun) more protection. It is not perfect, however, it does help.

Just because an app is signed doesn't mean it's any more "protected" than an unsigned app. All it means is that Apple did some bit of testing and determined that as documented the app fits within their guidelines.

Case in point: An app called iRandomizer Numbers was submitted to the App Store back in 2012 and approved by Apple. It purported to be nothing more than a random number generator. But word eventually got out that it had an undocumented back door that turned it into a tethering app. Once Apple learned of this they pulled it from the App Store. This is just one example of many apps that made it into the App Store with undocumented & hidden features. Who's to say that any of the apps you currently have on your iPhone don't have similar hidden functionality, either malevolent or benevolent?

This kind of stuff is what I was paranoid about when I jailbroke, and why I eventually stopped doing it.

The problem is less the jailbreaking and more the "running random programs from random websites". People still largely don't seem to see smartphones as computers that can take calls, rather than phones that do neat things.

I know that nothing is eternally impervious, but for now I'm really glad to be in a walled garden.

Whereas I prefer to live in a wide open world, and just take security seriously.

Different use cases. Whereas I take security seriously too but have other things I need to do with my time too. It's all about priorities and balance. I've found what works for me and apparently you have too. Good luck!

> I know that nothing is eternally impervious, but for now I'm really glad to be in a walled garden.

> Whereas I prefer to live in a wide open world, and just take security seriously.

> Different use cases. Whereas I take security seriously too but have other things I need to do with my time too. It's all about priorities and balance. I've found what works for me and apparently you have too. Good luck!

I'm a big Apple fan, but keep in mind that the "Google way" in this is a pretty attractive middle ground. Many (most?) people get all their apps from Google Play. If you do that, then it works pretty much like Apple's walled garden, with the one difference being that Google doesn't curate the apps in as much detail (this is independent of the security model at the device level, btw). But in the Google system, someone who wants other sources of apps has that option with pretty much just the flip of a switch.

Many jailbreakers do treat their equipment like phones that do neat things rather than computers that can take calls. Not all of them, to be sure, but many if not most. If the majority of people truly understood how to secure their own systems properly, companies like AVG, McAfee, Norton, and Panda would probably be less successful. Instead they want this or that extra feature even if it puts them at greater risk.

> Now that iPhones are so easy to connect to any network you want without having to jailbreak, what is the motivation to jailbreak? [...]

> I have a feeling it is just about tinkering more than any actual great need not being met.

> This kind of stuff is what I was paranoid about when I jailbroke, and why I eventually stopped doing it.

> The problem is less the jailbreaking and more the "running random programs from random websites". People still largely don't seem to see smartphones as computers that can take calls, rather than phones that do neat things.

This. Consumer computers, all of them, no exceptions, should only be able to run signed code. If you've got computer-illiterate friends and family and they want to cling to what they know, which is probably going to be Windows and Office, get them to buy an RT device. They won't be able to screw it up. Or if they like Apple, an iPad.

> This kind of stuff is what I was paranoid about when I jailbroke, and why I eventually stopped doing it.

> The problem is less the jailbreaking and more the "running random programs from random websites". [...]

> This. Consumer computers, all of them, no exceptions, should only be able to run signed code. [...]

I'm certainly fine with that being an option, but I run lots of programs that e.g. Apple would never approve of. I like to emulate old games that you can't really buy anymore. On a phone or a laptop, it has the added value of being portable, which many old consoles weren't. But Apple specifically will not approve of emulators in the app store. What if I want to make my own programs?

Maybe a jailbreaker can comment on why they do it - at this point Apple has pretty much added all the features I wanted that Android had.

The only reason I'd root Android is to have root call blocker installed to block calls via wildcards. And maybe to force tethering to work regardless of what the carrier says.

Either that or to get around the ridiculous restrictions that manufacturers impose on their phones. The LG Optimus F3 is a great example of this. The idea that you couldn't load apps to the SD card when the user storage is so pitiful that it bitches whenever you try to update the apps that came with it is insane.

Thankfully my girlfriend bought the phone simply as a temporary measure until she bought an iPhone, but it's crap like this that makes you understand why people root their phones. It's also things like this that don't do LG or the Android brand in general any favors.

> Maybe a jailbreaker can comment on why they do it [...]

> The only reason I'd root Android is to have root call blocker installed to block calls via wildcards. [...]

> Either that or to get around the ridiculous restrictions that manufacturers impose on their phones. The LG Optimus F3 is a great example of this. [...]

> Thankfully my girlfriend bought the phone simply as a temporary measure until she bought an iPhone, but it's crap like this that makes you understand why people root their phones. [...]

If this is true, according to one page it only has 1.2GB of free space for user stuff. That's useless for today's stuff. But I only use 300MB myself, so I don't know. It probably works for the target market: a person who needs a phone to do nothing.

If the phone could use the SD card as ext4 there wouldn't be an issue; Android has to enforce security on app storage, and it can't with FAT32.

I think running signed software is an ideal factory default. I think in many cases you should be able to choose to ignore that default, but if you do, you should have an understanding of the risks in doing so.

> Maybe a jailbreaker can comment on why they do it [...]

> The only reason I'd root Android is to have root call blocker installed to block calls via wildcards. [...]

> Either that or to get around the ridiculous restrictions that manufacturers impose on their phones. The LG Optimus F3 is a great example of this. [...]

> Thankfully my girlfriend bought the phone simply as a temporary measure until she bought an iPhone [...]

> If this is true, according to one page it only has 1.2GB of free space for user stuff. [...]

> If the phone could use the SD card as ext4 there wouldn't be an issue, as it has to enforce security and it can't with FAT32.

Don't get me started... I'd practically like to nuke LG for that as it stands.

> I jailbreak to do things that Apple doesn't allow, specifically I use the fingerprint sensor globally, not just for unlocking the device. [...]

> Maybe a jailbreaker can comment on why they do it [...]

> The only reason I'd root Android is to have root call blocker installed to block calls via wildcards. [...]

I always root my devices, and I think we users should demand that a root/jailbreak/bootloader-unlock mechanism come built in on all the devices we buy, as opposed to devices we would rent.

If I buy a house I may not want to renovate everything in it, but I expect to have an option to do it, should I ever want to. Locked up devices are like a permanent hotel. They just don't feel like home to me.

> I jailbreak to do things that Apple doesn't allow, specifically I use the fingerprint sensor globally, not just for unlocking the device. [...]

It's fun to think we're "too smart to fall for that", but the fact is that we still don't know how the infections work in the first place. There's every chance it was something like an infected ad on a common website, and no amount of "knowing what you're doing" would keep you safe.

The smart thing to do is not take down one of the main security features of the OS in the first place, and if the excessive security offends your inner rebellious tinkerer, then iOS obviously isn't the right OS for you. I mean no offense and am not trying to be snarky; it's good that different OSes offer different experiences to users, different strokes for different folks and all that.

> Maybe a jailbreaker can comment on why they do it [...]

> The only reason I'd root Android is to have root call blocker installed to block calls via wildcards. [...]

First off, I did not do anything wrong, the prosecution had it in for me from the beginning. Secondly, I was facing 20 years in federal with no possibility of time off. Thirdly.... Wait, what were we talking about?

> I jailbreak to do things that Apple doesn't allow [...]

> The problem is, no OS is perfect. Even today, malformed data can be forced to execute as code. [...]

Depending on the OS to be secure is a false sense of security. iOS can be compromised (that's how jail breaking is possible!). Linux is not perfectly secure either. I don't rely on the OS to be secure; I rely on best practices to be (relatively) secure. I pay attention to what my devices are doing, so that I can be proactive, rather than reactive about security. In 20 plus years of computer use, spanning every major OS and mobile OS, I've never had a machine compromised, nor any accounts hacked. Not lucky, just vigilant.

> I jailbreak to do things that Apple doesn't allow [...]

> The problem is, no OS is perfect. Even today, malformed data can be forced to execute as code. [...]

There exists no operating system which cannot be infected by having the user run a malicious program on it. How could there be? The difference between a malicious and benign program is not the basic operations it performs, but why it does them. Take a game: If it saves data then it needs to have read, write, modify abilities, and probably also delete files. But all of those operations could also be used by a malicious program (read your data and send it to others, write junk data to fill your HDD, modify key system files, or just delete whatever). So either you make it so that no program can do anything useful, or you allow for malicious programs to exist.

Edit: To clarify: Obviously steps can be taken to mitigate the harm a malicious program can do, but short of rendering an OS functionally useless you can't prevent all malicious actions. If nothing else, you can't protect against actions that are only malicious in certain contexts if they're legitimate in others.

> I jailbreak to do things that Apple doesn't allow [...]

> The problem is, no OS is perfect. Even today, malformed data can be forced to execute as code. [...]

> There exists no operating system which cannot be infected by having the user run a malicious program on it. [...]

It is one thing to get bad code to execute once. It is another thing to get persistent code onto the computer. Let's say you require all files containing executable code to be cryptographically signed before the OS loads them. You could mess with code already loaded. You could get that code to modify a file in permanent storage (assuming the program had the right to read and write that file). But if you do modify an executable file, it would no longer match its cryptographic signature and the OS would reject it.

As long as the boot loader is read only and all OS files are signed, it would be very hard to mod the OS to load non signed code. This will probably be the way jailbreaking will be stopped in the future.
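The persistence argument above can be made concrete with a toy loader. In this Python sketch, an HMAC over the file bytes stands in for the public-key signature a real OS vendor would use; the key name and "executable" bytes are invented for illustration. Tampering with a signed binary changes its digest, so the loader refuses it.

```python
import hashlib
import hmac

# Stand-in for the vendor's signing key. A real OS verifies against the
# vendor's *public* key, so devices never hold the signing secret.
VENDOR_KEY = b"os-vendor-signing-key"

def sign(code: bytes) -> bytes:
    """Produce a signature over the executable's bytes."""
    return hmac.new(VENDOR_KEY, code, hashlib.sha256).digest()

def load(code: bytes, signature: bytes) -> str:
    # The loader re-derives the signature; any modification to the file
    # changes the digest, so tampered binaries never run.
    if not hmac.compare_digest(sign(code), signature):
        return "rejected"
    return "loaded"

binary = b"\x90\x90 original program bytes"
sig = sign(binary)

print(load(binary, sig))                        # loaded
print(load(binary + b" injected payload", sig)) # rejected
```

The design point is that the check happens at load time, every time: persistence requires writing a file the loader will later accept, and without the signing key there is no way to produce a valid signature for modified bytes.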

> Depending on the OS to be secure is a false sense of security. iOS can be compromised (that's how jailbreaking is possible!). [...] Not lucky, just vigilant.

The best security is security in layers. Every time you open up a web browser and go to a site, your computer is executing code written by people you do not know or trust. (Can you say JavaScript ads? I knew you could.) Start with a good OS built around security (not running as root!). Next, put good AV on it. Then be very careful about what code you choose to run. Back everything up and always assume you have been compromised.

> This kind of stuff is what I was paranoid about when I jailbroke, and why I eventually stopped doing it.

> The problem is less the jailbreaking and more the "running random programs from random websites". [...]

> This. Consumer computers, all of them, no exceptions, should only be able to run signed code. [...]

> I'm certainly fine with that being an option, but I run lots of programs that e.g. Apple would never approve of. [...] What if I want to make my own programs?

If you want to write your own iOS programs, get Xcode; I think it's $5 and you can write as many apps as you want without jailbreaking. If you want to install them on a bunch of devices, get a developer account for $99/year and you can distribute to up to 99 people without using the store.

For almost everyone else, Apple has worked hard to make iOS the most secure consumer operating system around. Obviously it starts with industrial-strength Unix, with all the standard memory protection/multitasking/etc. Then they require apps to be signed before they can be installed, and before you can widely distribute, Apple App Review vets your app fairly rigorously (I say this as a developer who rang up roughly 20 rejections last year). Finally, Apple has a built-in OS updating system that lets it push out fixes that a majority of iOS users install within days, and roughly 90% within a month or so.

That's four layers that work together very well, and why antivirus/antimalware has never been needed on iOS. Even if someone is able to sneak malware through App Review, Apple can quickly revoke their certificate. But these layers don't work if you jailbreak. Apple has done well the last few years to make jailbreaking less attractive, but will never be able to eliminate it.

Google superficially has similar layers in Android, but they don't work as effectively, and partially by design. Google wants Android to be open, so they are philosophically opposed to the level of control Apple imposes. Google does less vetting of app submissions to Google Play, which means it's easier/faster to distribute on Android but that there has also been a bunch of malware on their store. Google also isn't able to force Android updates through the carriers.

iOS is certainly more secure than Android, but Android is still a big improvement on desktop operating systems like Windows and OS X that don't impose these layers as strongly.

I'm torn on the jailbreaking phenomenon. I can understand wanting a way to install your own applications without Apple signing them off. If I were to decide, I think I'd leave the default as it is but add some obnoxiously complicated way to install your own apps, with lots of warnings that this is not supported in any way shape or form - probably only allow installs from inside iTunes on a computer, with an enforced backup routine to enable it. This would remove a lot of the justifications for jailbreaking, but it would not end the phenomenon, because there will always be people who want to modify the OS in some way to change the color of the Safari icon or whatever.

> This kind of stuff is what I was paranoid about when I jailbroke, and why I eventually stopped doing it.

> The problem is less the jailbreaking and more the "running random programs from random websites". [...]

The entire point of jail breaking is to allow running random programs from random websites.

And really it's worse than on a PC. Desktop operating systems are designed to allow running random software from unknown sources, and they have some built-in protection. It's not good protection, but it's better than nothing. iOS assumes all software (except stuff that the user compiled with Xcode to run on their own device) has been authorised/screened by Apple and can be retroactively blacklisted by Apple if it somehow gets through the authorisation. So you have no security at all once you jailbreak.

You can run as a non-admin/root user on Windows; Apple enforces this on OS X, where you can't log in as root. No such protection on iOS: if it's jailbroken you are wide open.

> I jailbreak to do things that Apple doesn't allow [...]

> The problem is, no OS is perfect. Even today, malformed data can be forced to execute as code. [...]

> There exists no operating system which cannot be infected by having the user run a malicious program on it. [...]

> It is one thing to get bad code to execute once. It is another thing to get persistent code onto the computer. [...]

> As long as the boot loader is read only and all OS files are signed, it would be very hard to mod the OS to load non signed code. This will probably be the way jailbreaking will be stopped in the future.

Even that's not foolproof. Such systems have been broken in the past, and will doubtless be broken in the future. Especially given how flawed PKI is right now.

By way of example: Such a system must provide some way for you to import your own certificates as "trusted", unless the OS manufacturer is going to sign every single version of every program released.* So just trick people into trusting the cert used to sign the malicious app. Remember, we're talking about the situation in which you've already got someone to try and run your code on their machine; at that point one extra step is not much of a hurdle.

Plus, I'd argue that an OS that only runs signed code is well out of the "do anything useful" sphere. At that point you have given up all control over the system to a 3rd party.

*In which case they'll either automate the process, opening it to exploitation, or rapidly go bankrupt from hiring several million software examiners.

> I know that nothing is eternally impervious, but for now I'm really glad to be in a walled garden.

> Whereas I prefer to live in a wide open world, and just take security seriously.

> Different use cases. [...]

> I'm a big Apple fan, but keep in mind that the "Google way" in this is a pretty attractive middle ground. [...]

Apple has a middle ground too. If you sign up for a developer account, which is easy and reasonably cheap, you're given a certificate that can be used to run arbitrary code on your own devices; you just have to sign the binary with your developer certificate.

I have three or four apps on my iPhone that did not get authorised by Apple at all, and I've still got all the security in place protecting me.

> I know that nothing is eternally impervious, but for now I'm really glad to be in a walled garden.

> Whereas I prefer to live in a wide open world, and just take security seriously.

> Different use cases. [...]

> I'm a big Apple fan, but keep in mind that the "Google way" in this is a pretty attractive middle ground. [...]

> Apple has a middle ground too. If you sign up for a developer account, which is easy and reasonably cheap, you're given a certificate that can be used to run arbitrary code on your own devices [...]

> I have three or four apps on my iPhone that did not get authorised by Apple at all, and I've still got all the security in place protecting me.

To me, that is the best approach.

Perhaps. Certainly I'm glad that works for you, but I am fundamentally opposed to the idea of having to ask permission and pay a fee to run my own software on my own machine. That iOS doesn't have a 'run unsigned code' option without jailbreaking is, to me, philosophically objectionable. But that's why I don't own any iStuff.

> I know that nothing is eternally impervious, but for now I'm really glad to be in a walled garden.

> Whereas I prefer to live in a wide open world, and just take security seriously.

> Different use cases. [...]

> I'm a big Apple fan, but keep in mind that the "Google way" in this is a pretty attractive middle ground. [...]

I'm boggled at the down votes here. On Android, you can use at least three app stores. You take your chances in all app stores, so I don't see why the Apple walled garden is superior.

Is freedom such a bad thing?

The real problem I see here is the need to jailbreak the iPhone at all. That is where your security model fails. Just let the user buy from whomever they want.

> I'm torn on the jailbreaking phenomenon. I can understand wanting a way to install your own applications without Apple signing them off. [...]

You think as if you own your iPhone and can do anything you want with it!

Reasons for jailbreaking:

* Tethering without the network deciding for you (though it's interesting that Android has a native tethering function which the network has no control over)
* Adblock
* Installing apps that Apple will never approve of - anything tangentially related to BitTorrent, even a remote Transmission client, is verboten for some crazy reason.
* Being able to develop and test apps on your own device without paying a fee
* Customisations, customisations, customisations, though I don't personally dabble too much here.

Funny really, you can accomplish almost all of the above with Android without rooting the device.

> I'm boggled at the down votes here. On Android, you can use at least three app stores. [...]

> Is freedom such a bad thing?

> The real problem I see here is the need to jailbreak the iPhone at all. That is where your security model fails. Just let the user buy from whomever they want.

Which leads us down the path of the user having to make trust decisions without being in a position to do so. Inevitably this leads to the exploitation of that trust, people becoming infected, iOS botnets, and the need for constantly updating A/V programs.

We've seen from years of Windows viruses that the general public aren't equipped to make good decisions about trust. They see a good offer that they can't miss and take it. It's not just a case of having a default that only allows signed, curated code to be run, because enough people would be susceptible to a social-engineering attack to turn it off to take advantage of the can't-miss offer.

I know that I could manage the security myself, but to be honest that's not the point. Think of all the waste that goes on because of infected computers. Should we accept the same on phones/tablets?

I think what this argument always boils down to is this: Do you view a smartphone as a "universal computer" or as a convenient appliance? In the first case "I want to do what I want with a machine that I bought and will care for my own security, thank you very much!" is perfectly valid. In the second case just outsourcing as much vetting and security as you can is perfectly valid and the restrictions that come with this don't apply anyway.

The real idiots are those who sit squarely in one of those and can't understand anyone who's in the other.