Amid a clash between social networking, advancing technology, and privacy, a loophole has been discovered that allows any iOS app granted access to location data to upload a device's entire photo library.

In a report on Tuesday, The New York Times outlined the issue that allows an app to upload geo-tagged photos from an iPhone, iPad or iPod touch in an unseen background task after said app has received permission from the user to enable location services.

Developers have known about the so-called "loophole" for some time, though until now there has been no proof-of-concept showing the functionality's scope. To that end, The Times asked a developer, who requested not to be named due to his association with a major app company, to write a test program that exploited the purported security weakness.

The developer created "PhotoSpy" which, once installed and launched on an iPhone testbed, first asks for access to the user's location data. Once that is granted, the app begins uploading photos and their corresponding location data to a remote server.

“Conceivably, an app with access to location data could put together a history of where the user has been based on photo location,” said app maker Curio's co-founder David E. Chen. “The location history, as well as your photos and videos, could be uploaded to a server. Once the data is off of the iOS device, Apple has virtually no ability to monitor or limit its use.”

Typical Apple safeguards have been to sandbox apps, or limit their access to data and certain system-level iOS functions. When the company's mobile platform changed from "iPhone OS" to "iOS" in 2010, the sandbox grew to include the photo library among other system services.

Apple still has theoretical control over possibly malicious apps, as software that reaches the App Store must first pass the company's vetting process. However, the company has made some missteps recently as the number of apps submitted to the store swells.

An example would be the fake "Pokemon" app that made it into the App Store and reached top-five most-downloaded status before being pulled.

In attempts to streamline the user experience and make a more cohesive product, developers have been looking for ways to consolidate data behind the scenes, though some of their efforts border on invasion of privacy.

Earlier in February, the social networking app "Path" came under fire for uploading users' addresses in a background task in a reported attempt to make connecting with other friends using the program a seamless process. Customer backlash prompted the developer of the program to institute an opt-in requirement and issue an apology.

“Apple has a tremendous responsibility as the gatekeeper to the App Store and the apps people put on their phone to police the apps,” said David Jacobs, a fellow with the Electronic Privacy Information Center. “Apple and app makers should be making sure people understand what they are consenting to. It is pretty obvious that they aren’t doing a good enough job of that.”

The photo library has always been readable. What the developer has done is write an app that uses the background location mode, which lets the app run constantly in the background (albeit at a significant battery cost) and monitor the photo directory for new photos.
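To illustrate the "monitor the photo dir for new photos" part, here's a rough sketch in Python (purely illustrative; not the actual app's code, and the function name is my own): keep a set of files already seen, and report anything new on each pass.

```python
import os

def find_new_photos(photo_dir, seen):
    """Return paths in photo_dir not yet in `seen`, recording them as seen.

    A background task (e.g. woken by location updates) could call this
    repeatedly to detect photos added since the previous check.
    """
    new_paths = []
    for name in sorted(os.listdir(photo_dir)):
        path = os.path.join(photo_dir, name)
        if path not in seen:
            seen.add(path)
            new_paths.append(path)
    return new_paths
```

The first call reports every existing photo; later calls report only additions, which is all an uploader would need.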

It's pretty convoluted to extract location information from geotagged photos by uploading them. Since the app is running in background location mode, it is SUPPOSED to have access to the location anyway and can simply upload that instead.
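For what it's worth, turning geotags into a location history is trivial once you have the EXIF fields: GPS coordinates are stored as degrees/minutes/seconds plus a hemisphere reference, and converting to decimal degrees is all it takes. A sketch (assuming the EXIF fields are already parsed out of the files; the record layout here is made up for illustration):

```python
def dms_to_decimal(degrees, minutes, seconds, ref):
    """Convert an EXIF-style GPS coordinate to signed decimal degrees.

    `ref` is the hemisphere letter (GPSLatitudeRef / GPSLongitudeRef);
    'S' and 'W' make the value negative.
    """
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

def location_history(photos):
    """Build a time-ordered (timestamp, lat, lon) list from photo records.

    Each record is a dict of already-extracted EXIF fields; real code
    would parse these out of the image files themselves.
    """
    history = []
    for p in photos:
        lat = dms_to_decimal(*p["lat_dms"], p["lat_ref"])
        lon = dms_to_decimal(*p["lon_dms"], p["lon_ref"])
        history.append((p["timestamp"], lat, lon))
    return sorted(history)
```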

I've turned off all location services and turn them on only when I need to use TomTom and maps. One thing I've noticed, though, is that the Camera app briefly shows the little location arrow, which then disappears right away. I'm going to have to check my pictures' metadata to see if the camera is still adding locations to them.

If this does allow third-party apps unfettered access to personal photos and images, then this should be locked down. Despite what some are saying, "personal responsibility" shouldn't require understanding the codebase of a third-party app before using it.

Quote:

Originally Posted by ChristophB

Troll bait engaged!

They are out in force in this thread.

This bot has been removed from circulation due to a malfunctioning morality chip.

"Lock it down"
Ummmm... sounds like you should run as fast as you can to get an Android "open system" with its record of flaky security, viruses, and near-impossible-to-follow versions and upgrade paths. Leave Apple to its obsessive concern for user privacy and security. One man's lockdown is another's security. I'd rather my stuff not be circulated around Lagos, Beijing and Brooklyn. Apple finds bugs & flaws, then fixes them.

Here's yet another example of our misguided and utterly useless focus on the quaint notion of "privacy." Privacy, always of questionable value, is now as obsolete as whale-boned corsets, and like such lingerie, is at this late date appropriately of concern only to historians, antiquarians, and fetishists.

What we should be talking about is ownership. We should be fighting to ensure that all data about an individual is legally that individual's property, that any entity that uses such data without recompense to the owner is legally a thief, and that any entity that sells an allegedly secure product that turns out not to be is liable for damages.

If the manufacturer of the lock on your front door turns out to have keyed it identically to ten thousand other units, are you going to face the resultant theft of everything you own and wonder who's responsible? Of course not. It's obvious who's responsible for your loss, and there are legal structures in place to help you be compensated.

Not so, software. The software EULA is the apotheosis of the old Lily Tomlin Ma Bell joke: "We're the phone company. We don't care. We don't have to." When it comes to code, the "kcuf you" goes in before the name goes on. Why do we allow companies to disavow all responsibility for a product merely because that product is intangible?

If you're dumb enough to buy a car whose manufacturer tells you up front that it will not be held responsible when its defective gas tank causes it to explode, you can still cry on my shoulder when your entire family is killed, but please don't speak: I don't want to be tempted to tell you what I'm thinking while you're crying.

How about giving people the freedom to do what they want with their devices? How about personal responsibility instead of relying on big brother?

Remind me to ask you about your personal responsibility when your car is stolen because its manufacturer just kinda sorta slipped up and keyed every vehicle in a run of five hundred thousand identically.

Never mind You're clearly a smart guy. I'm sure the first thing you do after buying a car is change the locks.

It's time we stopped whimpering about our rights and understood that anyone who buys a software-based product without first becoming an engineer, then committing industrial espionage to acquire the source code, and finally proofing a billion lines of abstruse gibberish is getting just what he or she asked for.

I've worked with the Photo Library APIs. This is exactly how it is intended to work. The protections are there to protect the location data, not the photos themselves. I've always found it interesting that the location data would deserve protection but not the photos themselves (only indirectly because they contain location data), but again this is not how the system was designed.

Given the privacy issues with the contacts and now photos, I wouldn't be surprised if Apple is working on a much better way of protecting private data than the current piecemeal system. I'm thinking something like tiered access -- an app might be able to access limited parts of the contacts without needing to ask the user, for example. Since one of the things developers have used the contacts for is matchmaking on social networks, Apple could easily provide a unique hash for a given contact card and provide that back to an app without giving out the actual data.
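To illustrate the hashed-contact idea above (a sketch, not Apple's API — the function name and the choice of SHA-256 are my own assumptions):

```python
import hashlib

def contact_token(email):
    """Return a stable, opaque token for a contact identifier.

    Two users who both know alice@example.com produce the same token,
    so a social app can match them server-side without ever seeing the
    raw address. A real design would also normalize and salt per
    service so tokens can't be correlated across apps.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()
```

Matchmaking then works by comparing tokens, with the actual contact data never leaving the device.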

For photos, Apple could provide APIs to get thumbnail representations, or photos without location data. Or even pre-canned UIs that would allow a user to select an individual photo to allow the app access to without having to give the app total access to the photo library.
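A sketch of what a "photos without location data" API could do under the hood — hand the app a copy of the photo's metadata with the GPS fields removed (the EXIF-style key names here are assumed for illustration):

```python
# EXIF-style GPS tags a sanitizing API would withhold from apps.
GPS_KEYS = {"GPSLatitude", "GPSLatitudeRef", "GPSLongitude",
            "GPSLongitudeRef", "GPSAltitude", "GPSTimeStamp"}

def strip_location(metadata):
    """Return a copy of a photo's metadata dict without GPS fields.

    The original dict is left untouched; the app only ever sees the
    sanitized view.
    """
    return {k: v for k, v in metadata.items() if k not in GPS_KEYS}
```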

Similar to how they went about introducing multitasking to iOS, I think Apple is going to do the same with the various forms of private data available to developers. It's going to take some time, of course, but the end result will be a much more secure system without introducing much burden onto the users.


Very interesting. Indeed, contacts and photos are very sensitive areas now, so I feel Apple will have to have apps request authorisation to access them. Well, that's what ~I~ would like to see, anyway.

Yet...how many people go on Facebook every day and grant apps there permission to just about all their data, with no idea how the app will actually use it???? Where is the Times article about that? I block ALL apps on Facebook, mainly because they all seem to want access to ALL my data even though they really don't need that much information in the first place. Trusting an app on iOS is still WAY safer than trusting one on Facebook. IMHO


Tom

I realize Apple considers Facebook a "friend" now, but I don't know that's the case for iOS users themselves. I know Google's privacy actions are suspicious to some of you, but consider the latest on the Facebook app front: the newest permission requested before installing the Android Facebook app is access to your SMS messages.

If you don't want to grant Facebook the right to look at every text you've sent and every reply you've received, then you can't install their app. Why the heck do they need to read your text messages??


How about using an Android. I have yet to run across anything that I can't do on my iPad that jail breaking would enable me to do, thus I've never ventured down that road. Mind you it isn't because I don't use my iPad. In fact I use it for all sorts of things from router configuration, to accessing network shares, to remoting into computers I support. The "I can't do what I want cause it's closed" argument is such garbage, it's just a straw man at this point.


Quote:

Originally Posted by dpnorton82

This. Yes. All of it.

The developer agreement does have language covering abuse, but that only ensures your app can be pulled if you're caught abusing. Any recourse beyond that would, I'd think, require a lawsuit. Honestly, I don't think anything more needs to be done: a suit over theft of your personal content would probably be strong if the developer had been banned for breaching that part of their contract with Apple.

"Congress became involved and probably motivated the move [by Apple to shut down the vulnerability], but the legislative body is not going to like what it hears. The problem is that iOS apps not only have access to a user's contacts database (including addresses and notes), but apps also have full and unencumbered access to everything in the iOS app sandbox, such as pictures, music, movies, calendars, and a host of other data. Any of this content is literally open for developers to freely transmit to their own servers while apps are open. (note that pictures with geotags will pop up a Location dialog, which can be averted in code with some well known tricks)"


Unlike the Location Data issue, this one is pretty bad for an Apple OS.

Just coming back to note that Android apps were also subsequently found to allow photo uploads too, though your calendar, music and such are not exposed for any developer who wishes to harvest them, unlike on iOS. At least as far as I know.


As noted in The Verge article there are no such constraints on desktop OSes... and I wish there were. Any app you run can access anything in your user space and send it back to a server as it pleases. Unless you're using Little Snitch you may not know. Does Apple's Mac App Store sandboxing resolve this? I hope so.



According to what I've read even Little Snitch may not tell you just what's going on if the developer isn't using SSL.

I think the sandboxing planned for the Mac would address the issue.

EDIT: I see Ars Technica did a followup article. There are some pretty high-profile apps that look like they were grabbing iOS users' contact lists, along with whatever notes were there, without permission: Facebook, Twitter, Cut the Rope, Gowalla and several others. http://arstechnica.com/apple/news/20...n-security.ars