Apple on Wednesday announced that a future update to iOS will restrict App Store software from accessing a user's address book without the user's permission.

"Apps that collect or transmit a user's contact data without their prior permission are in violation of our guidelines," Apple spokesman Tom Neumayr said in a statement to AllThingsD. "We’re working to make this even better for our customers, and as we have done with location services, any app wishing to access contact data will require explicit user approval in a future software release."

The official statement came quickly after two U.S. congressmen sent a letter to Apple Chief Executive Tim Cook, asking for more information about Apple's security and privacy policies on the iPhone. The controversy stems from an iPhone social networking application, "Path," which was discovered to be uploading users' address book data to the company's servers without user authorization.

For its part, Path issued an apology and gave users the option to opt out, stating that the data was being used to streamline the application's "Add Friends" feature. But Apple, in its official comment on Wednesday, made it clear that the actions taken by Path are in violation of its iOS developer guidelines.

The events share some similarities with last year's location database controversy, in which members of the U.S. government demanded answers from Apple about a file found hidden in the iPhone operating system that kept an extensive log of location data. Apple said the crowd-sourced data, which represented cellular towers and Wi-Fi hotspots pinged by the iPhone, was intended to give users faster response times when using location-based services.

That controversy quickly became a non-issue when Apple issued an iOS software update, which reduced the size and scope of the database file, and gave users the ability to delete it by turning off location services on their iPhone.

Quote:

The events share some similarities with last year's location database controversy, in which members of the U.S. government demanded answers from Apple about a file found hidden in the iPhone operating system that kept an extensive log of location data. Apple said the crowd-sourced data, which represented cellular towers and Wi-Fi hotspots pinged by the iPhone, was intended to give users faster response times when using location-based services.

I really don't think it has many similarities at all. The prior issue was simply a file on your phone that was storing location data; I don't think third parties really had access to it. This is all of your contact information being loaded onto a third-party server without your permission.

When the Path debacle arose, my first thought was that software companies should require users to opt in when they want to harvest information, rather than the opposite.

this is awesome!

While this is good news, it's plugging a small leak in a dam while water gushes from many other places. Many of us sync our address books between Mac and Windows, where every app we install has access to our data.

Quote:

Originally Posted by Gatorguy

Getting ever closer to Android's permission-based app model. . .

You mean going backwards? Because a list of over 20 items an app may access, shown when you install the app, isn't sensible, and therefore isn't security.

While you may take heed that a wallpaper app is trying to get access to your contacts, most people just click through confusing and technical lists.

This bot has been removed from circulation due to a malfunctioning morality chip.

Quote:

I really don't think it has many similarities at all. The prior issue was simply a file on your phone that was storing location data, I don't think third parties really had access to it. This is all your contact information loaded onto a third party server without your permission.

To be fair, you're supposed to read the article and at least think it's pretty much the same as last year's location issue.

Yeah, Apple needs to figure out what can be vetted during the app approval process and what needs to be enforced at runtime. Some other abuse prone areas are unrestricted network/internet access, unlimited flash storage, and full read access to the iPod library.

Quote:

You mean going backwards? Because having a list over 20 items an app may access that appears when you install the app isn't sensible and therefore isn't security.
While you may take heed that a wallpaper app is trying to get access to your contacts most people just click through confusing and technical lists.

Going backwards? Did Apple require user-specific permissions at some earlier point? If so, I don't see how it's a bad thing that Apple is now requiring both location and contacts harvesting to be user-authorized.

Quote:

Apple on Wednesday announced a future update to iOS will restrict App Store software from accessing a user's address book without their permission.

"Apps that collect or transmit a user's contact data without their prior permission are in violation of our guidelines," Apple spokesman Tom Neumayr said ...

So the closed-garden control of the App Store doesn't always work, after all?

Quote:

Originally Posted by Gatorguy

Getting ever closer to Android's permission-based app model. . .

What are you talking about? Apple invented the permission-based app model. Or perfected it. Or made it popular. Or... forget it, Apple is making billions from the permission-based app model like it's nobody's business! /s

Quote:

Originally Posted by SolipsismX

...
You mean going backwards? Because having a list over 20 items an app may access that appears when you install the app isn't sensible and therefore isn't security...

Every educated Android user knows better than to install an app asking for a laundry list of permissions, unless it comes from a very reputable source with legitimate reasons for the requests. Don't worry, you'll learn to use it as Apple improves iOS.

Quote:

Did Apple require user-specific permissions at some earlier point? I don't see how it's a bad thing, and if so why Apple is requiring both location and contacts harvesting to be user-authorized now.

Security that isn't used is no security at all. For example, let's say MS updates Windows log-in permissions so that you either have the choice of using only a randomized alphanumeric password or no password at all. Users will go for no password because the security, even though much higher than their previous passworded option, is now too complex to bother with. That screenshot shows what is wrong with Android's system, and it only shows 1/4 of the potential permissions.


Quote:

Every educated Android user knows better than to install any app offering this laundry list of permission, unless it comes from a very respectable source with legitimate reasons for the requests. Don't worry, you'll learn to use it as Apple improves iOS.

No they don't, especially when you consider that the educated Android user doesn't exist. They use Android without knowing what Android is. To put it simply, they buy the only available $200 smartphone that isn't a Nokia, and that's it.

Quote:

It's a bad thing to advertise a security feature that isn't used, because it's not designed to inform the average user in a way that is useful to them. For example, let's say MS updates Windows log-in permissions so that you either have the choice of using only a randomized alphanumeric password or no password at all. Users will go for no password because the security, even though much higher than their previous passworded option, is now too complex to bother with. That screenshot shows what is wrong with Android's system, and it only shows 1/4 of the potential permissions.

A quarter of them? I can't imagine what the other 21 would be.

I see these listed:
~Services that cost you Money (that's a good one to know about, don't you think?)
~Storage - You already showed this one
~Your Personal Information - You showed that one too, and Apple agrees with getting your permission
~Phone calls - Yup, that's in your screenshot
~Location - Another I think you should know about, and so does Apple
~Network Communication - In your screenshot, and something you'd better know about
~System tools - Again, in your list
~Hardware controls - Not of much use IMO, unless you're worried why a kid's game wants to turn on the camera
~Your Accounts - Another permission that's not really useful IMO

Quote:

no they don't.. especially when you consider that educated android user does not exist. they use android without knowing what android is. to put it simply, they buy the only available 200 dollar smartphone that isn't nokia and that's it.

Is it your opinion that it's better that an app doesn't let you know what services or information it's accessing? It's clear that the App Store has become so large that Apple can't possibly thoroughly vet every app and every process it's accessing.

Quote:

no they don't.. especially when you consider that educated android user does not exist. they use android without knowing what android is. to put it simply, they buy the only available 200 dollar smartphone that isn't nokia and that's it.

This is less of an issue on Android than on iOS because most Android users aren't accessing the Android Market. For many, the devices are the new feature phones.


Indeed, this has always been against the developer ToS -- Path either didn't care or just didn't RTFM, and has now ruined it for the rest of us.

That's beside the point. Apple shouldn't allow any sensitive data to be accessible by third parties. They should have been proactive, not reactive, about this issue.

What's interesting is that this will get much less press than the location issue, even though that data was only sent to Apple, was anonymous, and only contained locations of Wi-Fi hotspots and cellular towers, not your specific location.


Quote:

That's beside the point. Apple shouldn't allow any sensitive data to be accessible by 3rd-parties. They should have been proactive, not reactive, about this issue.

Were they actively allowing it, Solipsism? If so, then I've just changed my opinion about whether Apple is responsible and should get the blame.

As I understood it Apple didn't approve to begin with, but instead was unaware. Possibly overwhelmed by the volume of apps submitted to the App Store coupled with developer pressure to get their apps approved? I've seen a few recent AI references to apps that should never have made it into the App Store, leaving some Apple users scratching their heads.

Quote:

What are you talking about? Apple invented the permission-based app model. Or perfected it. Or made it popular. Or... forget it, Apple is making billions from the permission-based app model like it's nobody's business! /s

Trouble is, up until now, permission to access the contents of your address book was "always on" at runtime. Enforcement of Apple's customer privacy guidelines with respect to the address book happened at the source code level, as Apple vetted apps being submitted for inclusion in the App Store. If any snippet of source code violated Apple's guidelines regarding the privacy of the customer's address book but managed to slip through Apple's vetting process during submittal to the App Store, then that code could run completely unfettered, without the need to notify the user or obtain the user's permission.

It was a trust-based permission, in that the customer had no choice whatsoever but to trust that Apple was doing an adequate job of vetting the source code.

Now we're apparently moving to an enforcement-based permission system. If they do it right, that will be the better system.

Here's what I would want to see: A pop-up shows up the first time an App tries to access any particular protected API, describing the nature of the information the App is trying to access, and asking the user for an explicit decision: either allow the App to continue or abort the access. It would also include an option to "remember the decision," so that the user can choose either to be notified on each future attempt to access data of the same nature, or to give the App perpetual permission to access that type of data without the repeated nagging.
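The flow described above can be sketched as a toy model. Nothing here is a real iOS API; the `prompt_user` callback and the permission names are invented purely for illustration:

```python
# Toy sketch of a "prompt on first access, remember the decision" permission gate.
# No real iOS API is used; prompt_user and the permission names are made up.

class PermissionGate:
    def __init__(self, prompt_user):
        # prompt_user: callback (app, permission) -> (allowed, remember)
        self.prompt_user = prompt_user
        self.remembered = {}  # (app, permission) -> bool

    def request(self, app, permission):
        key = (app, permission)
        if key in self.remembered:
            # User chose "remember my decision" earlier: no repeated nagging.
            return self.remembered[key]
        allowed, remember = self.prompt_user(app, permission)
        if remember:
            self.remembered[key] = allowed
        return allowed

# Example: the user allows contacts access once and asks not to be nagged again.
prompts = []
def prompt_user(app, permission):
    prompts.append((app, permission))
    return True, True  # allow + remember

gate = PermissionGate(prompt_user)
gate.request("SomeApp", "contacts")  # first access: user is prompted
gate.request("SomeApp", "contacts")  # later access: remembered, no prompt
print(len(prompts))                  # -> 1
```

If the user declines the "remember" option, the gate simply prompts again on the next access, which matches the per-attempt notification choice described above.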

I hope the day comes soon when people realize that all this sharing is doing nothing to improve their lives and, if anything, keeps them chained in service to a computer, whether mobile or at home, wasting time when they could be out in reality. Everything you post online will be used against you someday. The more info you willingly give away to software companies and governments, the less human you become.
I do not tweet, have no Facebook or LinkedIn page, and have a healthy social life in spite of it all.

Quote:

Here's what I would want to see: A pop-up shows up the first time an App tries to access any particular protected API, describing the nature of the information the App is trying to access, and asking the user for explicit decision to either to allow the App to continue or abort the access. It would also include an option to "remember the decision" so that the user can choose to be notified on each future attempt to access data of the same nature, or else the user can choose to give the App perpetual permission to access that type of data without the repeated nagging.

That's what Android does now. It won't ask again unless the permissions change. Even better, it won't allow an automatic update of any app whose permissions have changed since you installed it. You have to update manually and agree that you've noted and accepted the permission changes.

Quote:

Were they actively allowing it Solipsism? If so then I just changed my opinion whether Apple is responsible and should get the blame.

As I understood it Apple didn't approve to begin with, but instead was unaware. Possibly overwhelmed by the volume of apps submitted to the App Store coupled with developer pressure to get their apps approved? I've seen a few recent AI references to apps that should never have made it into the App Store, leaving some Apple users scratching their heads.

As I understand it, Apple didn't prevent anyone from getting access; they simply put up written rules that said you weren't allowed to. Laws are guidelines, not security. This is why I say Apple should have been proactive and prevented this from ever being an issue.


Quote:

I hope the day comes soon when people realize that all this sharing is doing nothing to improve their lives and if anything keeps them chained in service to a computer whether mobile or at home wasting time when they could be out in reality. Everything you post online will be used against you someday. The more info you willingly give away to software companies and governments the less human you become.
I do not tweet, have no Facebook page or linkedin page and a healthy social life in spite of it all.

Thanks for sharing, Kerry Buckley.


Quote:

...
It was a trust-based permission, in that the customer had no choice whatsoever but to trust that Apple was doing an adequate job of vetting the source code.

Many here will testify that most users will be happy with that model and defend it even when it fails.

Quote:

Now we're apparently moving to an enforcement-based permission system. If they do it right, that will be the better system.

Here's what I would want to see: A pop-up shows up the first time an App tries to access any particular protected API, describing the nature of the information the App is trying to access, and asking the user for explicit decision to either to allow the App to continue or abort the access. It would also include an option to "remember the decision" so that the user can choose to be notified on each future attempt to access data of the same nature, or else the user can choose to give the App perpetual permission to access that type of data without the repeated nagging.

That's very well thought out, and indeed a step closer to the current Android model, which is far from perfect itself. There has been some research on the topic of information stealing by smartphone applications. The authors of the linked article suggest that the user should be able not only to control what information to share, but also to replace this information with empty or bogus entries.
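The "empty or bogus entries" idea amounts to shadowing the real data source rather than denying access outright. A toy sketch, with made-up permission levels and contact format:

```python
# Toy sketch of permission-aware data shadowing: instead of blocking access,
# the OS hands the app empty or plausible-but-fake contact data.
# The permission levels ("full", "empty", "bogus") are invented for illustration.

REAL_CONTACTS = [{"name": "Alice", "phone": "555-0100"}]

def get_contacts(permission):
    if permission == "full":
        return REAL_CONTACTS
    if permission == "empty":
        return []  # app sees an empty address book
    if permission == "bogus":
        return [{"name": "John Doe", "phone": "555-0199"}]  # plausible fake
    raise ValueError(f"unknown permission: {permission}")

print(get_contacts("bogus"))  # the app still works, but learns nothing real
```

The appeal of this approach is that a misbehaving app can't even tell it has been denied: its "Add Friends" feature runs normally against useless data.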

Quote:

As I understand it Apple didn't prevent anyone from getting access, they simply put up written rules that said you were not allowed. Laws are guidelines, not security. This is why I say Apple should have been proactive to prevent this from ever being an issue.

It's a shame that you're demanding Apple treat all of their developers as criminals. Because some act poorly, Apple should punish everybody. What you want is developers with ethics. Perhaps they're rare now.
