Mobile app privacy: You get what you pay for

Analysis Mobile app privacy controversies have dominated the technology headlines over recent weeks, but the push for tighter privacy standards may upset existing business models, which often use targeted advertising to subsidise the price users pay for the apps.

Last month it was discovered that iPhone apps Path and Hipster were uploading user address book information without informed consent.

Meanwhile, Twitter was criticised because its privacy policy failed to explain that if users used the “Find Friends” feature on its iOS and Android clients, Twitter would store their entire address book for 18 months.

Days after this, Facebook was obliged to deny that its iPhone app was reading private text messages. And February was rounded off by the revelation that apps granted access to location services can potentially siphon off users' photos on both iPhones and Android devices, revelations that prompted Democratic Senator Charles Schumer to write to consumer watchdogs at the FTC calling for action.

Privacy by Design

The GSM Association (GSMA) has responded to heightened concerns about the privacy of mobile applications with the launch of new guidelines designed to offer punters greater transparency and control over how apps use their personal information.

The new privacy-enhancing guidelines for mobile application developers were launched at the MWC conference in Barcelona. The framework seeks to make privacy-protecting measures a core part of the mobile software development process, rather than an afterthought or an "add-on". The idea of hardwiring privacy into the development process is embodied by the concept of "Privacy by Design" (PbD), under which, for example, developers would be asked to enable the most restrictive privacy settings by default.

The guidelines seek to harmonise the widely differing approaches to privacy taken by disparate developers across multiple companies. That's a big ask – especially since the GSMA wants its approach to be applied across the whole mobile ecosystem, including device manufacturers, platform and OS companies, mobile operators, advertisers and analytics firms, as well as developers.

The approach, which is designed to reclaim users' trust that mobile applications only do what they say on the tin, draws on existing EU data protection principles such as data minimisation and limiting the use of collected data. Ideas include communicating clearly with users and seeking consent before enabling targeted advertising or using location data.
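To make these principles concrete, the following is a minimal sketch of what privacy-by-default, consent-gating and data minimisation might look like in code. It is an illustration only: the class and method names are hypothetical, not drawn from the GSMA guidelines or any real mobile SDK.

```java
import java.util.Optional;

// Hypothetical sketch of "Privacy by Design" defaults.
// All names here are illustrative, not from any real SDK.
public class PrivacySettings {
    // PbD: the most restrictive setting is the default; users must opt in.
    private boolean locationSharingEnabled = false;
    private boolean targetedAdsEnabled = false;

    // Consent is an explicit, user-initiated action.
    public void grantLocationConsent() { locationSharingEnabled = true; }
    public void grantAdConsent() { targetedAdsEnabled = true; }

    // Data minimisation: return location only when the user has opted in,
    // and round it to roughly 1 km precision so the app never sees more
    // detail than it needs.
    public Optional<double[]> coarseLocation(double lat, double lon) {
        if (!locationSharingEnabled) {
            return Optional.empty(); // no consent, no data
        }
        return Optional.of(new double[] {
            Math.round(lat * 100) / 100.0,
            Math.round(lon * 100) / 100.0
        });
    }

    public boolean mayServeTargetedAds() { return targetedAdsEnabled; }
}
```

The key design point is that the restrictive path is the default code path: an app that forgets to ask for consent simply gets no data, rather than silently collecting it.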

Mobile telcos including Vodafone, Orange and Deutsche Telekom signed up to the policy at MWC12.

However, getting major app store players to buy into the approach may prove far trickier. Mark Little, an analyst at Ovum, argued that Apple, Google, Microsoft and RIM are likely to pick and choose from the guidelines, because aspects of the proposals run against their existing money-making business models, particularly when it comes to targeted advertising.

“I think the big players are likely to cherry-pick, because with some elements they can do a big promotion that makes them look good in the market. But I think there are also elements that will restrict what they can do and may constrain some of their business models,” Little told Computerworld.

“Giving users more transparency and understanding of data collection and giving them tools to opt-out is obviously going to impact on their targeted advertising business models. I think they are going to conveniently ignore these guidelines,” he added.

“I’m sure they will go very strong on protecting children and other elements, but when it comes to constraining a business model that they actually make money out of, I don’t think that they will follow the guidelines quite so closely”.

Major app store operators recently signed up to a Californian agreement mandating that apps which access sensitive information (contact lists, unique phone IDs, location, and users' age or gender, etc) present a privacy policy in storefronts such as the Apple App Store. Apps that don't access personal data are not obliged to present a policy.

In addition, Apple is now rolling out protections for address book information across its App Store, the implications of which are discussed in a blog post by Veracode here.

Both approaches are far less comprehensive than the guidelines advocated by the GSMA, which cover how an application works – not just what users are presented with when they download software, or what the app does with address book information.

"The attempt by the GSMA to increase trust and transparency between users and app companies is to be welcomed," writes Lachlan Urquhart, a legal academic on Sophos' Naked Security blog. "However, unless the privacy-protective measures are almost universally adopted, the industry impact of the overall document may be minimal."

History repeating

The issue of the privacy of mobile apps has come up before. A year ago, a research team from Veracode broke apart a mobile application, Pandora, to see what could be found within the code. They discovered that the app contained as many as five advertisement libraries that could transmit personal data such as the user’s gender, GPS location and details of the phone used.

Veracode said it subsequently also found the AdMob advertising library in other apps such as the free CBS News Android application and the TvDotCom application. A few days later, Pandora removed some of the advertising libraries from its Android and iPhone apps but the incident serves to illustrate a much larger issue, according to Veracode.

“As more and more ‘free’ applications attempt to monetise their offerings, we will likely see more personal information being shuttled out to marketing and advertising data aggregation firms. Application developers may not even be aware of the privacy violations they are introducing by using third-party advertising libraries,” Veracode researcher Tyler Shields warns in a blog post on mobile app privacy.

Graham Lee, smartphone security boffin at O2's Lab, told El Reg that the latest issue – apps having access to photos – is not new, and is not restricted to apps that have location services enabled. Two years ago Nicolas Seriot warned in a Black Hat talk (PDF) that apps could not only read address book data but could potentially also modify the address book without user intervention.

Lee argued that greater transparency between app developers and users is needed. This would be better in the long run, he explained, than the knee-jerk reaction of confronting users with dialogue boxes (which they're likely to blindly agree to) whenever private data is accessed.

Devs are damned if they do, damned if they don't

"The situation is something the app development industry needs to address, because it's eroding customer trust and confidence," Lee said. "Developers got into a position where we can't sell an app for more than 99¢ without it appearing overpriced, but if we want to augment this revenue with income based on advertising or other business-to-business models, we need to communicate this transparently and effectively with our customers and give them control over what's being shared.

"Unfortunately, the knee-jerk reaction to any privacy scare is a demand to put the affected data — whether it's the address book, photos or something else — behind access control gated by a modal dialog that smartphone users must agree to. Too many of those dialogs and we're still going to annoy our customers, when every feature on their phone needs to be confirmed to be used by every app they download. This approach puts control in the hands of the user, but suffers from the problem of blind approval.

"Surely it would be better to demonstrate to our customers that we can form an open, trustworthy relationship with them, and that they don't need to review and approve every move we might try to make," he concluded. ®