Study on Health-and-Fitness Apps Could Spur Privacy Action

In the study, the FTC found that 12 mobile health-and-fitness apps sent users’ personal information to 76 third parties.

Jah-Juin Ho, an attorney in the FTC’s mobile technology unit, said these third parties get some technical information about the users’ phones but also metrics and characteristics about their bodies.

All of the third parties received basic phone information: the device’s screen size, model and language setting. And 18 of the 76 collected persistent identifiers: the phone’s unique device identifier, its media access control (MAC) address and its international mobile station equipment identity (IMEI).

But others culled detailed consumer information: running routes, eating habits, sleeping patterns and even walking or running cadence. In all, 22 of the 76 third parties gathered data on users’ exercise, meal and diet information, symptoms, gender, geolocation and ZIP codes.

Four apps sent data to one specific ad company without anonymizing the information. “It wasn’t uncommon for third parties to identify users by their first name, last initial and then a stream of identifiers,” Ho said.

Ho didn’t name any of the apps, but the study examined two daily activity apps connected to wearables, two exercise apps, two dietary and meal apps and three symptom-checker apps. “We were as permissive as possible, meaning that if an app asked us for permission to access a certain feature or to sync with another app, we always accepted and opted in,” Ho said.

The Commission’s chief technologist, Latanya Sweeney, said the agency is concerned consumers could be penalized based on health data; for instance, a financial institution might adjust someone’s credit rating based on the fact that the person has a disease, she suggested.

The FTC doesn’t have any major health data privacy initiatives in the works, according to a spokesman, but the agency is adamant about protecting consumers from having their health, medical and fitness data shared without their knowledge and used to determine things like insurance rates or drug pricing. A U.S. Senate bill introduced earlier this year was prefaced by a December 2013 Senate Commerce Committee report showing how data firms compile sensitive health and other personal data.

“As we accrue this data and collate it and use it, it is going to be harder and harder to draw that line of what’s health [data] and what isn’t,” said Joy Pritts, chief privacy officer for the Office of the National Coordinator for Health Information Technology at the Department of Health and Human Services. “I think people’s spending patterns, for example, would never occur to you to be health data, yet that model may be used at some point to treat you and then it does become your health information, doesn’t it?”

Joseph Lorenzo Hall, the chief technologist for the Center for Democracy & Technology, said apps could be putting users’ physical safety at risk.

“If you’re talking about running routes and things like that, you may be able to predict where someone is alone and when they’re not at home and that can be extremely sensitive given your own personal context,” Hall said.