Google hit with FTC complaint over 'inappropriate' kids apps


The Federal Trade Commission is being asked to investigate how apps that may violate federal privacy laws governing the data that can be collected on children ended up in the family section of the Google Play store.

A group of 22 consumer advocates, led by the Institute for Public Representation at Georgetown University Law School, filed a formal complaint against Google on Wednesday and asked the Federal Trade Commission to investigate whether the company misled parents by promoting children’s apps that may violate the Children’s Online Privacy Protection Act (COPPA) and Google’s own policies.

“The business model for the Play Store’s Family section benefits advertisers, developers and Google at the expense of children and parents,” Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, said in a statement. “Google puts its seal of approval on apps that break the law, manipulate kids into watching ads and making purchases.”

Among the examples cited in the complaint are a “Preschool Education Center” app and a “Top 28 Nursery Rhymes and Song” app that access location data, according to an analysis by the privacy research collective AppCensus. Other apps, including "Baby Panda's Carnival" and "Design It Girl - Fashion Salon," were among those listed that sent device identification data to advertising technology companies, allowing them to build a profile of the user.

The complaint also spotlights several apps that may not be age appropriate, including “Dentist Game for Kids,” which lets the player give the virtual patient shots in the back of their throat. Another game, “Doctor X & the Urban Heroes,” requires players to cut clothing off of a patient.

A number of apps were also spotlighted based on parent reviews complaining about excessive in-app purchases.

A Google spokesperson said the company takes “these issues very seriously and continues to work hard to remove any content that is inappropriately aimed at children from our platform.”

“Parents want their children to be safe online and we work hard to protect them. Apps in our Designed for Families program have to comply with strict policies on content, privacy, and advertising, and we take action on any policy violations that we find,” a Google spokesperson said in a statement.

Later on Wednesday, Sens. Edward Markey, D-Mass., Richard Blumenthal, D-Conn., and Tom Udall, D-N.M., sent a letter to the FTC calling for an investigation into "whether Google is abiding by its commitment that the Family section of its app store only include advertising and content that is appropriate for children."

Google marks apps that are suitable for children with a star and the recommended age group. Google said it removed thousands of apps this year from its family program after it found policy violations. In addition, Google said one-third of applicants to the program were rejected in 2018.

The complaint is just the latest scrutiny of the Google Play store. Earlier this year, researchers analyzed 6,000 free children’s Android apps and found that more than half shared details with outside companies in ways that could violate COPPA. A study from the University of Michigan looked at 135 apps marketed by Google to children under the age of 5 and found that 95 percent of the apps had some kind of advertising. Additionally, more than half had pop-up ads that were difficult for a young child to close, according to the study.

The FTC has a history of taking action against app makers found to violate COPPA. TinyCo, a company that makes gaming apps including Tiny Pets, Tiny Zoo, Tiny Monsters, Tiny Village and Mermaid Resort, was fined $300,000 in 2014 and ordered to delete any information it had collected from children under the age of 13. The apps had offered extra in-game currency if users shared their email addresses, but there was no option for parental consent, according to the FTC.

Google removed an app based on the show “Blaze and the Monster Machines” in January after a sinister recording in the app of a voice threatening children with a knife went viral, prompting parents in the U.K. to complain.

Alyssa Newcomb

Alyssa Newcomb is an NBC News contributor who writes about business and technology.