Child Porn Is Apple’s Latest iPhone Headache (Updated)

A photo ostensibly showing a 15-year-old nude girl has turned up in an iPhone app, highlighting Apple’s inability to safeguard its application store from prohibited content.

The image appears in the free app BeautyMeter, which enables people to upload photos that are then rated by others, who assign a star-rating to members’ body parts and clothing. It’s much like an iPhone version of Hot or Not and many similar sites.

The photo to the right (censored by Wired.com) shows a nude girl snapping a picture of her reflection in a mirror. In the screenshot, the girl, who is listed as a 15-year-old from the United States, is topless and partially nude below the waist. Nearly 5,000 users of the app have rated the photo, which was discovered by iPhone app review site Krapps.

The appearance of nudity in BeautyMeter underscores Apple’s difficulties regulating content in its App Store, which has surpassed 50,000 pieces of software available for download. Last week, for example, Wired.com reported on an app called Hottest Girls, whose developer released an update that added topless photos of women. Apple pulled the app hours later, saying porn is not allowed.

“Apple will not distribute applications that contain inappropriate content, such as pornography,” an Apple spokesman said regarding Hottest Girls on June 25. “The developer of this application added inappropriate content directly from their server after the application had been approved and distributed, and after the developer had subsequently been asked to remove some offensive content. This was a direct violation of the terms of the iPhone Developer Program. The application is no longer available on the App Store.”

Apple made no similar announcement regarding BeautyMeter; it simply disappeared from the App Store. But people who already downloaded the app can, in theory, continue to use it, including its upload and rating features.

On its website, BeautyMeter’s developer, Funnymals, says members of BeautyMeter are required to provide their iPhone device ID so illegal content can be traced back to the owner of that phone.

“We don’t review each uploaded photo exclusively but from time to time we will clean up,” Funnymals stated in BeautyMeter’s terms and conditions.

As of 1:30 p.m. PDT Wednesday the image of the purported 15-year-old was still in the app.

Funnymals and Apple did not immediately respond to requests for comment. Wired.com has not confirmed the photographed girl’s identity or her age.

Although U.S. federal and state laws prohibit child pornography, Funnymals and Apple will probably not be held liable for the content, because they would be protected by the Communications Decency Act, according to Mark Rasch, a lawyer and founder of computer security consulting firm Secure IT Experts. When Apple approved the app, it did not contain the prohibited content; instead, the app downloads images off the internet, placing responsibility on the people who use it.

However, Rasch said he expects Apple to remove the application, or the developer to remove the content, once made aware of it.

“They probably don’t have liability unless they have actual knowledge, in which case they have at least a legal or moral duty to act,” Rasch said.
