Apple Pulls Photo Apps From App Store Because It Doesn’t Want You Seeing Any Nudie Pics

While what you do with your smartphone in the privacy of your own home is totally your business, Apple has a strict policy against pornographic images — or really, any nude photos — being searchable in applications made for iOS. As such, it has issued a smackdown against two popular photo-sharing apps from Canadian company 500px, pulling them from the App Store and citing nudie shenanigans.

The two apps combined have been downloaded just about a million times, reports TechCrunch, so their disappearance from the store is likely ticking off plenty of users. It all started when an App Store reviewer raised concerns about the most recent version of 500px.

Apparently the reviewer said the update should be rejected because it was too easy for users to search for nude photos with the app. The company concedes that may be technically true, but says it had made doing so really difficult: in order to see anything naughty, users had to go to 500px's desktop website and turn off a "safe search" mode.

It isn't that the company approves of pornography; it acknowledges there are some nudes that kids shouldn't just stumble upon while using the app. And to be clear, says 500px, pornography itself isn't allowed on the site at all, but naked people aren't by nature pornographic. Nudes can be fine art, after all.

“Some people are mature enough to see these photos,” the company’s Chief Operating Officer tells TechCrunch, “but by default it’s safe.”

For its part, Apple explained the decision to pull the apps thusly:

The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app.

In order to appease Apple, 500px told the company it would change the apps, a process it said would take about a day. Apparently those changes didn't happen fast enough, and the apps were yanked yesterday. The changes are now underway.

Some in the app community are a bit surprised by Apple's readiness to pull an app simply because nude photos might show up in it. Is it really any different from an iPhone user hopping on the Web and searching for "naked people"?