After a bit of confusion over why a popular photography app was pulled from the iOS App Store early Tuesday morning, Apple offered up an official explanation: it had no choice but to pull the app because of complaints that child pornography was viewable within it.

“The app was removed from the App Store for featuring pornographic images and material, a clear violation of our guidelines. We also received customer complaints about possible child pornography. We’ve asked the developer to put safeguards in place to prevent pornographic images and material in their app,” an Apple spokesman said in a statement Tuesday evening.

Oleg Gutsol, CEO of 500px, told GigaOM that Apple never mentioned any complaints about child pornography and that the company would not allow it on the app anyway. “We never received any official complaints of child pornography. We don’t allow pornography on the site,” he said. “If it ever happened, we would have reported it to the police immediately.”

The Apple reviewer told the company that the update couldn’t be approved because it allowed users to search for nude photos in the app. That’s true to an extent, but 500px had actually made it difficult to do so, explains 500px co-founder Evgeny Tchebotarev. New users couldn’t just launch the app and locate the nude images, he says, the way you can today on other social photo-sharing services like Instagram or Tumblr, for instance. Instead, the app defaulted to a “safe search” mode in which these types of photos were hidden. To turn off safe search, 500px actually required users to visit its desktop website and make an explicit change.

Gutsol said that his company had only heard from Apple that it was easy to find nude photos with search terms like “nude” and “naked.” 500px made changes to its app and resubmitted it to the App Store, but that update is still awaiting approval, he said.

Apple has been (mostly) clear from the beginning that it doesn’t admit apps to its App Store that promote excessive violence, pornography or attacks on religion. Steve Jobs himself called it Apple’s “moral responsibility to keep porn off the iPhone.”

While there can be different opinions of what constitutes artistic nude photography versus pornography, when children are involved, there’s no wiggle room. If Apple received such a complaint, it had to take the app down.