The app now includes a "report" button to flag questionable content and is now rated for ages 17 and up. That age rating means the app may contain "Frequent/Intense Sexual Content or Nudity," according to the store.

"We take the content that appears on our site very seriously," 500px COO Evgeny Tchebotarev wrote in a blog post today. "If we find content that is in violation of our terms of use, we remove it immediately from our system and block the user's account. Our policies are clearly defined in our terms of service."

500px was pulled initially, according to Apple, because the app store's guidelines include rules against featuring pornographic images and material. Additionally, Apple said it received complaints about child pornography.

Tchebotarev told The Verge that the company is investigating Apple's child porn allegations but has yet to receive the alleged pornographic images or complaints from Apple. 500px didn't find any porn during its internal investigation, Tchebotarev said. He said Apple asked for the "report" button, as well as a tweak to the app's search function that makes it harder to search suggestive keywords. The search restriction applies only to default searches; users who check off the "adult content" box should not be affected.

Though Apple enforced its guidelines in this instance, it seems to have let Twitter's new video-sharing app Vine slide. Porn briefly and accidentally showed up as an "Editor's Pick" on Vine yesterday, after which Apple quietly pulled the app from its "Editor's Choice" list.