Vine was seemingly staying out of it, having covered its proverbial behind quite thoroughly with terms of service lingo like "sole responsibility of the person who originated such Content" and "are not required to monitor or control the Content," etc., etc.

Well, it looks like the short shots of smut have finally spurred the Twitter-owned app to take action as The Verge is reporting that Vine is now blocking "many searches for pornographic terms."

Searching for #porn, #sex and hashtag-prefixed parts of the human anatomy turns up nothing, at least for the moment.

Still unripe

Vine's pruning attempts aren't entirely successful: searches for #pornvine and #nsfw (not safe for work) still work, as does users' ability to tag Vines with pornographic hashtags. Others can still click on those hashtags and a feed of videos will appear just like before.

The app itself is still extremely fresh, if not a little unripe, as evidenced by the deluge of debauched postings and other missteps. Earlier on Monday, a very NSFW clip was posted as the app's top "Editor's Pick."

Although it didn't play automatically and was hidden behind a filter, its prominent placement likely exposed a good many Vine users to the content.

Twitter later said human error led to the post being pushed to the Picks, and that the video was removed as soon as it was discovered.

Low blows

The hits didn't stop there, though: Business Insider reported that it appears Apple removed Vine from its "Editor's Choice" list in the App Store on iPhone following the incident.

Vine isn't alone in drawing Apple's ire: the photo-sharing app 500px was recently pulled from the App Store after Apple said nude images available in the app fell into the category of porn, though the developer maintained the images were "art" and not distasteful. Apple asked 500px's developers to put safeguards in place, which the company said it has submitted for consideration.

TechRadar asked Apple where it stands on Vine and will update this story if and when the company responds.