Vine blocking NSFW content, hopes Apple doesn’t find its porn stash

It seems that Vine is determined to pretend that the Web wasn’t made for porn. After bringing the Internet practically to its knees with news that one of its editors liked a six-second pornographic video, the young platform is now doing all it can to block searches for NSFW content.

Our collective panic comes only four days after Vine’s illustrious debut in the App Store, as many wondered if this was the next big thing. This morning, however, things took a turn when Vine users were greeted with an explicit video featuring a woman and a sex toy at the top of Vine’s Editor’s Picks. Sacre bleu! Twitter, quick to defend its budding new social platform, blamed “human error” for the video surfacing and quickly deleted it. However, the damage was done, and Vine has since gone into damage-control mode to prevent content tagged with terms like “porn,” “nsfw,” and “nsfwvine” from popping up in any searches.

In truth, no one should have been surprised. Vine’s users were simply abiding by Rule 34: If it exists, there’s probably a porn version of it. However, what’s most upsetting about the slip is that all users were exposed to the video – we’re talking children, too – and that puts the app squarely in violation of Apple’s stringent App Store regulations. Thus, Vine’s developers were forced to self-censor content in a way that smacks of puritanical shaming so they won’t have the app pulled from the store outright.

Apple, for its part, was likely beside itself when the news broke. Though Vine is still very much active in the App Store, it is no longer being promoted in the Editor’s Picks section. Apple’s own policy on pornography is as rigid as it is inconsistent: it only recently booted popular photography app 500px for displaying nudity, yet apps like Tumblr (a hotbed for porn-lovers) and even Snapchat remain alive and well.

We’ll keep you updated as the story unfolds. Or undresses? Either way, stay tuned.