It's an issue that has Twitter scrambling to appease concerns. And it's raising questions about how Apple, the only place where smartphone users can download the app, will respond after recently banning other apps that provide access to sexual content.

The issue gained attention Monday when Vine users noticed a video of what was described as hard-core pornography showcased in the prominent Editor's Picks section of the mobile app.

Users took to the comment section of that video to complain. Twitter apologized Monday, saying it was a mistake.

"A human error resulted in a video with adult content becoming one of the videos in Editor's Picks, and upon realizing this mistake we removed the video immediately," the company said in a statement sent to CNN. "We apologize to our users for the error."

Released Thursday, Vine is a Twitter-owned app that lets users create and share videos lasting up to six seconds. As with photo-sharing app Instagram, Vine users can follow other people, whose posted videos show up in a feed on their phones or can be shared on Twitter and Facebook. The app is available only for the iPhone, iPad and iPod touch.

Vine's Apple-only status raises interesting questions. Just last week, Apple banned image-sharing app 500px from its App Store because it could give users access to sexual content. The 500px app features artistically rendered nudes among lots of other photos, but so do other apps with user-generated content, such as Tumblr, which remain available on Apple's iOS mobile system.

The Vine app is rated 12+ on iTunes for "infrequent/mild sexual content or nudity," among other reasons, meaning it's deemed appropriate for users 12 or older. As of Monday, it was the fifth most popular free app in the App Store.

Apple did not immediately reply to a message seeking comment for this story. Vine had been listed on Apple's own Editor's Choice list as recently as Monday morning but appeared to have been removed by Monday afternoon.

Apple observers Monday were noting the strange position the company finds itself in. Apple has famously kept a tight rein on what appears in its App Store and on its mobile operating system in general. The late CEO Steve Jobs argued that control on the front end delivers a user experience free from porn, spam and other digital unpleasantness.

But as the Web grows and evolves, it's becoming tougher to draw the blurry line between apps that promote adult content (and which Apple turns down or bans as a matter of course) and those that simply provide access to such fare.

What, for example, is the distinction between the banned 500px app and the app for Tumblr, the blogging platform which, among its millions of blogs, includes many that host explicit sexual content?

"From the start, Apple has said they'd get the App Store wrong, and come across things they didn't anticipate, but that they'd learn and grow," said Rene Ritchie, editor of Apple-centric blog iMore, in a post Monday. "This particular problem has been around for years, but as social sharing has become easier, it's come to the surface again."

Interestingly, it was just a little less than a year ago that Apple banned Viddy, a video-sharing app that has been compared to Vine, because it gave access to user-generated adult videos. The Viddy app was eventually returned to the App Store.

On Twitter's end, the anything-goes aspect of Vine jibes with the site's overall philosophy. Compared to Facebook, which believes social sharing is best when tied to a user's true identity and real-world networks, Twitter allows its users to register under fake names and has fought governments and law-enforcement agencies seeking user information.

As such, Twitter has taken a more hands-off approach on adult content. It's not hard to hunt down hashtags its users are employing to share adult content on a daily basis. (#TwitterAfterDark becomes a trending topic on the site nearly every day -- clicker beware).

By contrast, searching for several suggestive hashtags (such as #naked or #porn) on Instagram, the popular photo-sharing app bought by Facebook last year, renders no results. Because Instagram is an Apple-like closed environment, sexually explicit images are difficult, if not impossible, to find there.

By Monday afternoon, hours after Vine had apologized and deleted the porn video from its Editor's Picks, what uproar there was online was subsiding.

As some observers noted, Vine isn't experiencing anything new in the tech world.

"As virtually every new video or photo-sharing service has shown us since the dawn of the Internet, from Flickr to ChatRoulette, it's very difficult to keep these sites or apps G-rated. So the companies either learn how to police it well, like Flickr does, or they wither and die, as ChatRoulette did," wrote The Atlantic Wire's Adam Clark Estes in a post titled "Vine has a porn problem because, of course it does."

In a statement to CNN, Twitter noted what had already become apparent on the app -- that users can report videos they deem inappropriate.

"Videos that have been reported as inappropriate have a warning message that a viewer must click through before viewing the video," a spokesperson said in the statement. Reported videos that are determined to violate Vine guidelines will be removed from the site and the user account that posted them may be terminated, according to the statement.

Vine's terms of service ban illegal activity, harassment or abuse, and behavior such as impersonating another user or violating trademark and copyright.