Must unsee TV

Facebook’s auto-playing videos in an ISIS era [Updated]

Nothing heart-touching about this video. Facebook users were caught by surprise after their friends (presumably now former friends) shared it, and its appearance in news feeds meant a massacre of children was auto-played.



A few months ago, Facebook changed its default settings to enable auto-play of video content in the social network's news feed, whether users accessed the site through a desktop browser or the mobile app. While the mobile default comes with an "only on Wi-Fi" asterisk, the change has swept through millions of news feeds, perhaps as a way to ease users into Facebook's video advertising initiative.

Now, users are calling that default video-play toggle into question thanks to a rise in disturbing content distributed via social media. Should an ISIS beheading or similarly disturbing content find its way into the news feed of a user who hasn't opted out of the site's auto-play feature (a process possibly more complicated than it needs to be), that user is in for a rude awakening.

It's tough to catalog exactly how many gore-filled videos have been successfully circulated via Facebook without the site intervening or taking them down. Publicly, Facebook representatives have argued that such content isn't subject to removal. And as an example of video auto-play gone wrong, Ars readers directed us to a gory video posted to Facebook that had yet to receive any form of takedown in over a week. Its opening moment features the mass execution of children, all shot by a machine gun, and we chose not to watch the entire video (nor link to it) to see how much worse it got.

If someone links to such a disturbing video on their own Facebook feed, and the site decides the story is important enough to appear in your personal news feed, the video will begin playing as soon as its box enters your browser window. More crucially, opting out of auto-play on one platform doesn't toggle the setting across the board; you still need to pick through each Facebook app's separate settings to disable it wherever you browse. That means some users who think they're in the auto-play clear may still be subject to an unexpected shock. (Don't fret: one helpful user posted a full list of ways to opt out of the feature across the board.)

The video we saw was an extreme example (most likely, if you're friends with someone who would share a beheading or massacre video, you occasionally expect that sort of thing), but on a broader scale, Facebook doesn't allow users to mark videos with tags that warn of sexual, violent, or other sensitive content. So far, the site hasn't indicated that such an option will ever be made available.

We have reached out to Facebook with questions about its auto-play feature and whether a rise in sensitive video content may result in changes to how the feature works or how it's explained to users. We will update this post with any response. We also "reported" the video mentioned above with a note about its graphic violence, to see how the site would respond, and within an hour, Facebook's moderation team denied our report, saying, "We reviewed the video you reported for containing graphic violence and found it doesn't violate our Community Standards."

Update, 11/25/14 at 11:45 a.m. CST: Facebook has since added a splash warning to the video we referenced in this article, advising users that "this video contains graphic content and may be upsetting to some people," even though we never received a response to our original content "report" that indicated Facebook mods would take such an action. These splash warnings appear to be relatively new to the service; we were only able to find one other instance of such a splash blocking a video from being auto-played on a Facebook news feed, also from November of this year.

The company officially responded to Ars's questions with the following statement: "We ask that people warn their audience about what they are about to see if it includes graphic violence. In instances when people report graphic content to us that should include warnings or is not appropriate for people under the age of 18, we may add a warning for adults and prevent young people from viewing the content." However, Facebook still doesn't allow video uploaders to flag their own content in that manner. Currently, those users must wait for Facebook to respond to reports before such an auto-play block is instituted for an individual video.