Facebook to Hire 3,000 People to Monitor Live Videos of Crime and Suicide

Facebook says it will hire another 3,000 people to review videos of crime and other questionable content following murders shown live on its site. Scott McGrew reports.

(Published Wednesday, May 3, 2017)

Facebook is turning to artificial intelligence to detect if someone might be contemplating suicide.

Facebook already has mechanisms for flagging posts from people thinking about harming themselves. The new feature is intended to detect such posts before anyone reports them.

The service will scan posts and live video using a technique called "pattern recognition." For example, comments from friends such as "Are you OK?" can signal that someone is having suicidal thoughts.
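Facebook has not published how its detection works; as a rough illustration of the idea of pattern recognition over comments, the toy sketch below flags a post when enough concerned replies from friends match known warning phrases. The phrases, threshold, and function names are all hypothetical.

```python
# Illustrative sketch only -- not Facebook's actual system. Flags a post when
# concerned comments from friends match hypothetical warning phrases.
import re

# Hypothetical warning phrases; the article cites "Are you OK?" as an example.
WARNING_PATTERNS = [
    re.compile(r"\bare you ok\b", re.IGNORECASE),
    re.compile(r"\bplease talk to me\b", re.IGNORECASE),
    re.compile(r"\bcan i help\b", re.IGNORECASE),
]

def looks_concerning(comments):
    """Return True if two or more comments match a warning pattern."""
    hits = sum(
        1 for comment in comments
        if any(p.search(comment) for p in WARNING_PATTERNS)
    )
    return hits >= 2  # threshold chosen arbitrarily for this sketch

comments = ["Are you OK??", "saw your post, please talk to me", "nice pic"]
print(looks_concerning(comments))  # True: two comments match
```

A real system would use a trained classifier over post text, comments, and video signals rather than fixed phrases, but the flag-on-matching-patterns shape is the same.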

Facebook has already been testing the feature in the U.S. and is making it available in most other countries. The European Union is excluded, though; Facebook won't say why.


The company is also using AI to prioritize the order in which flagged posts are sent to its human moderators, so the most urgent cases are reviewed first and local authorities can be alerted quickly.
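One simple way to realize that kind of prioritization is a priority queue keyed on a risk score: moderators always pull the highest-risk flagged post next. The sketch below assumes a risk-scoring model already exists; the scores and post IDs are invented for illustration.

```python
# Illustrative sketch only: a moderation queue that surfaces the flagged
# posts an (assumed) risk model rates most urgent. Scores/IDs are made up.
import heapq

queue = []  # min-heap; scores are negated so the highest risk pops first

def flag(post_id, risk_score):
    """Enqueue a flagged post with its model-assigned risk score."""
    heapq.heappush(queue, (-risk_score, post_id))

def next_for_review():
    """Return the most urgent flagged post and its risk score."""
    neg_risk, post_id = heapq.heappop(queue)
    return post_id, -neg_risk

flag("post-a", 0.20)
flag("post-b", 0.95)  # most urgent
flag("post-c", 0.55)

print(next_for_review())  # ('post-b', 0.95)
```

The design choice here is that ordering by score, rather than by arrival time, is what lets a small team of human reviewers reach the riskiest posts first.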