Facebook is hiring monitors — 3,000 of them — to watch over live video content and remove videos that show violent acts such as murder or suicide, CEO Mark Zuckerberg announced Wednesday.

The move comes as Facebook faces mounting criticism for failing to catch and remove video of the murder of an elderly Cleveland man posted to Facebook, along with other disturbing incidents.

Facebook hopes the monitoring could eventually be done with computers, but for now, Zuckerberg said in a Facebook post that he wants to “build a safe community,” remove violent content quickly, and help those who might be planning to harm themselves.

Zuckerberg also said Facebook will make it easier to report videos and posts of concern. He noted that last week, Facebook was able to help law enforcement intervene in a possible suicide and prevent the person from hurting himself, but that in other cases the company has not been as fortunate.

The social media site, which has nearly 2 billion users, relies on user reports to determine whether content needs to be removed. Currently, 4,500 monitors respond to millions of reports each week about content that may violate its terms of service.

Some Twitter users expressed concern about the mental health of the new monitors and hoped AI would be able to take over the job fairly soon.