“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” Zuckerberg wrote.

He continued, “If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.”

Facebook has recently been criticized for not doing enough to prevent violent videos — such as a murder in Cleveland and the killing of a baby in Thailand — from spreading on its service.

“Criminals are taking to Facebook and that’s not counting the horrific images of teens committing suicide and broadcasting it live,” said Rob D’Ovidio, a cybersecurity expert and associate professor of criminology and justice studies at Drexel University.

Videos and posts that glorify violence are against Facebook’s rules. But in most cases, they are reviewed — and possibly removed — only if users report them.

“It’s not gonna be a one and done deal. These 3,000 people aren’t gonna solve the problem. It’s gonna help them get through that backlog of complaints,” D’Ovidio said. “It’s never gonna be quick enough, especially with Facebook Live.”

News reports and posts that condemn violence, however, are allowed — making for a tricky balancing act for the company.