Facebook leans towards freedom of expression, preferring to demote viral posts containing false information rather than remove them from the platform entirely.

But under the new strategy, Facebook will work with selected external organisations to determine whether posts contain inaccurate information that may lead to violence. If a post meets both criteria, Facebook will take it down, The Wall Street Journal said.

The Journal said the strategy will initially be introduced in Sri Lanka and Myanmar — two countries where Facebook has been criticised for contributing to social unrest.

Swisher asked Zuckerberg if he felt responsible for the deaths, to which he replied: "My emotion is feeling a deep sense of responsibility to try to fix the problem."

"I wanna make sure that our products are used for good. At the end of the day, other people blaming us or not is actually not the thing that matters to me," the CEO explained.

"What matters to me is how are people using our services, and are we acting as the force for good that I know we can and have a responsibility to [be]. It's not that every single thing that happens on Facebook is gonna be good. This is humanity.

"People use tools for good and bad, but I think that we have a clear responsibility to make sure that the good is amplified and to do everything we can to mitigate the bad."

Zuckerberg said Facebook has already made progress in Myanmar, where it has "significantly ramped up the investment in people who speak Burmese." He said this is helping the company figure out who is promoting hate speech and what kind of content will incite violence.