Criticism of Facebook has taken various forms. One of the key complaints is that its news algorithms create an echo chamber, amplifying users' already-present confirmation bias. There are also concerns that fake news is a serious problem on the platform.

Writing in a post on his own Facebook page over the weekend, Mark Zuckerberg said: "After the election, many people are asking whether fake news contributed to the result, and what our responsibility is to prevent fake news from spreading. These are very important questions and I care deeply about getting them right".

With so much of Facebook's money coming from advertising, accusations of promoting fake news -- or, indeed, influencing the outcome of the election -- could have serious implications. As such, it is hardly surprising to find that Zuckerberg goes on to say:

Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.

That said, we don't want any hoaxes on Facebook. Our goal is to show people the content they will find most meaningful, and people want accurate news. We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further.

Zuckerberg recognizes that "identifying the 'truth' is complicated", and he commits to keeping users updated about how changes to the operation of News Feed will affect them.

While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.

As we continue our research, we are committed to always updating you on how News Feed evolves. We hope to have more to share soon, although this work often takes longer than we'd like in order to confirm changes we make won't introduce unintended side effects or bias into the system.