
Rating

Overall: 8
Importance: 9
Innovation: 8
Style: 8

Recommendation

From its inception, Facebook has championed conversation and community. But the use of Facebook's advertising tools to flood Americans with disinformation and hate speech in 2015–2016 was a wake-up call for the company. In this Aspen Ideas Festival exchange, Facebook chief product officer Christopher Cox sits down with Wired editor in chief Nicholas Thompson to discuss how Facebook will prevent future abuses. getAbstract suggests this talk to Facebook users and anyone concerned about social media hacking.

In this summary, you will learn

How Facebook changed to combat spammers and fake news, and

How transparent protocols invite an open discussion about Facebook’s policies.

About the Speaker

Christopher Cox is chief product officer of Facebook and its subsidiaries Instagram, WhatsApp and Messenger.

Summary

Facebook once saw its platform and users as a “force for good.” However, the gaming of the platform to subvert the 2016 US presidential election challenged this view. Since then, the company has become better at recognizing disinformation and spam. In the trade-off between privacy and ease of use, Facebook's protocols have shifted toward privacy, and in a trade-off for safer communities, the platform has tightened limits on free speech. Facebook now employs hundreds of people to help protect elections in more than 40 countries. These staffers work with election commissions and other experts; in 2018, they took down 10,000 pages of disinformation ahead of the Mexican presidential election. Thus, though “bad actors” with access to artificial intelligence (AI) keep targeting Facebook, the platform maintains a safe lead.