Can Mark Zuckerberg Save the World from Itself?

The year 2016, as Mark Zuckerberg tells Farhad Manjoo in a new feature in The New York Times Magazine, was “interesting for us.” Facebook, which had long downplayed its role as a media company, had come under intense criticism for not policing fake or otherwise misleading news stories on its platform, which many blamed for helping to elect Donald Trump. At first, Zuckerberg dismissed the accusation as a “pretty crazy idea.” But as time passed and the company did some soul-searching about its responsibilities as an arbiter of information, Facebook eventually conceded that it had not done enough to help users sort through the torrent of lies and half-truths that were populating their News Feeds. Something, Zuckerberg decided, needed to change.

The internal debate at Facebook predated the election. Last May, after Gizmodo reported that “news curators” had suppressed links to conservative Web sites in Facebook’s Trending section, Facebook replaced its news team with an algorithm, ostensibly removing human biases from the equation. Months later, Facebook came under fire again when it deleted a Norwegian newspaper’s post of an iconic Vietnam War photo that contained child nudity. But the events of November 8, for many, seemed to crystallize the need for Facebook to adopt a more coherent approach to its editorial process. Shortly after the election, Zuckerberg announced that Facebook would begin taking steps to stop the spread of fake news, including the introduction of fact-checking features. In February, he released a nearly 6,000-word manifesto making it Facebook’s mission not only to connect the world but “to amplify the good effects and mitigate the bad.”

Zuckerberg is clearly somewhat uncomfortable in his new role as editor-in-chief, which is presumably why a Facebook spokesperson invited Manjoo and fellow Times reporter Mike Isaac to Menlo Park shortly before Zuckerberg published his manifesto. As Manjoo recounts, the 32-year-old C.E.O. seemed at pains to emphasize that his newfound activism was not about Trump and that the changes he was making to Facebook were designed to solve a user-experience problem. But he also seemed unwilling to confront, or at least publicly acknowledge, the incredible power of the platform he wields. At some level, fake, misleading, or simply hyper-partisan news stories have proliferated on Facebook because people like reading and sharing them. Whether they are good for users is a separate question—one that places Facebook at the center of a seething political debate.

In some ways, Facebook has been down this road before. When Upworthy began to dominate social media in 2013, filling News Feeds with “clickbait” stories, Facebook grappled with whether human editors should remove them manually. “In surveys, people kept telling Facebook that they hated teasing headlines. But if that was true, why were they clicking on them?” Manjoo writes. “Was there something Facebook’s algorithm was missing, some signal that would show that despite the clicks, clickbait was really sickening users?” Eventually, Facebook hit upon an engineering solution: it looked for signs of users’ dissatisfaction with the stories they’d just read, and, after some tweaks, Upworthy’s stories appeared less and less frequently.

It’s possible that Facebook could do the same with misinformation and fake news stories. Manjoo reports that Adam Mosseri, Facebook’s vice president in charge of News Feed, “has begun testing an algorithm change that would look at whether people share an article after reading it. If few of the people who click on a story decide to share it, that might suggest people feel misled by it and it would get lower billing in the feed.”

On the surface, it’s another simple engineering fix to a user-experience problem. But there are critical differences between clickbait and misleading news stories. As Manjoo notes, “Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data.” But “decades of social-science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.” Addressing that paradox requires fighting both human psychology and market forces. It’s a titanic collective-action problem that is essentially philosophical in nature.

Zuckerberg seems aware he’s stumbling into uncharted territory. When Manjoo asked him whether he talked to Barack Obama about the then-president’s many public critiques of Facebook, which Obama has accused of worsening the “bubble” problem, Zuckerberg “paused for several seconds, nearly to the point of awkwardness, before answering that he had.” Later, Facebook representatives called Manjoo to stress that Zuckerberg had spoken to many people about the manifesto, implying that it wasn’t to be read as a partisan, anti-Trump document. It’s a sore subject in Silicon Valley, where Trump’s election clearly left a deep psychic scar.

If Zuckerberg’s mission to rid Facebook of fake news is secretly a utopian one, it also has a dark side. The reason Facebook has so much influence over what people read, watch, and share is that it is more than just a digital media powerhouse and traffic driver for publishers. With control over 77 percent of all social traffic, Facebook is also a monopoly. And what Zuckerberg is talking about when he muses, elliptically, about solving the misinformation problem is using Facebook’s clout to tell people what to think.

Surely, he sees the issue in a more positive light. Like most Silicon Valley luminaries, Zuckerberg wants to save the world from itself. But Facebook is wandering onto a treacherous path. As long as Zuckerberg can maintain Facebook’s iron grip on digital media, his strategy to control misinformation on the platform he created more than a decade ago could work. But if there’s one thing that’s outside his control, it’s the populist forces unleashed by the Internet. And the tighter Zuckerberg squeezes, the more users may slip through his fingers. Already, a backlash is building on the right against Facebook’s efforts to add fact-checking features. And despite Facebook’s dominance, there are plenty of places for the discontented to go. By trying to bring order to the digital media landscape, Zuckerberg might end up only fragmenting it more.