Facebook Tries to Avoid Another 2016 Election Nightmare

Facebook’s decision to hastily announce a potential disinformation campaign in advance of the midterms is in stark contrast to how Mark Zuckerberg reacted to charges that his platform was used to disseminate disinformation in 2016.

Just hours before polls opened on Tuesday, Facebook made an announcement: thanks to a tip from law enforcement, the tech giant said it had discovered and taken down 30 Facebook accounts and 85 Instagram accounts that “may be engaged in coordinated inauthentic behavior.” Per the announcement, most of the Facebook accounts were in French and Russian, while the Instagram accounts were primarily in English. But aside from those details, Facebook revealed relatively little about the content of the accounts, or their potential origin. In his post announcing the removals, Nathaniel Gleicher, Facebook’s head of cyber-security policy, was transparent about why the company had jumped the gun. “Typically, we would be further along with our analysis before announcing anything publicly,” he wrote. “But given that we are only one day away from important elections in the U.S., we wanted to let people know about the action we’ve taken and the facts as we know them today.”

Facebook’s efforts to be transparent with users ahead of the election stand in stark contrast with its attitude in 2016, when Mark Zuckerberg famously dismissed the notion that his platform helped to influence the outcome of the presidential election as a “pretty crazy idea”—a statement he walked back last year. Of course, since then a wealth of evidence has emerged to challenge Zuckerberg’s skepticism, including evidence that Facebook contributed to a human-rights crisis in Myanmar. (In response to a report published Tuesday that found Facebook had “created an enabling environment for the ongoing endorsement and proliferation of human rights abuse in Myanmar,” a company spokesperson said Facebook “invested heavily in people, technology and partnerships to examine and address the abuse of Facebook in Myanmar,” but that there is still “more to do.”)

The company’s decision to hastily update the public is proof it’s aware of the stakes around the midterms. Indeed, the blog post is just the latest in a series of efforts from Facebook to position itself as more transparent than in years past. In September and October, the company made a grand show—and a well-coordinated P.R. effort—to publicize its election “war room,” a refurbished conference room where a select team of employees focuses on eradicating disinformation, deleting fake accounts, and monitoring false news. “We see this as probably the biggest companywide reorientation since our shift from desktops to mobile phones,” Samidh Chakrabarti, who leads Facebook’s elections and civic-engagement team, told The New York Times, adding that Facebook “has mobilized to make this happen.” The initiative, with its hand-taped door sign, flashy monitors, and dedicated team of eager cyber sleuths, seemed specifically designed to appeal to journalists, and the press bit: the headline of the Times piece, “Inside Facebook’s Election ‘War Room,’” was echoed by The Verge, TechCrunch, NPR, NBC News, and Business Insider. The message was clear: inside Facebook, a team of specialists was working hard to prevent the issues for which Facebook has incurred public wrath.

The “war room” hasn’t been Zuckerberg’s only stab at transparency. The company has repeatedly announced purges of “inauthentic” accounts—late in October, it revealed that it had removed dozens of pages, groups, and accounts linked to Iran that were targeting people in the U.K. and the United States. About 1 million users followed at least one of the accounts, which were active on both Facebook and Instagram. For now, it’s impossible to know whether and how the accounts removed Monday were targeting the midterms, though that hasn’t stopped speculation. “Could be early stage I.R.A.-style operations meant to build audience,” Facebook’s former chief security officer Alex Stamos observed on Twitter. What is clear is that the stakes for Facebook couldn’t be higher, and that Facebook is well aware of the consequences of even the appearance of interference.