Facebook Just Opened Up Its Data to Election Researchers

CARLY CASSELLA

11 MAY 2019

After years of denial, Facebook appears to have acknowledged the political power of its platform. The dangers that fake news can pose are immense, and the company knows that it doesn't have all the answers.

It's clear that Facebook needs help, and this spring, ahead of elections in Taiwan and Europe, the company made an unprecedented move. Working through a third-party non-profit, it has opened up its data to 60 outside researchers from 30 academic institutions in 11 different countries.

"We hope this initiative will deepen public understanding of the role social media has on elections and democracy and help Facebook and other companies improve their products and practices," the company announced.

The independent projects, selected by an international body of peer reviewers without input from Facebook, cover topics including hyperlinks in Taiwan, the effects of peer sharing, false news in Chilean elections, hyperpartisan news in Brazil, and the 'shareworthiness' of real and fake news in Europe.

As the Social Science Research Council (SSRC) explains, these grantees will have access to data from public Facebook accounts to research the spread of disinformation and, more specifically, "to deepen our knowledge of how social media platforms were used in elections in Italy, Chile, and Germany and how their use may influence public opinion in Taiwan."

Taiwan is a country where Facebook disinformation is rampant, and experts are worried that it could pose a threat to the nation's democracy in light of its upcoming elections.

But not all of the studies are so specific. Researchers at Ohio State University, for instance, are planning on using the data to figure out broader patterns of fake news shared across different social media networks.

"People can't reliably tell you, 'I usually share stuff I haven't bothered reading in the middle of the night, in spring, on weekends,'" R Kelly Garrett, a member of the team, told The Verge.

"People don't know or have incentives not to tell you the truth."

The Facebook data will allow researchers to track the popularity of news items and other public posts across social media platforms. It will also provide information on Facebook ads related to politics or other issues in the US, UK, Brazil, India, Ukraine, Israel and the EU.

Each of these agreements aims to ensure researchers can independently publish whatever they find, free of Facebook's input, while encroaching as little as possible on the privacy of Facebook users.

That second part, however, has proved to be the bigger challenge. Both Facebook and the SSRC say that the researchers' usage will be logged and limited, with private data shielded behind several layers of safeguards.

In the end, however, the founder of SSRC says it's all a bit of a balancing act between protecting user information and protecting society from fake news.

Already, Facebook has taken down over two billion fake accounts, but like a game of whack-a-mole, new ones just keep popping up, manipulating voters and spreading hate speech and misinformation.

"Elections in India are already underway, the European Parliamentary elections will take place in short order, and the US presidential primary campaigns have begun in earnest. Concerns about disinformation, polarisation, political advertising, and the role of platforms in the information ecosystem have not diminished.

"If anything, they have heightened."

The effort is no doubt important, but some of Facebook's partners have proved a bit questionable. The organisation overseeing this particular project is Social Science One, and its founder has previously run afoul of Facebook over data harvesting.

What's more, last week Facebook defended its partnership with Check Your Fact, an organisation that is known for its ties to white nationalists and is also funded by the Koch brothers.

It's admirable that the largest social media platform is taking action on fake news and attempting some level of transparency. But if Facebook continues to align itself with politically motivated organisations and funders, the company's conclusions might not be trusted by the general public or by policy makers.