Facebook’s News Feed manipulation touches off firestorm

Facebook Inc. gave social media skeptics cause for alarm this weekend after it came to light that the Internet company manipulated users’ News Feeds as part of a social science experiment.

The popular social network provider tweaked the News Feeds of nearly 700,000 users as part of a joint study with the University of California, San Francisco and Cornell University. The research found that when shown more negative posts, people tended to write more negative posts themselves, and vice versa: one sign that “emotional contagion” can take place without people being aware of it.

Mark Zuckerberg, founder of Facebook (Bloomberg News)

While Facebook and the co-authors published the research last fall, a report in the New Scientist appears to have brought the News Feed manipulation to wider attention.

For the most part, no one seems to be saying the research ran afoul of the law. After all, Facebook users give the company permission for data research and testing, and the study passed a review board designed to screen for such issues. But the revelation tapped into a growing fear about what kinds of data social media companies hold about their users, how those data are being used, and whether any ethical parameters are in place to protect the little guy.

James Grimmelmann, professor of law at the University of Maryland, wrote on his blog, The Laboratorium:

“This study is a scandal because it brought Facebook’s troubling practices into a realm — academia — where we still have standards of treating people with dignity and serving the common good. The sunlight of academic practices throws into sharper relief Facebook’s utter unconcern for its users and for society.”

Another issue: should study subjects have been notified? Susan Fiske, professor of psychology at Princeton University, who edited the study, told The Atlantic that she was concerned about the methodology when she first saw it, but was told that a local institutional review board had green-lighted the study on the grounds that Facebook News Feeds are routinely manipulated anyway.

Nonetheless, she said: “People are supposed to be, under most circumstances, told that they’re going to be participants in research and then agree to it and have the option not to agree to it without penalty.”

Facebook, for its part, defended its research tactics. In a statement provided to publications including Business Insider, the company said:

“We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

And there was no shortage of people who sided with Facebook’s ethical decision on this particular issue. Tal Yarkoni, a research associate in the Department of Psychology at the University of Texas at Austin, wrote that “the suggestion that Facebook ‘manipulated users’ emotions’ is quite misleading.” One of his justifications:

“Framing it that way tacitly implies that Facebook must have done something specifically designed to induce a different emotional experience in its users. In reality, for users assigned to the experimental condition, Facebook simply removed a variable proportion of status messages that were automatically detected as containing positive or negative emotional words.”

Marc Andreessen, the venture capitalist with an irrepressible Twitter presence, had this to say:


About The Tell

The Tell is MarketWatch’s fast and engaging look at trends and themes in the day’s markets. Drawing on our reporters, analysts and commentators around the world, as well as selecting the best of the rest online, The Tell is all about the pulse of the markets through news, insight and strategic information to help you make the best investing decisions. Got a tip? Tell us at TheTell@MarketWatch.com