Facebook apologises for ‘emotionally manipulative’ study

But the author behind controversial research into emotional contagion maintains the effect on users was ‘minimal’

One of the authors of a controversial Facebook study into emotional states has apologised for the experiment, saying the team of researchers are "very sorry" for any anxiety caused by the study.

Facebook manipulated the news feeds of 689,003 users in January 2012 to show disproportionately negative or positive posts, in a bid to analyse how emotions can be transmitted on the social network. The researchers wanted to know whether users would respond to increased exposure to negative posts with negativity of their own, or whether they would engage with the platform less. The psychological experiment was conducted without the knowledge of the hundreds of thousands of subjects involved.

“My co-authors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety,” wrote Adam Kramer, one of the three authors, in a Facebook post (obviously).

The experiment would not have gone ahead without Facebook's approval, and the study has sparked outrage among those who believe it is just one more example of the social network playing fast and loose with privacy rights.

Let's call the Facebook experiment what it is: a symptom of a much wider failure to think about ethics, power and consent on platforms.

Kramer still stands by his study, though. "The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," he explained. "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out."

The study found that exposure to positive and negative posts made people feel correspondingly more positive or negative through a process known as "emotional contagion", although Kramer is keen to emphasise that the actual knock-on effect on users' news feeds was "minimal".

Legally speaking, Facebook is in the clear. You give Facebook the ability to use your data for research purposes as soon as you click "agree" on its terms and conditions. And if you're angry about it, well – maybe it's best not to post on Facebook.