In Defense of Facebook’s News Feed Experiment on Users’ Emotions

I’ve read a lot of articles this past week reacting to Facebook’s News Feed experiment, in which emotionally charged content was shown to users to measure their reactions. Unlike most people, I am not bothered by this experiment. I’ll explain why.

The synopsis of this controversy is that Facebook collaborated with Cornell University to display status updates demonstrating positive and negative emotions and measured users’ responses and behavior. For instance, did they interact with the author of the update? Did they increase or decrease their use of Facebook? Did the users who read emotionally charged updates take on similar behaviors themselves? The study demonstrates how we internalize and react to content shared on social media. The full research paper is embedded at the bottom of this post.

This social experiment was met with harsh reactions, covering everything from the morality of the experiment to whether it was legally permitted under Facebook’s privacy policy. The dissertations slamming this experiment were lengthy, passionate, and demonstrated a sincere sense of shock.

I didn’t really care. It’s not that I don’t care about privacy or censorship (I do); I simply don’t see this experiment as anything more sinister than a high school science experiment. There wasn’t any political or commercial intent behind it (beyond perhaps increasing Facebook MAUs). It was intended to measure the actual impact of people expressing emotion online.

There has been less of an outcry about Facebook’s move to enable retargeting across its platform based on the websites you visit. That carries serious privacy risks, and retargeting can also be used maliciously, against users’ wishes. Sure, sure, it will empower Facebook to deliver more relevant ads… but is that something users need now? More advertising? I’ll save this discussion for a future post.

In general, I don’t find fault in a publisher’s right to conduct split tests or multivariate tests to produce a better outcome for the reader. You know who does this? Online publishers, company websites, retailers and hundreds of thousands of marketers for their websites. If a headline, color scheme, copy, or image evokes an emotional response from me, it persuades me to take action. If we are to crucify Facebook for conducting this experiment, we should also call into question the merits of split testing in marketing.

Do I want to be a specimen in their experiments? No, not without their gaining my explicit consent. But since these are Facebook’s services and resources, the company can conduct experiments if it wishes. In a number of these critiques of the experiment, there’s a strong sense of entitlement, as if Facebook were a government-funded utility that owes its mission and purpose to the people. It’s a company. It is allowed to conduct research and development if doing so brings shareholder value. And whether you agree or not, this scientific research demonstrates thought leadership, and thus increases the equity of Facebook as one of the largest data warehouses known to date.

I’m under the impression that the Facebook PR and News Feed teams have read many of the concerns expressed by reporters about why users could feel deceived. I expect the company will cut back on science experiments on its users, or at least inform users about them. That said, I am a bit disappointed that Facebook doesn’t appear to show any interest in discussing the lessons learned and interpretations of the data on its Newsroom blog. Its cryptic replies to reporters only call the company’s motives behind the experiment further into question.

And if you are so opposed to this use of your data, then there is only one solution I can suggest: delete your account. I’m almost positive most users won’t do it, though. Or, you can be the curmudgeon among your friends and spread negativity, as the study suggests. 😉