Facebook let researchers adjust its users' news feeds to manipulate their emotions – and has suggested that such experimentation is routine, which is seemingly how the idea got past the advertising firm's ethics committees.
In 2012, the trick cyclists, led by the company's data scientist Adam Kramer, manipulated which posts from …

Boffins? Boffins?!

Re: Boffins? Boffins?!

One of the most annoying things about this, for me, is that it gives a bad name to social scientists in general. I know there is not a lot of love for social scientists on these forums, but this really doesn't help anyone.

I cannot believe that someone signed this off - informed consent is something that every social science undergraduate has drummed into them from the first day.

What will be interesting to see is whether any of the US regulators step in; there have been suggestions that this breaks federal law in the US: (http://laboratorium.net/archive/2014/06/28/as_flies_to_wanton_boys)

0% creepy, rilly. You get more emotional response manglement whenever a politician or a spokesperson appears on the tube and uses newspeak to whip up a frenzy. Extra buffer stuffing when it's an economist working for our top clown court.

"our goal was never to upset anyone"

Oh, on the contrary: you manipulated people's feeds to show them negative posts in the hope of making them feel negative (as that was the outline of your research), whilst doing the opposite to the other half - therefore you fully intended to upset half of your research 'volunteers'...

Re: "our goal was never to upset anyone"

@Sampler

Yeah - that's the way I read it too.

Studies like this are always problematic as there's no real way to conduct it without keeping people in the dark. When a new drug is being tested it will go through double-blind tests against a placebo. Participants are fully aware that they may be taking a placebo but it works because there is no way to tell if you are in the control group or not.

With an experiment like this, telling people the parameters of the study would expose the whole thing. Sure, it's possible that all your friends are just sad all the time but if you notice that your feed is now a bit less or more upbeat than normal then you're going to have a pretty good idea which group you are in.

So, this research is near impossible to conduct while still getting informed consent. The question has to be asked, then, whether the research is valuable enough to warrant what has happened. I would suggest not. It's important that science not simply settle for 'common sense' answers but I don't think it would be too detrimental to our understanding of emotions if we just assume that people exposed to predominantly negative information take on some of that negativity themselves.

After all, advertising works so it's really not a stretch at all to assume that 'emotional advertisement' works too.

Re: "our goal was never to upset anyone"

It could have been done reasonably well. Throw up a notification asking if people are willing to be part of an experiment on social behavior, which may alter their experience of Facebook for the next week. Explain that providing any more details about the experiment would alter people's behavior and invalidate the results, but provide more information about the study and which group people were in after it's over. Not complete information, but enough for reasonably informed consent, and far better than how they provided no information and obtained no consent.

Re: First comment: Useless research from the "DUH!" dept.

Sixth comment.

Next it will be the individually crafted reordering of your newsfeed being monetised for ad-slinging (actually product placement) purposes. Welcome to the world of tomorrow, which Orwell, Huxley, Ira Levin and Gibson could not even dream of.

Re: First comment: Useless research from the "DUH!" dept.

Kittens, politics, and dinner

Were psychologists able to determine if they have a greater influence on news feeds than Facebook bugs do? I think they're running one of those "eventual consistency" databases, where "eventual" extends to infinity.

I just got more irritated and pissed off with Facebook for not giving me what I want to see in my newsfeed, which is ALL posts that I've elected to receive, most recent first - in other words, with none of their fancy filtering applied to guess which ones I might want to see. It was long ago that I decided that I don't fit anyone's standard profiles, and FB is not an exception to that.

Re: @silent_count

That's the whole point of FB

It is an advertising platform.

Regardless of what a few mind-boffins do, advertising is intended to mess with your mind and get you to buy crap you don't want, and is MUCH worse than a slight emotional warping due to the order you receive news in.

Getting uppity about this while placidly sucking on the advertising teat makes no sense at all.

Not to give fecebook a pass, but ...

The _Atlantic_ article states, in part: "The backlash [against the research methodology] in this case, seems tied directly to the sense that Facebook manipulated people -- used them as guinea pigs -- without their knowledge, and in a setting where that kind of manipulation feels intimate." The outrage, then, seems to hinge on people thinking that fecebook does NOT manipulate people - that a company whose revenues derive in no small part from adverts would not manipulate its users. Hokay. Not sure how this is different to measuring the effectiveness of said adverts ("using illustration A, 24% of viewers clicked the link, while illustration B only got 15% of viewers to click") -- viewer sees stimulus, viewer takes action, fecebook keeps track, and correlations are hypothesized. As someone posted above, if a service is free you are not the customer, you are the product. Having said that, if fecebook does manage to get its hands slapped this time, hallelujah.

Re: Not to give fecebook a pass, but ...

Tracking your response to adverts is one thing. Going out to purposely manipulate people's news feeds to see if you can make them angry or happy or depressed, without telling anyone that they have been opted into an experiment they have no way of opting out of, is a completely different thing.

All you guys got it wrong

Kramer: "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."

So, this was actually a brand-building exercise. Just like a toothpaste company's research, except that it made you and your kids not so cheery.

Safety guidelines crossed

Re: Safety guidelines crossed

7m subjects, half being pushed negative, half positive. So with 3.5m being made more depressed than they would otherwise have been, I'd expect at least one to have committed suicide that wouldn't otherwise have done.

A couple of simple tricks:

1. Set your News Feed to show "Most Recent" rather than what some algorithm wants to promote.

2. Go through all your friends and individually set who you want to follow (there will be some people you want to keep on your friend list but don't feel the need to read about their breakfast every day).

Re: A couple of simple tricks:

Also, 'most recent' still has a scrambled order. Random posts are selected to be ordered based on the date of the last comment, not the date they were posted. How this order differs from the top-stories order I'm not sure, but it's not what I'd expect from 'most recent'.

I have seen the future....

"At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."

The guy is missing the point. It's Facebook, with its arrogant attitude to people, and the fact that not all of us feel the need to plaster the minutiae of our lives on the internet, that leads many to avoid visiting Facebook - a fact I am relieved to see is dawning on some of the Facebook users I know, who are cutting down or ceasing their use of the data-mining service. Many of us can see NO value in joining. For my part, I feel that joining would lead to a deficit in privacy and in the quality of contact with friends. I prefer to relate to my friends individually - I feel they are worth that time and effort, rather than employing the electronic scattergun approach of spraying dull details of everyday life across t'interwebz.

Simply put, Mr Kramer, Facebook really isn't as compelling as you feel it should be, and - happily - people are starting to question their participation in it. I have never been and never will be part of it. Companies that increasingly demand a Facebook or Twitter account to interact with them should take note of the fact that I am part of a sizeable section of web users, one that may well grow in the future.

This is as sinister and underhanded as subliminal advertising, and it is time that this "service" was more strictly controlled - those running it clearly have no morals when it comes to messing with other people's heads.

Dirty devious bastards

So they acknowledge direct psychological manipulation of unknowing users? That's a pretty stupid thing to do, considering that amongst the vast number of FB users there must be at least some who are, at one time or another, in a mentally fragile state.

The Ts & Cs may refer to use of data for research purposes, but this amounts to use of the users' minds for research purposes - and that sure isn't covered.

It doesn't surprise me that FB doesn't see any difference, or care if there is, but it worries the hell out of me that the researchers didn't see any need to consider the ethical implications of what they were doing.

T&Cs

OK I signed up and accepted that they could use data I provided for research.

Where is it written that this actually means I'm signing up as a test subject? This is an experiment I could have been involuntarily involved in. If a drugs company did it there would be hell to pay.

Doesn't this actually contravene human rights legislation? Maybe the EU needs to get involved and find out how many EU citizens were used as human guinea pigs without consenting.

Forget creepy, what about the medical implications?

Product

As many here have pointed out before, if you get it for free you are not the customer you are the product.

What Farcebook are forgetting is that not only are their members the product, they are also the resource. Any company that hopes to have any kind of longevity in its chosen business knows that managing and shepherding its prime resources is fundamental to success.

When you treat resources as a limitless, maintenance-free supply of product that will always be there, someday you are going to wake up with a nasty shock: it will all be gone.

hit them where it hurts..

.. Simply refuse to have any dealings with a company that advertises on FB. If it's your bank or power supplier, then switch, and tell them that you are switching because, by running adverts on Facebook, they have supported extremely dubious social engineering experiments.

Are these folks licensed 'practitioners'?

Makes you wonder how many idiots (people already suffering from manic depression, etc.) went out and injured others (the usual trolls, in many cases) or themselves because they were being manipulated. It sounds as if they were 'practising' (researching) psychology/psychiatry, which to the best of my knowledge, information and belief requires a licence.

Our government has been known to 'experiment' with what it seems to think is ITS property (People) for years.