Posted
by
samzenpus
on Thursday July 03, 2014 @08:04AM
from the to-friend-or-not-to-friend dept.

redletterdave (2493036) writes: Facebook chief operating officer Sheryl Sandberg said the company's experiment designed to purposefully manipulate the emotions of its users was communicated "poorly". Sandberg's public comments, the first from any Facebook executive following the discovery of the one-week psychological study, were made while attending a meeting with small businesses in India that advertise on Facebook. "This was part of ongoing research companies do to test different products, and that was what it was," Sandberg said. "It was poorly communicated. And for that communication we apologize. We never meant to upset you."

anavictoriasaavedra points out this article, which questions how much of the outrage over an old press release is justified and what has led to the media frenzy. Sometimes editors at media outlets get a little panicked when there's a big story swirling around and they haven't done anything with it. It all started as a largely ignored paper about the number of positive and negative words people use in Facebook posts. Now it's a major scandal. The New York Times connected the Facebook experiment to suicides. The story was headlined "Should Facebook Manipulate Users," and it rests on the questionable assumption that such manipulation has happened. Stories that ran over the weekend raised serious questions about the lack of informed consent in the experiment, which was done by researchers at Cornell and Facebook and published in the Proceedings of the National Academy of Sciences. But to say Facebook's slight alteration of news feeds caused people to suffer depression seems to be unsupported by any kind of data or logic.

Facebook has done us all a favor by waking up the dumb consumer to the consequences of the idea that information wants to be free - and that it's therefore all right to waive all personal rights on the internet.

Testify! As was said before on /.: change your information to nonsense and leave. Afterthought: look up the British journalist whose photo was used for a prostitution service. She objected and was told that because the advertisers liked her photo, they could use it, and there was nothing she could do about it.

Testify! As was said before on /.: change your information to nonsense and leave. Afterthought: look up the British journalist whose photo was used for a prostitution service. She objected and was told that because the advertisers liked her photo, they could use it, and there was nothing she could do about it.

Facebook's ability to do that is right there in the EULA. Yes, I actually read Facebook's EULA. Anything you post on the site is yours, but they enjoy the right to use it in any way they like while it's there. So this journalist agreed to let them do that when she signed up. She likely didn't know that, because who actually reads EULAs, right? It's one more reason I'm not on Facebook.

Testify! As was said before on /.: change your information to nonsense and leave. Afterthought: look up the British journalist whose photo was used for a prostitution service. She objected and was told that because the advertisers liked her photo, they could use it, and there was nothing she could do about it.

I Googled "British Journalist Facebook Photo" and got zero valid results relating to what you describe. Do you have a more direct link, or a name?

FB users can't possibly care about this issue, there's nothing for them to "wake up" from. This is a scandal because we're in the middle of summer and there's really no news. The consumer who cares about privacy left FB before the IPO. It's a total non-story.

Some people just don't care about this kind of stuff, even fairly intelligent people can be indifferent to privacy just because they're not terribly concerned about the worst that can happen.

Except that the purpose of this experiment was to play with the emotions of its users. And upset was one of the expected results.

Worse: the study had military sponsorship [scgnews.com], part of ongoing experiments on how to manipulate, prevent, or encourage the spread of ideas (like voting for unapproved political parties, or muting general discontent):

"research was connected to a Department of Defense project called the Minerva Initiative, which funds universities to model the dynamics, risks and tipping points for large-scale civil unrest across the world."

The study did not have military sponsorship. As the headline states, one of the authors received funding from the military for another project studying social contagion. Regardless, we should be unnerved by the idea that there are leading experts on social manipulation out there getting their ideas from studying Facebook and then taking their expertise to the military.

The thing that gets me is that they concluded that a person with more positive items in their feed is more likely to post positive things. This does nothing to evaluate the actual mood of the person - they might be posting positive things only to compete, or to give the appearance of being positive.

No, in this case it really *did* try changes that they thought might upset customers. Show more negative stuff and see if the users post more negative stuff. That was the whole freaking goal of the experiment.

"Dear customers. We are really sorry that you're so upset at our great study. We're super glad that we did the study but so very very sorry that you guys were upset by it. When we do it again, let's work together to find a way that you could just not be so upset about it."

Why doesn't FB just man up and tell the truth, e.g., "Dear people who have FaceBook accounts: Until you start paying for this service, you have no say in how we operate. Our real customers, i.e., advertisers, were quite pleased with the results of our little test. They now have more insight into you and can better target their advertising to you. This, in turn, makes us more valuable to them and we make more money by growing our customer base and/or raising our advertising rates. Now go about your business."

This reminds me of this one time, as a kid, I threw a rock at someone really far away. I didn't actually want to hit them, and never thought I would. The rock nailed them square in the back... It was a really weird apology. "Um... yes, I was aiming for you, but I never thought I'd hit you! Sorry!"

I think this kind of thing happens more often than we realize. With all the TV shows where people have pranks pulled on them, I'd love to know how often they go wrong. I'm surprised that more of the pranks don't end up with the person who is pulling them getting their ass kicked, or arrested.

Facebook Experiments Had Few Limits [wsj.com]: "Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real. In fact, Facebook knew most of the users were legitimate. The message was a test designed to help improve Facebook's antifraud measures...'There's no review process, per se,' said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. 'Anyone on that team could run a test,' Mr. Ledvina said. 'They're always trying to alter people's behavior.'...The recent ruckus is 'a glimpse into a wide-ranging practice,' said Kate Crawford, a visiting professor at the Massachusetts Institute of Technology's Center for Civic Media and a principal researcher at Microsoft Research. Companies 'really do see users as a willing experimental test bed' to be used at the companies' discretion."

So, either they went to the scientists and said "hey, we want to find something out", or the scientists went to Facebook and said "hey, we could do an awesome experiment on your users".

Either way, Sandberg sounds like an unapologetic smug idiot who more or less said "they're our users, we do this shit all the time".

The people who run Facebook are assholes, and don't give a crap about anything more than how they can maximize ad revenue. And Zuckerfuck is a complete hypocrite about privacy -- his users get none, and he jealously guards his own.

The scientists represented to the IRB that the dataset was preexisting, and so the IRB passed on the review. It's not clear that the dataset was preexisting, though, since the study seems to indicate that the scientists were involved in the design of the experiment from the beginning. What's more, the paper itself claims to have obtained informed consent when it is clear there was none.

Quirkology [amazon.co.uk] by Richard Wiseman, an interesting read, is a compilation of rigorous experiments in social psychology, many of which were conducted, *gasp*, without the subjects' consent.

Retail stores do research on consumers' behaviour in order to try to sell them more sugary, salty and fatty snacks. Addiction to those nasty foods is a very real issue with health consequences. Where's the outrage?

I made friends with over thirty people on Facebook and had a vibrant social life for over a year until I found out they were all zombie accounts run by the same person. I was still OK with that until I realized that I had spent over $2,000.00 on wedding gifts and birthday presents for all my "friends". But then again, you can't put a price on friendship, right?

I didn't have enough Facebook friends who wanted to get really, really good at MobWars, so in my initial excitement I forgot to read Facebook's TOS & accidentally found myself with a dozen Facebook accounts.

Business was kind of slow that summer so I ended up writing a set of Perl modules that took care of the tedious portion of logging in to Facebook, bypassing the HTTPS redirect, logging in to MobWars, logging each character's stats, checking 'stamina' to see if the character had enough points to compl

They literally wrote a peer-reviewed scientific paper demonstrating that they manipulated people's moods to a statistically significant degree; I don't think there's much you can call questionable about it from Facebook's perspective.

They literally wrote a peer-reviewed scientific paper demonstrating that they manipulated people's moods to a statistically significant degree; I don't think there's much you can call questionable about it from Facebook's perspective.

And what do you call advertising, commercials and the nightly news? The same damn thing.

Ah... the apology that puts blame on the victim. A hallmark of abusers and sociopaths everywhere.

Now, everyone will have noticed that they are only apologising for the miscommunication, not the act of psychological experimentation (as if we would be OK with it if they had told us). But it goes deeper...

Notice that they put the action and the apology in two different sentences, followed quickly by a "We never meant to upset you," putting the emotional blame back on us. As if we were just accidentally bumped bystanders, not the actual targets of the actions.

And they never, ever use the word "sorry". Only the big weasel phrase "we apologise". This apology goes right along with the classic phoney apologies...

I'm sorry that you got upset.
I'm sorry that you feel that way.
I'm sorry that you made me do that.

Importantly -- and contrary to the apparent beliefs of some commentators -- not all HSR is subject to the federal regulations, including IRB review. By the terms of the regulations themselves, HSR is subject to IRB review only when it is conducted or funded by any of several federal departments and agencies (so-called Common Rule agencies), or when it will form the basis of an FDA marketing application. HSR conducted and funded solely by entities like Facebook is not subject to federal research regulations...

Human subjects research is subject to mandatory informed consent - specific to the study being performed, so you can't just have a boilerplate like the Facebook ToS - in almost all jurisdictions. For example, this is the US law Facebook undoubtedly broke:

The obvious joke response is that Facebook probably gets all kinds of funding from NSA.

A more serious response is that it's not quite clear. UCSF and Cornell both participated in this project to some degree, and they're both subject to HHS regulations since they do get federal funding. Whether or not the whole project was then required to follow the rules depends on what exactly the university participants did.

"while a Facebook employee was the lead researcher, there were co-authors affiliated with institutions of higher education â" University of California, San Francisco and Cornell University â" that most certainly adhere to the requirement."

"PNAS (the journal) has a policy requiring IRB ethics review for all published studies that experiment on humans, regardless of whether academic or corporate[1]. A Cornell press release[2] says this work was also funded

I'm in trouble then. In the last couple of weeks, I've performed a number of human experiments on the website I manage, including:
* Do they push green buttons more than red buttons?
* Do they fill in forms more reliably if it's one big form, or split across multiple pages?
* Do people finish reading a page more often if the text is in a large font rather than a small one?
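Tests like these are ordinary A/B tests. A minimal sketch, using entirely made-up click counts, of how one might decide whether the button-color difference is statistically meaningful with a two-proportion z-test:

```python
import math

# Hypothetical data for the button-color test above: clicks and page
# views for each variant (these numbers are invented for illustration).
clicks_a, views_a = 120, 1000   # green button
clicks_b, views_b = 90, 1000    # red button

p_a = clicks_a / views_a
p_b = clicks_b / views_b

# Pooled click rate and standard error for a two-proportion z-test.
p_pool = (clicks_a + clicks_b) / (views_a + views_b)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
z = (p_a - p_b) / se

print(f"green: {p_a:.1%}, red: {p_b:.1%}, z = {z:.2f}")
# |z| > 1.96 corresponds to significance at the 5% level (two-tailed).
```

With these invented numbers the difference would clear the usual 5% significance bar, which is exactly the kind of result a site operator would act on without ever calling it an "experiment".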

Surely the rules can't cover all human experimentation, or else ad agencies couldn't attempt to measure the effectiveness of their ads, and parents couldn't raise their children (e.g. "Let's try withholding cookies and see if that works!").

There must be specific parameters under which human experimentation is illegal.

I haven't seen a human subject review or impact statement mentioned in any of these /. articles. Did Facebook even do one before proceeding with this research? If so, was it reviewed by an ethics panel before they proceeded with the experiment? If not, then they should definitely be held responsible for any negative outcomes.

Facebook has no compact with its users to offer fair and balanced news (if you'll forgive the expression). They are not obligated to feature any particular array of stories to anybody; in fact, we've heard over and over again how the relevance of items that appear in the news feed is skewed and unpredictable. Nobody should be relying on them for news and I don't think we should expect any more journalistic integrity from them than Buzzfeed.

My question is why there is particular outrage when they do it as part of a science experiment, whereas it is widely accepted when the mass media does the exact same thing to get revenue.

National and local news programs basically live and breathe this sort of thing constantly. They schedule their reporting and editorialize in ways to boost viewership: stirring up anger, soothing with feelgood stories, teasing with ominous advertisements, all according to presumptions about the right way to maximize viewer attention and dedication. 'What everyday item in your house could be killing you right now? Find out at 11.'

I don't have a Facebook account precisely because I don't like this sort of thing, but I think it's only fair to acknowledge this dubious manipulative behavior is ubiquitous in our media, not just as science experiments in Facebook.

Why not hold people not claiming to be scientists to a higher standard? It's not like their science-but-don't-call-it-science experiments are any less potentially damaging than the same behavior done by a 'true scientist'.

Actually, you don't have a question but rather an ignorant and inane comment. The objection to what Facebook did has been clearly stated, and you have the ability to do research on the subject to understand what Facebook did wrong (hint: it was about informed consent). You don't care about the issue at hand. Rather, your intention was to make a feeble comparison between Facebook and other media in order to make the "you too" argument - a claim which does not justify Facebook's actions (see the tu quoque logical fallacy).

I fail to see how it's that different than the manipulation that mass media does, who also do not get informed consent. There is the facet of it being more targeted, but the internet is already about targeted material (hopefully done with the best interest of the audience in mind, practically speaking with the best interests of the advertiser). They just stop short of calling it an 'experiment' (in practice, they are continually experimenting on their audience) and somehow by not trying to apply scientifi

1) Their users provide the feed. Facebook just displays it.
2) It isn't 'for free', as Facebook uses the data to advertise to you, and thus earns money on the content you generate.
3) The example I gave was explicitly not bullshit - if it were, why would anyone interfere with it?

As has been pointed out many times, Facebook was doing their usual sort of product testing. They actively optimize the user experience to keep people using their product (and, more importantly, clicking ads). The only difference between this time and all the other times was that they published their results. This was a good thing, because it introduced new and interesting scientific knowledge.

Because of this debacle, Facebook (and just about every other company) will never again make the mistake of sharing new knowledge with the scientific community. This is truly a dark day for science.

The requirement for informed consent was ambiguous in this case. If I had been in their position, I would have erred on the side of caution, and the research faculty who consulted on this project should have been more resolute about it. If anything, it is those people who should have done the paperwork. I think their failure to get informed consent was a mistake, but I don’t believe it was any kind of major ethical violation. It does no harm to get informed consent, even if you don’t legally have to.

As has been pointed out several times, this was not product testing. This was a psychological experiment for which Facebook failed to get informed consent.
Science is in no way hurt by this, but that you think it is shows how truly ignorant you are.

Because Facebook is not a business or a scientific endeavor. Ever notice that abbreviated, Facebook Inc. is "FBI"? More intel is gathered through Facebook in one month than by all of the NSA's illegal wiretapping program.

Yes and no. They were testing something that they HYPOTHESIZED could reduce the quality of the user experience. And IIRC, that hypothesis turned out to be wrong (to the extent that one can get that from the statistics).

If all user interface modifications that lead to an improved user experience were intuitive, then Facebook would have implemented them already. They are now at a point where they have to consider things that are NOT intuitive. The idea that filtering other people’s posts in a way th

"This was part of ongoing research companies do to test different products, and that was what it was," Sandberg said. "It was poorly communicated. And for that communication we apologize. We never meant to upset you."

This is identical to saying "I don't know what we did that upset you but whatever it was I apologize". They don't get it. It basically means that they are going to continue treating their users as insects to be experimented upon and lack the moral compass to understand why what they did was wrong. The fact that they ran an experiment is fine in principle but HOW you do it matters. We insist that academic researchers run their psychology experiments by a review board and when necessary get informed consent. It's not a hard thing to do and we do it for very good reasons. Facebook has not presented any plausible reason we should hold them to a different standard.

I'm very glad I do not have a facebook account and at this point I doubt I ever will. This is simply not a company I care to be involved with any closer than I have to be.

When has the Facebook newsfeed ever NOT been manipulated and been merely a list of posts in chronological order from people you are friends with and/or follow?

It strikes me as constantly being manipulated in multiple ways and in a manner noticeable to many people. Most obvious was the "top stories" filter which purported to filter the newsfeed in some manner designed to suppress some comments and promote others.

But we don't know about the criteria for this, or the motivation behind other, less obvious manipulations.

Facebook's "research" reminds me of the treatment that eventually led the Unabomber to drop out of civilization and seek revenge against the system from his remote cabin in the woods.

From Wikipedia: While at Harvard, Kaczynski was among the twenty-two Harvard undergraduates used as guinea pigs in ethically questionable experiments conducted by Henry Murray. In the experiment each student received a code name. Kaczynski was given the code name "Lawful". Among other purposes, Murray's experiments were focused

Every single person who feels hurt by what Facebook did should admit (to themselves) that their reason to be upset is that things like these make it obvious that THEY are not in control of their emotions. That THEY are but motes of dust taken for a ride by the world around them.

I don't feel abused or betrayed or manipulated by Facebook. Not that they could. My emotions are mine, and if Facebook could alter them, I would just have to admit that I was wrong, and I would learn from it to be a better ME. Don'

You make the assumption he was -1 at the time I made my comment, and he wasn't. Interesting to note that I've been modded down for it too, maybe because someone thinks I'm just complaining because I'm Jewish, when in fact I'm not.

According to the study, two parallel experiments were conducted, one where positive comments were reduced and one where negative comments were reduced. They weren't exposing them to any additional "negative material".
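For illustration, here is a rough sketch of what the two parallel conditions amount to: probabilistically omitting matching posts from a feed, never injecting anything. The word lists, function names, and omission probability here are all invented for this sketch; the actual study used the LIWC dictionaries and per-viewer probabilities.

```python
import random

# Tiny illustrative word lists; the real study used the LIWC dictionaries.
POSITIVE = {"happy", "great", "love", "awesome"}
NEGATIVE = {"sad", "angry", "terrible", "hate"}

def has_words(post, words):
    """True if the post contains any word from the given set."""
    return any(w in words for w in post.lower().split())

def filter_feed(posts, condition, omit_prob=0.5, seed=0):
    """Sketch of the two parallel conditions: probabilistically omit
    posts containing positive words ("reduce_positive") or negative
    words ("reduce_negative"). Nothing is ever added to the feed."""
    rng = random.Random(seed)
    target = POSITIVE if condition == "reduce_positive" else NEGATIVE
    return [p for p in posts
            if not (has_words(p, target) and rng.random() < omit_prob)]

feed = ["what a great day", "I hate mondays", "lunch was ok"]
print(filter_feed(feed, "reduce_negative", omit_prob=1.0))
# With omit_prob=1.0 the "hate" post is always dropped; the others pass.
```

The point of the sketch is the asymmetry the parent comment describes: in both conditions posts are only removed, so no user was ever shown "additional" negative material.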

And? If a news show notices that it gets better viewing figures when its shows are more negative and thus changes them to be more negative, that could have a worse effect. If Google changes the PageRank algorithm in a way that makes negative sites score more highly (even if it is inadvertent), then it could have a far bigger effect.

People are getting their noses bent out of shape because Facebook talked about this as a psychological experiment rather than testing a system change; what they did was no worse th

People are controlling your mind all the time. Every time you see an ad, someone is trying to control your mind to convince you to buy something. Every time you read an article in a paper, someone is trying to control your mind to get their point across. Every time you argue with someone, she is trying to control your mind by getting her point across. Etc.

When I see an ad or get into an argument with someone they usually don't have a billion-dollar program of research tracking my own and my friends' behaviour and then covertly adjusting what I see and hear to get their way.

Really? Where was it stated that you will become part of an experiment when you signed up for a Facebook account? Where was the disclosure that the experiment was about to begin and giving you the option to opt out?

I find it interesting that people think EULAs are blanket excuses to do anything. In fact, they are not. Court decisions have held that EULAs are limited in scope. Facebook's EULA is too broad and therefore has no legal weight.