
Revision as of 09:51, 5 July 2014

Title text: I mean, it's not like we could just demand to see the code that's governing our lives. What right do we have to poke around in Facebook's private affairs like that?

Explanation


This comic references the recent revelation that Facebook conducted a "psychological experiment" by selectively showing users more "positive" or "negative" posts in their news feeds and recording the users' subsequent posts to see whether the change affected their positivity or negativity. Further experiments have since been revealed, [http://online.wsj.com/articles/facebook-experiments-had-few-limits-1404344378 such as one that tested security measures by locking users out of their accounts]. Megan comments that, while the media is calling this control over what content users see "unethical," Facebook and other companies like Google must, one way or another, control what content users see, whether to present a limited selection of all postings or to tailor ads to particular users. Even if the regular algorithms are not set up for psychological experiments, they still "manipulate" which posts users do or don't see, and, as Megan points out, no one really knows what the "normal" constraints of the news feed algorithm are. The comic parodies the strong reaction to what is basically already common practice.

Accumulation, control, and analysis of user-generated information can be part of a website's or software's terms of service or end-user license agreement. In such a scenario, the user has effectively consented to being part of such research. Unfortunately, most users don't read the terms before clicking "I agree," so it can come as a shock when the service uses their data in a way they hadn't anticipated.

The title text ironically accepts that Facebook has access to all of its users' thoughts (through posts, messages, and photos) and can read them for research (or any other) purposes, but contrasts this with the suggestion, which likely mirrors how Facebook would respond to such a request, that Facebook's code is private and cannot be revealed to us. The title text muses that this is backwards: our personal data should be considered more private than Facebook's programming code, which may be proprietary but is not personal private data.

It is as if your neighbor were spying on you while you left all your shades open, but you felt it inappropriate to find out what he knew about you because that's his business. Asking for the source code would similarly be like asking for the specifications of the binoculars your neighbor used for spying. The comic [[743: Infrastructures]] raises the same issues with Facebook and open source.

Transcript

Megan: Facebook shouldn't choose what stuff they show us to conduct unethical psychological research.

Megan: They should only make those decisions based on, uh...

Megan: However they were doing it before.

Megan: Which was probably ethical, right?

Trivia

The original comic image contained a typo: Randall wrote "what" twice in a row, first at the end of one line and again at the start of the next. A word repeated across a line break is a classic optical illusion, and Randall was apparently caught by it himself. He evidently did not do it on purpose, since he quickly corrected the error and uploaded the current version to xkcd.

Discussion

Not every xkcd fan is from the US, Randall has to keep the comics global.108.162.210.242 06:04, 4 July 2014 (UTC)

"Randall writes "what" twice, which is a classic optical illusion." So did he do this on purpose (I fail to see the connection with the subject), or is it just the explanation of why he missed the typo he made? Jkrstrt (talk) 07:03, 4 July 2014 (UTC)

It's very deliberate. The illusion demonstrates what the brain chooses not to see. Facebook is making some content not visible to us as an experiment. There really is far less subtext to this than you think there is. There isn't some deep meaning. It was an experiment to see if we would see it. 173.245.56.152 07:09, 4 July 2014 (UTC)

People here believe Randall is God. They think that even his mistakes are very deliberate. Fortunately now we know for sure he made a mistake, because he corrected it after a few hours. 173.245.50.84 16:23, 4 July 2014 (UTC)

I have added this as a trivia. Of course it was a mistake. Else he would have hinted at it in the title text and not corrected it later. Kynde (talk) 10:03, 5 July 2014 (UTC)

My version of the comic does not have the repeating "what". Either it was a typo that has been fixed, or there are two versions of the comic and Randall is performing his own experiment to judge our reactions to the different comics. (I didn't sign any agreements.) 173.245.54.183 13:31, 4 July 2014 (UTC)

"Similarly, what the text is saying is we have no right to peer into the algorithms that do that snooping because it belongs to Facebook and it wouldn't be fair to them for us to see it." I think the title text is actually saying the opposite. "it's not like we could just demand to see the code that's governing our lives". It looks like it's being sarcastic, since anything that runs our lives should be our business by default. 108.162.237.161 08:05, 4 July 2014 (UTC)

Then again, it's not really supposed to be governing our lives, is it? Any impact it has your life is because you gave it the permissions and information to do so, which was voluntary (by sharing your selfies and rants under their terms) and not mandated by an overreaching government. I agree that the text is sarcastic, but in a different way than you mentioned. 103.22.201.239 10:05, 4 July 2014 (UTC)

I was reading the title text to be a reference to open source code and the more zealous belief that ALL code should be open source. Not necessarily making a comment on it, so much as trying to raise the point (almost as a troll) to compare privacy concerns with access to source code.108.162.216.91 08:10, 4 July 2014 (UTC)

I read it the same way. Seems more of a comment on how if this was under the GNU AGPL then it wouldn't have been a surprise, and people would know just what they were getting into. 173.245.56.162 19:18, 4 July 2014 (UTC)

I read it and couldn't understand what what she was saying. 108.162.222.50 08:37, 4 July 2014 (UTC)

I read the title text as bitter sarcasm. And it plays in to the message in another comic, I don't know which, about someone being warned not to place his private information in the custody of another without strict limits on the power of that other. ("I'm playing the world's tiniest violin" was the punch line on that one. Also used by The Kids In the Hall!)
The impact by those who manage and manipulate information is seldom clear, and both its motivation and its impact on our decisions remain not only largely unnoticed in daily life but also unknowable. Just because we give control of information to another doesn't mean we agree to be either a lab rat or open to manipulation by them, whether we recognize it or not. Whether it's someone trying to achieve power (government) or someone trying to earn a profit (business), the burden of proof should be on them that the effect is benign. I know this sounds a bit Ayn Randian, a person whose politics I deeply distrust, but even scary people can get things right some of the time. ExternalMonolog (talk) 11:51, 4 July 2014 (UTC)ExternalMonolog

I have added a link to the comic 743: Infrastructures in the explanation (which you found yourself, as I can see below). You can always answer an earlier comment by adding a (:) or more before your text, so it will be clear that it is a direct answer to a specific comment. Kynde (talk) 10:03, 5 July 2014 (UTC)

The title text is an oblique reference to the implications of recent SCOTUS ruling on corporations having similar rights as people (albeit to do with religion, as opposed to privacy), no? 108.162.228.41 (talk) (please sign your comments with ~~~~)

The SCOTUS ruling follows a US Supreme Court decision in the late 19th century that "A corporation is a person". Ironically, the justification for this ruling was based on a law clerk's note in the margin of a previous decision stating that the said previous decision could create the situation where a corporation has the same rights as a person. The decision at hand was to decide the validity of a presidential election, and the Supreme Court took the notes made by the clerk as law. It's clear the court knew what its ruling meant, but it's not clear what the court's motivation was for accepting the clerk's notes as if they had been an already rendered decision! ExternalMonolog (talk) 11:51, 4 July 2014 (UTC)ExternalMonolog
