In fairness, some of those people also believe that Google should have a private militia and that Eric Schmidt should become America’s chief executive, so it’s no surprise that they would lend their support to the company’s right to do essentially whatever it wants to its users. But there are other, more rational people who also believe that Facebook should be allowed to “tweak its algorithms” — which is how they characterize the experiment — without thinking of the repercussions.

The problem is that conflating Facebook’s experiment with the changes tech companies make to countless other algorithms every day is a bit like saying that a doctor shoving poison pills down a person’s throat is okay because other doctors were writing legitimate prescriptions at the same time. It’s a willful misrepresentation of a disconcerting experiment that should be discussed instead of being accepted as part of the modern, algorithm-driven world in which we’ve decided to live.

This wasn’t a simple tweak meant to see what might happen if Facebook were to change its News Feed. The company’s decision to increase the number of news articles or decrease the number of promotional status updates shown in the stream of seemingly random content fits that description. Its effort to determine the emotional effect it can have on its users by toying with the feeds of nearly 700,000 people differs in everything but execution.

And when this study is considered for what it actually is — a psychological experiment that affected hundreds of thousands of people — instead of as a technological issue, it seems far more sinister than its supporters would have us believe. As Max Masnick, a researcher who does “human-subjects research every day,” explains in the Guardian’s report on the study:

As a researcher, you don’t get an ethical free pass because a user checked a box next to a link to a website’s terms of use. The researcher is responsible for making sure all participants are properly consented. In many cases, study staff will verbally go through lengthy consent forms with potential participants, point by point. Researchers will even quiz participants after presenting the informed consent information to make sure they really understand.

Contrast that with the response from the Facebook researcher who designed the study:

Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety.

While we’ve always considered what research we do carefully, we (not just me, several other researchers at Facebook) have been working on improving our internal review practices. The experiment in question was run in early 2012, and we have come a long way since then. Those review practices will also incorporate what we’ve learned from the reaction to this paper.

It’s clear why anyone concerned about the ethics of manipulating the emotions of hundreds of thousands of people without explicit consent views Facebook’s experiment as a mistake. So why do some people insist on defending Facebook’s supposed right to experiment on its users, despite being told that checking a box on a Terms of Service agreement isn’t the same as agreeing to participate in a psychological experiment?

Perhaps it’s because the Valley culture has made it easier to think of millions or billions of people as little more than data points used to draw revenues from advertisers. Because this study technically required just a few tweaks to an algorithm, it’s easy to view it as a harmless experiment meant to gather more data, which has become increasingly sacred to the Valley. It’s hard to remember that a billion data points are actually individual people who might worry about Facebook’s ability to manipulate their emotions — not just bits and bytes.

Or maybe the simpler explanation, which Holmes suggested in his post about the study and its ramifications, is better: Facebook just doesn’t give a shit about the people who use its service and is more than willing to experiment with their mental health because of the culture of contempt that it has created within the confines of One Hacker Way. It knows that there are people on the other end of those algorithms — it just doesn’t care. It exists to make people spend more time online, not to worry about how other people feel.

Still, it’s hard to imagine why people outside the company would defend its actions. Maybe they also don’t care about other people. Maybe they’ve become so jaded that the idea of a technology company performing psychological experiments seems like a normal Saturday. Or maybe they want to give Facebook the chance to make everyone feel like they do, thus ending the criticism they face for suggesting that Eric Schmidt should be placed in charge of an entire goddamn country. Take your pick — no matter which motivation you choose, the truth is that we must now live in a world where Facebook’s right to unethical experiments is defended.

As Holmes put it, it might be time for Facebook to make all of our News Feeds just a little bit happier. We’re going to need it to get through this week — and perhaps the rest of our lives.

Facebook has introduced Scrapbook, a new feature that lets parents collect and share photos of their children in one place, so they no longer have to tag each other in every picture just to make sure neither misses what the other has posted. [Source: Facebook]

“For all the clumsy rhetorical lip service [former Yahoo News head] Guy Vidra pays to The New Republic’s hallowed intellectual traditions, this is what his vision of a nimble digital news product finally translates into: a vaguely journalistic veneer strategically designed to conceal a rancid interior of ‘elevated’ advertising.”

Indian e-commerce company Flipkart is said to be raising $600 million in its latest bid to compete with Amazon. The company is also said to have garnered a higher valuation with this funding round — quite the feat, considering it was previously valued at around $11.5 billion. [Source: The Economic Times]

Here comes another unicorn: Sprinklr, a New York-based marketing company, has raised $46 million at a $1.17 billion valuation. The funds will be used to help the 700-person company expand its marketing platform. [Source: Fortune]

Curator, the tool Twitter created so the media could find and share tweets with its audience, is now available to the public. Because if there’s anything people wanted to see more of, it’s tweets randomly inserted into blog posts, television spots, and other forms of media. [Source: TechCrunch]

A court in France has decided not to ban Uber’s low-cost services until the country’s highest appeals court weighs in on the constitutionality of a new transport law. [Source: The Wall Street Journal]

Tinder is redoubling its spam-fighting efforts in the wake of reports that movie studios are using the service to promote their movies, scammers are attempting to steal information via the app, and pranksters have created tools that trick heterosexual men into flirting with each other. [Source: The Verge]

Uber offers drivers whose accounts have been deactivated a choice: attend a class that requires them to pass an exam, or take a class that doesn’t. The latter is taught by Uber employees, and the company has sent thousands of drivers to it, according to a report from BuzzFeed. Why is that a problem? Because Uber isn’t supposed to provide its drivers with formal training; doing so would make them bona fide employees, not independent contractors. [Source: BuzzFeed]

Flipboard users will now be able to collect articles and share them via private magazines visible only to members of certain groups. The feature is aimed at students working in the same class, companies sharing press coverage, and other groups that might want an easy way to share Web pages with each other without having to use public tools like Facebook or Twitter. [Source: Flipboard]

T-Mobile has tasked its customers with creating a real-world coverage map that makes it easier to tell where its service works and where it doesn’t. Instead of guessing at where its customers will get service — which is what other carriers do, the company claims — it’s asking people to verify its predictions so it can be more honest with consumers. [Source: T-Mobile]