Does Facebook Shelter People from Different Opinions?

Many people today get their news via Facebook, but most probably
give little thought to how the social media network filters the
stories they see.

A new study of more than 10 million anonymous Facebook users
found that the site's news-filtering algorithms produced only a
small change in the proportion of stories people saw that
challenged their political beliefs. Instead, the users' own
decisions — for instance, which stories they chose to click on —
had a much larger effect on the stories they read.

Understanding how the social media site exposes readers to
viewpoints they disagree with could have serious implications for
democracy, the researchers — some of whom are Facebook employees
— said in the study, which was published online today (May 7) in
the journal
Science.

Users don't see everything their friends
post on Facebook, said David Lazer, a political and computer
scientist at Northeastern University in Boston, who was not
involved with the study but wrote a commentary on the work,
published in the same journal. Facebook uses vast amounts of
behavioral data to determine what each user might be interested
in, which may be only a small fraction of the content posted by
the people in that user's network, he said.

"In many ways, it's a very useful service for users," Lazer told
Live Science, "but what are the broader implications of this
curation? What aren't we seeing, and should we somehow be
worried?"

Unwelcome news

Previous research has shown that people tend to read and share
news that agrees with their political beliefs, rather than
news that challenges their views. But to what extent do
Facebook's algorithms influence the news people see and read?

In the new study, researchers from Facebook and the University of
Michigan, Ann Arbor, measured how 10.1 million American Facebook
users who reported a political affiliation shared some 7 million
different news links between July 2014 and January 2015.

First, the researchers looked at the proportion of people's
friends who had
opposite political beliefs. About 20 percent of the study
participants who described themselves as liberals had friends who
were self-described conservatives, and 18 percent of
conservatives had friends who identified as liberals, the
researchers found.

Next, the researchers looked at how much news users saw in their
news feeds that didn't align with their political beliefs, dubbed
"crosscutting" content. News was classified as "hard" if it could
be considered national news, politics or world affairs, and
"soft" if it pertained to sports, entertainment or travel. Each
hard news story was assigned as liberal or conservative based on
the average political beliefs of the users who shared it.
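The article doesn't include the study's code, but the labeling rule it describes is simple enough to sketch. The Python snippet below is a hypothetical illustration only: the function names and the -1/+1 affiliation coding are invented, not the study's actual implementation.

```python
# Illustrative sketch of the labeling step described above: a hard-news story
# is scored by the average self-reported alignment of the users who shared it.
# The -1 (liberal) / +1 (conservative) coding is hypothetical.

def story_alignment(sharer_scores):
    """Average alignment of a story's sharers; negative means liberal-leaning."""
    return sum(sharer_scores) / len(sharer_scores)

def is_crosscutting(story_score, reader_score):
    """A story is 'crosscutting' when it leans opposite to the reader."""
    return story_score * reader_score < 0

# A story shared mostly by self-described conservatives:
sharers = [+1, +1, -1, +1]
score = story_alignment(sharers)                 # 0.5 -> conservative-leaning
print(is_crosscutting(score, reader_score=-1))   # True for a liberal reader
```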

This is where Facebook's ranking algorithm comes in. The
algorithm filters the content a user sees in his or her news feed
based on how often the individual uses Facebook, how much the
user interacts with certain friends and how often the user has
clicked on certain news-feed links in the past.
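Facebook's actual ranking model is not public, so any concrete version of it is guesswork. As a rough sketch, the three factors named above could feed a weighted score like the following, where the weights and feature names are invented purely for illustration:

```python
# Toy ranking heuristic built from the three factors the article names.
# The weights and feature names are invented; Facebook's real model is not public.

def feed_score(visit_frequency, friend_interaction, past_click_rate,
               w_visit=0.2, w_friend=0.5, w_click=0.3):
    """Higher scores rank a post higher in the user's news feed."""
    return (w_visit * visit_frequency
            + w_friend * friend_interaction
            + w_click * past_click_rate)

# A post from a frequently contacted friend whose links the user often clicks:
print(feed_score(visit_frequency=0.9, friend_interaction=0.8, past_click_rate=0.7))
# -> roughly 0.79, near the top of the feed
```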

After Facebook's ranking algorithm was applied, liberals saw
about 8 percent less conservative content than that shared by
their friends, whereas conservatives saw about 5 percent less
liberal content, the researchers found.

But the users' choice of what to read — based on the links they
clicked on — had a much larger effect on the amount of
crosscutting content users were exposed to. The researchers
estimated that the likelihood of conservatives clicking on a
liberal article in their news feed was about 17 percent, whereas
liberals would click on about 6 percent of the conservative
articles they saw. On average, Facebook users clicked on about 7
percent of the hard news presented in their feeds, the
researchers said.

Overall, Facebook's news feed algorithm produced about a 1
percent change in the proportion of news that challenged users'
political beliefs, while the users' own decisions about what
to click caused a 4 percent decrease in the proportion of such
content in their feeds. Thus, a user's choice of whether to read
crosscutting stories appears to be a much more important filter
than Facebook's algorithm, the researchers said in the study.
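A back-of-the-envelope calculation makes that comparison concrete. In the sketch below, the 10 percent baseline share of crosscutting stories is invented purely for illustration; only the roughly 1 percent and 4 percent figures come from the findings above, treated here as relative reductions applied in sequence:

```python
# Hypothetical worked example: apply the two filters in sequence as relative
# reductions. Only the 10% baseline is invented; the reductions are the study's.

baseline = 0.10                              # crosscutting share friends post (assumed)
after_algorithm = baseline * (1 - 0.01)      # news-feed ranking: ~1% reduction
after_clicks = after_algorithm * (1 - 0.04)  # user's own clicks: ~4% reduction

print(f"shared by friends : {baseline:.3f}")         # 0.100
print(f"after ranking     : {after_algorithm:.3f}")  # 0.099
print(f"after user clicks : {after_clicks:.3f}")     # 0.095
```

On these assumptions, a user's own clicking behavior filters out roughly four times as much crosscutting content as the ranking algorithm does, which is the comparison the researchers drew.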

But not everyone interpreted the findings that way.

Controversial methods

The results "conclusively show that Facebook's news-feed
algorithm decreases ideologically diverse, crosscutting content
people see from their social networks on Facebook by a measurable
amount," said Zeynep Tufekci, a sociologist at the University of
North Carolina at Chapel Hill, who was not involved in the study.

Comparing the effect of Facebook's algorithms on what content
readers view with the effect of users' own choices about what to
read "is like asking about the
amount of trans-fatty acids in French fries, a newly added
ingredient to the menu, and being told that hamburgers, which
have long been on the menu, also have trans-fatty acids," Tufekci
told Live Science. In other words, people's bias toward reading
news they agree with has long been known, but it's still worth
finding out whether Facebook's algorithms introduce additional
bias.

The researchers acknowledged that the study has limitations. For
one, the findings were limited to Facebook users, who may behave
differently from users of other social networks, such as Twitter,
and from the U.S. population in general. Also, Facebook
displays summaries of articles in the news feed, so users may be
exposed to some of that content without clicking on it, the
researchers noted. In addition, the study was limited to people
who self-identified their political affiliation, Tufekci pointed
out.

Nevertheless, the study's findings are notable and require
"continued vigilance," Lazer wrote in his commentary. "A small
effect today might become a large effect tomorrow, depending on
changes in the algorithms and human behavior."

In fact, on April 21, long after this study was conducted,
Facebook announced three major changes to its news-feed
algorithms, which aim to ensure that a user sees updates from
"the friends you care about," Lazer said. "It is plausible,
however, that friends that Facebook infers you to care about also
tend to be more ideologically aligned with you as well,
accentuating the filtering effect."

The findings come on the heels of a controversial study published
in June 2014, in which
Facebook removed positive or negative posts from hundreds of
thousands of users' news feeds — without the users' awareness —
to see if it influenced people's emotions. That study, published
in the journal Proceedings of the National Academy of Sciences,
caused a public outcry over what some perceived as unfair
manipulation of the site's users.