The sun rises behind the entrance sign to Facebook headquarters in Menlo Park, Calif., on May 18, 2012. The social network is facing accusations that it removed stories on conservative issues and figures from its "trending" news feed, a controversy that reflects an increasing trend toward political polarization, some observers say.

At the 2006 White House Correspondents' Dinner, host Stephen Colbert famously asserted that “reality has a well-known liberal bias.”

His claim was in jest, but a former Facebook employee’s contention that the site’s “news curators” routinely omitted popular conservative news from its “trending news” feed has reignited a long-running debate about online news, media bias, and what political scientists say is a trend toward increasing political polarization.

As an increasingly primary news source for its 1 billion daily users, Facebook could significantly influence what is considered true in a US election year.

The allegations about the news curators – described by Gizmodo as a “small group of young journalists, primarily educated at Ivy League or private East Coast universities” – could further challenge the site’s longstanding claims of technological neutrality.

"Leaning Left"?

“I was really surprised,” says Jason Gainous, a professor of political science at the University of Louisville. “I hadn’t even thought about that possibility. I know their algorithm filters out based on user preferences, but the idea that they’re actually filtering out their trending stories – this is not good news for them.”

If it is occurring, such filtering could potentially alter the views of conservative users, some say.

“People tend to select information matching their political beliefs. If Facebook were systematically favoring one political perspective over another, then it would challenge this trend for those on one side of the political aisle,” writes Natalie Jomini Stroud, an associate professor of communication at the University of Texas at Austin who directs the Engaging News Project, in an e-mail to the Monitor.

The former Facebook news curator’s claim, which was contested by other curators interviewed by Gizmodo and The Guardian, sparked a firestorm of criticism from some conservative news sites. But the growing polarization of our news consumption may not require help from social media. Instead, it may be an outgrowth of the manner in which we consume our news, experts say.

With trust in government having peaked in the mid-1960s and belief in established information sources, including the news media, in decline, many Americans have become increasingly polarized in their political views and have self-selected into like-minded communities, says Bill Bishop, a journalist and author of “The Big Sort: Why the Clustering of Like-Minded America is Tearing Us Apart.”

Increasing dominance of online news

That "clustering" tendency may be further enabled by social networking sites, which continue to usurp broadcast news and newspapers as a key central destination for news. But there are some distinctions in how users seek out news online on different platforms.

A study from the Pew Research Center found that more than half of users of both Facebook and Twitter used the platforms as a news source for events beyond their friends and family.

But while Twitter is seen primarily as a tool for keeping up with breaking news and following users' favorite outlets, reporters, and commentators, Facebook functions more as a forum: its users were more likely to post and respond to content about government and politics.

“There is research suggesting that those selecting like-minded partisan media hold more polarized political views. It’s not clear to me whether the ‘Trending’ feature would have the same effect,” writes Stroud, the communication professor in Texas. “What may be more likely is that the ‘Trending’ feature influences what issues people believe are most important.”

Gaming the news feed, or just personal preference?

Accusations of bias may be sharpened by the fact that Facebook’s news feeds are heavily tailored to each user. The trending feed also differs in some ways from what users see on their personal news feeds, a Facebook spokesperson says.

Trending topics are generated through what users are talking about on the site, then “lightly curated” by Facebook’s review team, the company's spokesperson tells the Monitor.

“Popular topics are first surfaced by an algorithm, then audited by review team members to confirm that the topics are in fact trending news in the real world and not, for example, similar-sounding topics or misnomers,” writes Tom Stocky, Facebook’s vice president of search, in a post on the site on Monday.

Mr. Stocky also disputes a contention that the news curators artificially “injected” stories into the trending feed, including adding stories about the #BlackLivesMatter civil rights movement when they were not trending.

“Facebook does not allow or advise our reviewers to systematically discriminate against sources of any ideological origin and we've designed our tools to make that technically not feasible. At the same time, our reviewers' actions are logged and reviewed, and violating our guidelines is a fireable offense,” he writes.

Using data from more than 10 million users, researchers from the company found the site’s algorithm reduces so-called cross-cutting material – or content that runs counter to a user’s own political views – by slightly less than 1 percent. A user’s own “filter bubble” of friends, by contrast, reduces such content by about 4 percent.

“They’ve built a site that is profitable because it caters to people’s need to self-express and curate and refine their images and individual brands, and they do that within groups where they feel comfortable because everyone is like them. It’s the site for our time,” says Mr. Bishop.

Additionally, some users are making conscious decisions to attempt to influence what types of content will appear in their own news feeds.

Several “folk theories” – including a “Narcissus Theory” that users will see more from friends similar to them and a perspective that suggests Facebook is all powerful and unknowable – shaped how some users manipulated the site, says Karrie Karahalios, an associate professor of computer science at the University of Illinois at Urbana-Champaign.

Dr. Karahalios and several colleagues documented these folk theories in a recently published paper after giving users access to an interface that disclosed “seams” providing hints into how Facebook’s algorithm works.

“We found that it got people thinking a little bit more, and it got them to try things on Facebook that they wouldn’t have thought of before. They had a bit more knowledge, and they had a tool set available to them that they could put into action in their news feed,” she says.

Editor's note: This article originally misstated the title of Jason Gainous at the University of Louisville.