In A Bubble? Change Your Friends!

Political scientists and social theorists have long fretted about the Internet’s potential to fragment and polarize democratic discourse. Exposure to news and opinion increasingly occurs through social media, and how social networks choose to filter and personalize our news feeds influences our exposure to perspectives that cut across ideological lines. Can we trust the algorithms they use? Or are they hiding relevant information from us?

In “The Filter Bubble”, Eli Pariser argued that news-filtering algorithms could significantly narrow what we know, surrounding us with information that tends to support what we already believe. On May 7, 2015, Science published a study (1) by Facebook employees that puts part of the filter bubble theory to the test by examining what content you do (and don’t) see on Facebook’s news feed:

Our latest research, released today in Science, quantifies, for the first time, exactly how much individuals could be and are exposed to ideologically diverse news and information in social media. (“Exposure to Diverse Information on Facebook”)

The composition of our social networks is the most important factor affecting the mix of content we encounter on social media, with individual choice also playing a large role. News Feed ranking has a smaller impact on the diversity of the information we see from the other side.

The backlash to the study was immediate. Scientists criticized it on two main grounds: the study was conducted on a small, skewed subset of Facebook users who chose to self-identify their political affiliation, and the authors appear to downplay the impact of Facebook’s algorithms: