Saturday, May 28, 2011

Eli Pariser's Filter Bubble Problem

In his talk (see video below), Pariser discusses how the major services that deliver information to us on the internet are increasingly, invisibly, and algorithmically filtering and shaping the content we see. Google, for example, reportedly uses 57 signals to personally tailor your search results -- even if you aren't logged in. Have you ever noticed how Facebook, by default, only shows News Feed activity from the friends you interact with most?
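To make the idea concrete, here's a deliberately toy sketch of what signal-based personalization amounts to -- not Google's or Facebook's actual algorithm (those are proprietary), just an illustration of how ranking content against a user's inferred interests quietly hides everything that doesn't match:

```python
# Toy illustration of a personalization filter. The scoring rule and
# cutoff are invented for this example; real systems use far more
# signals (Pariser's "57") and far more complex models.

def personalize(items, interests, cutoff=1):
    """Score each item by overlap with the user's interest signals.

    Anything scoring below `cutoff` is silently dropped -- the user
    never learns that it existed.
    """
    scored = []
    for item in items:
        score = sum(1 for tag in item["tags"] if tag in interests)
        if score >= cutoff:
            scored.append((score, item["title"]))
    scored.sort(reverse=True)
    return [title for _, title in scored]

feed = [
    {"title": "Local sports recap", "tags": {"sports", "local"}},
    {"title": "Foreign policy analysis", "tags": {"politics", "world"}},
    {"title": "New gadget review", "tags": {"tech", "gadgets"}},
]

# A user whose clicks signal interest in tech and sports simply never
# sees the foreign policy piece -- it is filtered out, not demoted.
print(personalize(feed, {"tech", "sports"}))
```

The unsettling part is the last line: the filtered-out story doesn't appear lower in the list, it disappears entirely, which is exactly the invisibility Pariser is warning about.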

Personalization wraps us in an isolating bubble of information specifically aligned to our tastes. It sounds logical, even appealing, but we risk losing serendipitous exposure to new perspectives and ideas that challenge us. We never get to see what gets filtered out. We wind up viewing only what (the algorithms conclude) we want to see, not necessarily what we need to see.

Algorithmic curation, Pariser argues, needs to be transparent enough that we can consciously exert a measure of control over the filtering process. Mere relevance is insufficient. A good, rich flow of information introduces us to new people and uncomfortable ideas -- and that exposure is critical for democracy.

What do you think about the Filter Bubble? Are you content with what you see on the internet? As we grapple with the unfathomably large data sets and information streams our modern age generates, some degree of filtering is essential to make sense of it all. This ongoing tension will prove critical in shaping how we see, understand, and interact with the world at large.