How the Internet Reinforces Confirmation Bias

Recently I wrote about confirmation bias in connection with the climate change controversy — see my article at ThomasNet, “All This Wrangling Over Climate Change – What’s Up With That?” The Skeptic’s Dictionary refers to confirmation bias as “a type of selective thinking whereby one tends to notice and to look for what confirms one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs.”

Today I ran across an interesting TED Talk (TED hosts and posts video talks on innovative topics) by political activist Eli Pariser, who has some thought-provoking things to say about how the algorithms used on websites such as Facebook and Google tend to reinforce our current thinking and filter out new ideas — see his talk, “Beware Online ‘Filter Bubbles’” — well worth watching, at only nine minutes.

Pariser explains what he means by a filter bubble:

Your filter bubble is kind of your own personal, unique universe of information that you live in online … the thing is, you don’t decide what gets in, and more importantly, you don’t actually see what gets edited out.

If you and I both search for the same thing at the same time on Google, for example, we get different results. The danger of the filter bubble, says Pariser, is that

this moves us very quickly toward a world in which the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.

He suggests that a personalization algorithm deciding what to show us needs to look not just at what it thinks is “relevant,” but at other factors too, such as those in this slide from his presentation:
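To make Pariser's suggestion concrete, here is a minimal sketch of what a feed-ranking function might look like if it blended "relevance" with other signals instead of optimizing for relevance alone. All the item names, signal names, and weights below are invented for illustration; real personalization systems are far more complex.

```python
# Hypothetical illustration of Pariser's point: a ranking score that
# blends "relevance" (what the user already likes) with other signals.
# Every name and weight here is made up for the example.

def rank_items(items, weights):
    """Sort candidate items by a weighted sum of their signal scores."""
    def score(item):
        return sum(weight * item[signal] for signal, weight in weights.items())
    return sorted(items, key=score, reverse=True)

candidates = [
    {"title": "Echoes your views",
     "relevance": 0.9, "importance": 0.2, "challenging": 0.1},
    {"title": "Challenges your views",
     "relevance": 0.4, "importance": 0.8, "challenging": 0.9},
]

# Relevance-only ranking: the agreeable item wins.
by_relevance = rank_items(candidates, {"relevance": 1.0})

# Blended ranking: importance and challenge pull the other item to the top.
blended = rank_items(candidates,
                     {"relevance": 0.4, "importance": 0.3, "challenging": 0.3})
```

With relevance as the only signal, the item that echoes the user's existing views ranks first; once weight is given to importance and challenge, the item that contradicts those views rises to the top — which is exactly the kind of editorial judgment Pariser argues the algorithms should encode.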

This seems like a great insight. Anyway, I highly recommend this short video to get you thinking outside the box: