Your Own Facts

“Just Google it!” has become a common cyber-snobbish response to questions that seem too trivial to merit a human conversation. But is it really an answer? Now that more and more Internet sites are tailoring their services to the idiosyncrasies of individual users, queries for “climate change,” “stem cells” and even “pizza” may yield different outcomes for different people. This may be an era when we are increasingly entitled to our own facts — but should we also be entitled to our own search results?

Google looks to your previous queries (and the clicks that follow) and refines its search results accordingly. If you click on gossip blogs like Gawker rather than Netflix after searching for the names of movie stars, links to Gawker may feature more prominently. Likewise, if you have hundreds of Facebook friends, you see relevant updates only from the closest of them; Facebook relies on your earlier interactions to predict what, and who, is most likely to interest you. Thus, if you’re a conservative who clicks on links only from other conservatives, you may never see updates from your liberal acquaintances, even if you do “friend” them.

Such selectivity may eventually trap us inside our own “information cocoons,” as the legal scholar Cass Sunstein put it in his 2001 book “Republic.com.” He posited that this could be one of the Internet’s most pernicious effects on the public sphere. “The Filter Bubble,” Eli Pariser’s important new inquiry into the dangers of excessive personalization, advances a similar argument. But while Sunstein worried that citizens would deliberately use technology to over-customize what they read, Pariser, the board president of the political advocacy group MoveOn.org, worries that technology companies are already silently doing this for us. As a result, he writes, “personalization filters serve up a kind of invisible autopropaganda, indoctrinating us with our own ideas, amplifying our desire for things that are familiar and leaving us oblivious to the dangers lurking in the dark territory of the unknown.”

Pariser wants companies to become more transparent about their filtering practices and to introduce more diversity into their search results and recommendations. (If Amazon thinks you’re a reader of crime novels, he wants it to more actively recommend choices from other genres.) Governments, he writes, ought to regulate the new information intermediaries proactively and ensure that people have full control over their data. And citizens, he argues in a somewhat romantic vein, should not be content as mere passive recipients of tweets, pokes and bytes; they should aspire to become what some Internet scholars call “information flâneurs,” treading the unbeaten paths in cyberspace and defying the narrow categories stealthily assigned to them by Web services.

Personalization is a cause for concern. For one, it is possible only because Web sites can gather huge amounts of information about their users, creating a Big Brother-like infrastructure for surveillance — a privacy nightmare. In authoritarian states, personalization may also increase censorship: the algorithms that determine what ads to show you might also guess what news not to show you.

Illustration by Ed Nacional

While Pariser does discuss the privacy dimensions of personalization, he is most concerned with its political and social implications, and particularly with what he believes to be its high toll on serendipitous discovery. Alas, he does not always treat this issue with the nuance it deserves. For all their sins, Google and Facebook do allow users to turn off most of their filters and return to the unpersonalized Web in a matter of seconds, something “The Filter Bubble” inexplicably doesn’t mention. Forcing Google to be more open about its algorithms — one of Pariser’s suggestions — may also hurt innovation in search; the company may be right to treat its algorithms as a trade secret. Scholars have been debating this issue, as well as the political ramifications of search engine bias, for more than a decade. Unfortunately, Pariser glosses over most of the relevant arguments from cyberlaw, information science and economics, relying on entertaining anecdotes from popular psychology instead.

Nor is it clear whether personalization will replace the unfiltered Web, or simply augment it. Will it be offset by innovations that facilitate exposure to unsought information? The advent of e-readers, for instance, has allowed people once again to experience the “completeness” of the full editions of newspapers and magazines, something that had been difficult to replicate in the realm of digital content. Personalization may also protect the ecology of cyberspace: when search results are tailored, the incentives to game the system and invest in practices like “search engine optimization” to push one’s products and ideas to the top of the universal ranking become weaker. To truly understand whether personalization is a threat or a blessing, we need a more holistic and dynamic account of the online landscape, something Pariser does not provide.

Most important, personalization’s effects on serendipity are far more ambiguous than “The Filter Bubble” suggests. Lacking a stable working definition of serendipity, Pariser sometimes equates it with randomness, sometimes with unexpected exposure to new ideas. But serendipity is a subjective concept that cannot be understood in isolation from the searcher’s own quirkiness and previous search history. By knowing which Web sites you like to visit and bookmark, a search engine might immediately point you to useful links that could otherwise get lost on Page 99 of unpersonalized search results. (In a 2009 study of search habits that tested this proposition, researchers for Microsoft found that “rather than harming serendipity, personalization appears to identify interesting results in addition to relevant ones.”) Building on Louis Pasteur’s observation that “chance favors the prepared mind,” one could see how personalization might augment serendipity by helping us maximize our own preparedness.

The book’s most provocative implications stem from Pariser’s utopian belief that Internet companies could, and should, be more than just information utilities facilitating search, communication and shopping. What if one day Google could urge us to stop obsessing over Lady Gaga’s videos and instead pay attention to Darfur? On closer examination, however, Pariser’s true concerns seem to be less about the sheer diversity of our information flows and more about the future of our political and cultural literacy. The absence of Lady Gaga links in Darfur-related searches does not seem to bother him as much as the opposite situation. But how can companies infer and promote such a pecking order of significance? Do we even want Google, Facebook and Amazon (or rather their algorithms) to direct us to pages they “think” we should visit to grow spiritually or intellectually?

Unlike such human filters as critics and editors, algorithms do not “think” — they compute. And while computing the “is” (i.e., relevance) is something they can accomplish, computing the “ought” (i.e., our information duties as citizens) is a much more contentious and value-laden process that is also made impossible by the limitations of artificial intelligence. This is not to deny that Silicon Valley engineers, as Pariser argues, have responsibilities that extend far beyond their job descriptions. But their modest quests to improve relevance, alleviate information overload and suggest books that may interest us — rather than to engage in algorithmic paternalism and assume a more critical social role — may be the lesser of two evils.

Although Pariser’s conclusions and prescriptions are not wholly convincing, he is to be commended for reinvigorating the conversation about the dangers of online personalization. And “The Filter Bubble” deserves praise for drawing attention to the growing power of information intermediaries whose rules, protocols, filters and motivations are not always visible. But whether we should demand more substantial civic commitments from these intermediaries remains open to debate.

THE FILTER BUBBLE

What the Internet Is Hiding From You

By Eli Pariser

294 pp. The Penguin Press. $25.95.

Evgeny Morozov’s most recent book is “The Net Delusion: The Dark Side of Internet Freedom.”

A version of this review appears in print on June 12, 2011, on Page BR20 of the Sunday Book Review with the headline: Your Own Facts.