Policy Planner

February 11th, 2015

Practical obscurity and the right to be forgotten: “pretty much” privacy is enough

Rutgers University Law Professor Ellen Goodman looks at attempts to extend the right to be forgotten beyond Europe and argues that a person’s ‘right to be forgotten’ should not be a right to completely erase information, but instead a right to ‘practical obscurity’.

In 1989, the U.S. Supreme Court recognized an individual interest in the “practical obscurity” of certain personal information. The case was DOJ v. Reporters Committee for Freedom of the Press. The Court held that FBI rap sheets, maintained on tens of millions of people and compiled from public records, could be withheld under the Freedom of Information Act (FOIA) because, while the underlying information was not private, the compiled records were not easily available. The decision was decried by free speech advocates because it seemed to expand the privacy exemption to Freedom of Information disclosure. But the principle of the case today seems useful to both free speech and privacy advocates, and particularly useful in the right to be forgotten context.

In May 2014, the European Court of Justice ruled that Europeans have the right to request that Google and other search engines remove links from search results on their names that are inaccurate, irrelevant, no longer relevant, or excessive. In responding to these requests, the search engine must balance individual privacy interests against the public interest in access to information. Google has at this point reviewed more than 750,000 links requested for removal, and has removed close to 60% of them. But Google has only tinkered with searches conducted on European domains such as .uk, .fr, and .de. The .com domain results remain unaltered.

The fear of EU over-reaching is that an aggrieved businessman in Milan can control what an interested citizen or journalist sees in New York. More than that, it’s the fear that EU over-reaching will spread to Pakistan and Russia, and that these countries’ strategies for suppressing information will be exported to the global Internet.

On the other hand, the EU is surely right that blocking links only on European domains is weak protection. The “disappeared” information is easily found a click away by searching through a .com domain. The question really is whether a person’s “right to be forgotten” is a right to completely erase information deemed unjustifiably harmful (by a private company without public oversight… but that’s a different issue).

Or is it the person’s right to return to the “practical obscurity” that existed before her reputation was defined by a Google search? 95% of Google searches conducted in Europe go through European domains, and a high proportion of the remaining 5% are conducted by travelers and expats. So blocking access from European domains pretty much does the job. It protects the person who did something stupid or was associated with the wrong people from being “that guy” to the casual searcher. “Pretty much” is all that the right to be forgotten should protect. It should protect practical obscurity.

In another post, I’ll explore the implications of practical obscurity for the third-party doctrine and the evolution of privacy law. It provides a promising middle ground between private and public, allowing for a more context-sensitive treatment of personal information.

This post was originally published on Medium.com and is reposted here with the author’s permission. It gives the views of the author, and does not represent the position of the LSE Media Policy Project blog, nor of the London School of Economics and Political Science.