I'm a privacy pragmatist, writing about the intersection of law, technology, social media and our personal information. If you have story ideas or tips, e-mail me at khill@forbes.com. PGP key here.
These days, I'm a senior online editor at Forbes. I was previously an editor at Above the Law, a legal blog, relying on the legal knowledge gained from two years working for corporate law firm Covington & Burling -- a Cliff's Notes version of law school.
In the past, I've been found slaving away as an intern in midtown Manhattan at The Week Magazine, in Hong Kong at the International Herald Tribune, and in D.C. at the Washington Examiner. I also spent a few years traveling the world managing educational programs for international journalists for the National Press Foundation.
I have few illusions about privacy -- feel free to follow me on Twitter: kashhill, subscribe to me on Facebook, Circle me on Google+, or use Google Maps to figure out where the Forbes San Francisco bureau is, and come a-knockin'.

Now that the European Union’s member states are flailing around attempting to implement their miserable cookie directive, the European Commission has decided it’s a good time to further retard the Internet.

On Wednesday the European Commission released an already-leaked draft of its new Data Protection Regulation (the proposed replacement for the 1995 Directive), which firmly establishes a European right to data erasure, or “right to be forgotten.” Article 17 will give EU residents an unprecedented, inalienable right to control and delete facts that the subject once voluntarily communicated. Moreover, the right to erasure covers all publications of the personal information. As the preamble explains:

To strengthen the ‘right to be forgotten’ in the online environment, the right to erasure should also be extended in such a way that any publicly available copies or replications in websites and search engines should also be deleted by the controller who has made the information public.

The European Commission’s Vice President for Justice has clarified that the data deletion rule applies even to information that the data subject communicates herself on a public web forum like Facebook.

The right to be forgotten supposedly has some limits:

However, the further retention of the data should be allowed where it is necessary for historical, statistical and scientific research purposes, for exercising the right of freedom of expression, when required by law, or where there is a reason to restrict the processing of the data instead of erasing them.

But this exception is undermined both by the necessity language (when, exactly, is a single factum “necessary” for history or expression?) and by the downright draconian fines that are imposed for noncompliance. Article 79 instructs EU authorities to collect 1% of an enterprise’s annual revenue in fines for failure to comply with the right to be forgotten. Violation of other data rules could lead to fines of up to a million euros or 2% of a company’s global revenue.
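To put those percentages in perspective, here is a back-of-the-envelope sketch in Python. The 1% and 2%/€1 million figures come from the draft as described above; the firm and its revenue are entirely hypothetical.

```python
# Rough feel for the stakes under the draft's fine schedule.
# Percentages are from Articles 79's schedule as described above;
# the company and its revenue figure are made up.

def erasure_fine(annual_revenue_eur):
    """1% of the enterprise's annual revenue for failing to comply
    with the right to be forgotten."""
    return 0.01 * annual_revenue_eur

def other_violation_fine(global_revenue_eur):
    """Other data-rule violations: up to the greater of EUR 1 million
    or 2% of global annual revenue."""
    return max(1_000_000, 0.02 * global_revenue_eur)

revenue = 40e9  # hypothetical firm with EUR 40 billion in annual revenue
print(f"erasure fine:      EUR {erasure_fine(revenue):,.0f}")
print(f"other fines up to: EUR {other_violation_fine(revenue):,.0f}")
```

For a firm of that size, a single failure to scrub every copy of one person's data is a nine-figure event, which is why the "necessity" exceptions offer little comfort in practice.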

I am disappointed, but not surprised, to see the EU continue a misguided attack on the information economy. The right to be forgotten unequivocally favors the interests of the data subject, no matter how selfishly motivated, over the interests of data controllers and other consumers. Moreover, by making the right of erasure inalienable, the EU prevents its own citizens from participating in a business model that allows consumers to trade their information for stuff they want—convenience, discounts, or content. EU residents have no unencumbered information to sell.

The popular understanding in U.S. privacy discourse is that the EU does a better job protecting consumers from big corporations than the U.S. On the surface this looks right—after all, the regulations speak almost exclusively of “rights” granted to consumers, and almost exclusively of “obligations” imposed on business entities. But on closer inspection, Europe’s approach to information privacy is more emblematic of a desire to hold technology fixed, as if the amount of information people had about one another just before the advent of the Internet was the right amount for some reason. Sooner or later, the policies motivating the EU Data Protection Directive will prove to be counter-productive and regressive. The negative right against automated processing is a good example:

Every natural person should have the right not to be subject to a measure which is based on profiling by means of automated processing.

Article 20 gives every EU resident an absolute right to stop a company from using predictive analytics in employment, creditworthiness, or health decisions. (These examples come straight from the text of the new regulations.) This sounds like a good deal for data subjects until one thinks for a moment about who the losers would be. There are two possibilities. The opt-out might create a market for lemons, where a person’s decision to opt out serves as a reliable signal of some sort of problem. This quite obviously cuts against the goals of the opt-out right. Alternatively, opt-outs will simply muddy the predictive models for everybody. Credit, for example, will be extended to applicants who are slightly more likely to default. The lower-income applicants who might have looked more creditworthy in comparison will pay higher interest rates, if credit is extended at all. Hooray.
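The pooling problem in the credit example can be made concrete with a toy model. All of the numbers below are invented for illustration; this is a sketch of adverse selection, not anything drawn from the regulation itself.

```python
# Toy model: what happens to loan pricing when applicants opt out of
# automated profiling. All figures are hypothetical.
BASE_RATE = 0.03          # lender's baseline margin

low_risk_default = 0.02   # a creditworthy applicant's default probability
high_risk_default = 0.20  # a risky applicant's default probability

def individual_rate(p_default):
    """Rate when the lender can score the applicant individually."""
    return BASE_RATE + p_default

def pooled_rate(pool):
    """Rate when opted-out applicants are indistinguishable: everyone
    is priced at the pool's average default risk."""
    return BASE_RATE + sum(pool) / len(pool)

# With profiling, the creditworthy applicant is priced on her own risk:
print(round(individual_rate(low_risk_default), 4))                   # 0.05

# If both applicants opt out, the lender can only price the pool, and
# the creditworthy applicant ends up subsidizing the risky one:
print(round(pooled_rate([low_risk_default, high_risk_default]), 4))  # 0.14
```

The safe borrower's rate nearly triples once the lender loses the ability to tell her apart from the risky one, which is the "muddied model" outcome described above.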

Allegedly, one of the motivations for amending the directive is to make consumers feel more comfortable with e-commerce.

Lack of trust makes consumers hesitate to buy online and adopt new services. This risks slowing down the development of innovative uses of new technologies. Personal data protection therefore plays a central role in the Digital Agenda for Europe.

Complete and utter hogwash. And it’s old hogwash, too. Consumer mistrust and timidity have been trotted out as a threat to e-commerce for as long as there have been public opinion surveys about the Internet. The theory refuses to die despite ample evidence to the contrary. Now I am not suggesting that everything consumers do is per se good for them; law can and should occasionally force producers and service providers to take precautions that consumers would not choose to pay for on their own. Information asymmetries and optimism bias are among the justifications. But the claim that consumers are shying away from Internet commerce and services does not comport with actual consumer behavior.

Google and other major Internet companies might want to start coordinating a protest similar to the effective campaign we saw here in the States in response to SOPA. If Google makes every person with the first name “John” ungoogleable for a day, if online retailers refuse to access cookie data for a day, and if content providers double the amount of advertising for a day, pressure could build before the new rules come to a vote.

This is the worst article I’ve read in a long time. Mrs. Yakowitz clearly doesn’t understand European social dynamics. She might not be alone – just remember what happened to Walmart, successful everywhere, a failure in Europe – Walmart too failed to understand how Europeans and European markets function. If new Internet technologies such as cloud computing, online collaboration, etc. are to expand, then data protection laws are to be fixed to create confidence among end users.