Dotting the "i"s in Identity

Monthly Archives: February 2013

Yesterday’s NYT article on US/EU consumer data protection, interviewing Commissioner Viviane Reding, has sparked interesting comment and debate, so – with thanks to @omertene for the link – and since the US and the Eurozone at least share a currency unit, here’s my 2 cents:

Richard Thomas (former Information Commissioner in the UK) described the trans-Atlantic privacy perspective as being based on feelings of “suspicion, ignorance and superiority”… before clarifying, of course, that this was true regardless of which side you started from. However, even if the feelings are similar and mutual, the reasons which underlie them are frequently asymmetric, occasionally fundamental, and possibly in some cases irreconcilable. And if you think that’s a sweeping generalisation, let me warn you that there are more to come – shot through with my opinions and questionable inferences, yes, but mostly with some basis in experience, if not objective fact.

I want to look at two main areas: the role of “rights” in privacy regulation, and the role of commercial interest.

The role of rights

At its core, the EU’s privacy regime is based on the concept of a fundamental right to the preservation and integrity of privacy – or, as linguistic differences sometimes oblige us to put it – the quiet enjoyment of private and family life, or respect for the ‘private sphere’. The EU’s assertion that privacy is a ‘fundamental right’ often provokes a reaction of some mistrust in US counterparts I’ve spoken to. In their view there’s something suspiciously… well, socialist about it, frankly. It’s a gut feeling of unease, more than anything else; an instinct that this insistence on “rights” is a bit hippy, and rather too likely to lead to people abjuring all kinds of responsibilities too.

The corresponding EU perspective is often to be rather offended by this disparaging view of “rights” – after all, aren’t they a rather noble social aspiration? A sign of civilised progress up Maslow’s hierarchy and away from the base instincts of the market?

And here’s one of the fundamental splits. The EU, for its part, does place a lot of faith in “rights” as the basis for laws that have to apply across a diverse set of cultures, legal traditions and social norms. The US – at least from an EU perspective – puts its faith in the market and in the ability of everyman to sue. Right or wrong, the impression is that in the EU, your right to redress arises out of violation of a right, whereas in the US it arises out of a tort.

The primacy of commerce

There is also a trans-Atlantic tension over how much privilege to give to commercial interests. Again, at risk of sweeping generalisation, here are the two high-level perspectives: the EU views the US approach as prepared to write off almost any privacy-related behaviour, provided it stimulates economic activity. Commercial interest trumps all, and the use of personal data for commercial ends is, in principle, a Good Thing. This is reflected in various characterisations of personal data as a monetisable commodity, or even “the new oil”.

The idea that the commercial value of personal data is its only relevant value, though, generates unease in the European psyche. In particular, it underlies European mistrust of ‘voluntary codes of conduct’ as the sole or principal constraints on privacy-related commercial activity. Commercial data processors, the argument goes, simply have too many irresistible incentives to ignore such self-regulation. If the only risk arising from commercial data exploitation falls on the data subject, then only regulation can keep data processors honest: market forces won’t do the job.

Conversely, US counterparts often see the EU as insisting too much on principle, whether or not it does any good – and with the implication that it often does harm. In particular, attempts to constrain commercial data use through privacy principles are seen as a brake on innovation and thus on economic activity. There are two European ripostes to this argument:

First, they may say, regulatory constraints do not prevent the exercise of commercial innovation – indeed, ingenuity thrives on constraint, and can be relied on to find its way round obstacles in pursuit of a commercial goal.

Second, the correct way to regulate innovation is to define, pre-emptively, the principles it must respect, and then let it run. Waiting until after the event and then relying on the courts to put the privacy toothpaste back in the tube is, in this hyper-connected age, to take irresponsible liberties with the interests of the individual and consumer.

The US counter-blast is that economic activity is the interest of the individual and consumer, and that fine and fancy privacy constraints are an unaffordable luxury if your economy just isn’t cutting it in the global market.

Again, I suspect this is an EU-US fissure that runs deep and can be bridged or papered over, but probably not closed.

In the interests of (relative) brevity, I have not touched on a couple of other significant areas: (i) the “homogeneity” of EU privacy regulation, and why that is often over-stated; (ii) the implications of relying on “harm” as a metric for privacy redress. The hope is that this post will be the first in a short series, and that we will also get informed comment from US, Continental European and Asia-Pacific contributors… And with that, over to you…

Well, the annual Data Privacy Day (Jan 28th) has been and gone: how was it for you? Did you follow through on your brief flurry of privacy-related good intentions? Delete your Facebook account? Update your browser privacy plug-ins? (By all means, read these questions in ascending, Stewie Griffin style if it helps set the mood…). “No? Well… you’ve been knocking yourself out… you deserve a break”. ;^)

So, in my supportive way, here’s something much simpler you can do in a few easy, 5-minute bursts. My Internet Society colleagues have launched some online tutorials to help people understand and manage their online privacy, and I hope you will help spread the word. But first, a short digression about why we’re doing this.

For a while now, I’ve been grumbling about the phrase “personally identifiable information”. It has been a useful expression, but I think it has outlived its usefulness, and is now actually constraining our thinking about how to achieve good privacy outcomes for data subjects.

We need to move away from the idea that privacy is achieved by managing lists of the pieces of data that count as “personally identifiable” and ignoring the pieces that aren’t on the list. Privacy is more subtle and contextual than that, and can be impacted by data that is not currently on anyone’s list of PII.

I want to start re-defining PII as “privacy-impacting information”.

If the explosion of social, mobile and cloud-based services has taught us anything, it is that there is money to be made out of ‘big data’, especially when it can be mined for information about individuals’ behaviour, preferences and aspirations. So, if you thought your behaviour, preferences and aspirations were none of anyone else’s business, think again – they are, literally, someone else’s business.

The information in the “information economy” is… us.

We are being bought, sold, and traded in an economy whose workings are almost entirely opaque. Every time we go online, we add to a personal digital footprint that’s interconnected across multiple service providers, and we enrich massive caches of personal data that identify us, whether we have explicitly authenticated or not. Your digital footprint is invisible to you – and it’s really hard to manage something you can’t even see.

That may make you feel somewhat uneasy.

So, here are some simple and realistic privacy steps for all of us. Try them – you’ll feel better:

First, let’s revisit our assumptions about the online “bargain”

Online transactions (whether retail or social) are seldom a two-party affair these days. Who else is in the transaction chain? Does a social networking service see all your private messages to your buddies? Is a retailer selling your purchase history to advertisers?

There’s no such thing as a free service. More often than not, we pay by giving up information about ourselves, without appreciating its value. Understanding the bargain is key.

Often these hidden payments fundamentally change our online experience. Yes, you may get personalised recommendations – but are you also being offered (or denied) services because of data the service provider passed on to a third party? Are you being offered higher prices because of the brand of laptop you use?

Second, let’s take time to reflect on our privacy values

Our behaviour is driven by our values. When we value convenience over privacy, we set our priorities accordingly. Until we adjust the value we place on privacy, the steps we can take to preserve it will continue to seem like an inconvenience to be put off until later.

Third, let’s take some small practical steps in the right direction

The first step towards protecting our digital footprint is to learn more about it.

The Internet Society has developed three interactive tutorials to help everyone learn more about their digital footprint. Each lasts about 5 minutes and is aimed at helping all of us become more aware of how we disclose information and how we can keep it more private. Please take a look, and forward them to people you think would find them useful.

After all, if we are the currency of the new economy, shouldn’t we have a say in what we’re worth?

