Why data privacy is a human right: in conversation with Aurélie Pols

Aurélie Pols’ professional mantra is “Data is the New Electricity, Privacy is the New Green, Trust is the New Currency.” One of the pioneers of data analysis in Europe, Aurélie was recognized by the Digital Analytics Association as Most Influential Industry Contributor for her work in data privacy. She leads her own consultancy, Competing on Privacy, serves as DPO for the NY-based CDP mParticle, and is part of the European Data Protection Supervisor’s Ethics Advisory Group.

We spoke with Aurélie about her thoughts on digital identity in 2018.

FWD50: How would you define digital identity today?

Aurélie: A dangerous mismatch between our real personalities and a potential reflection of our selfhood that can bring about problems. We need to address that before it gets worse. That is kind of how I see it for the moment.

FWD50: So it’s dystopian in a sense?

Aurélie: It’s partially dystopian; it’s partially positive as well. It’s about this intermediary state that we’re in today. We need to make sure the accountability obligations of the different actors within the system are also taken into consideration. For the moment there are still a lot of companies optimizing for themselves, but not for the ecosystem.

One of the problems is that most systems take typical use cases into consideration, but not outliers. Problems arise in very specific cases that weren’t imagined. We need to get savvier, understand these specific use cases, address them as fast as possible, and integrate them into our systems.

The French like to say “I am not my data” because they understand, through European history, that we’ve had some interesting issues with data. They are very careful to say: yes, digital identities reflect parts of me, but they cannot manage my life. We should be very careful about that.

FWD50: We started having a digital identity before there were any kinds of policies or big picture understandings about the implications.

Aurélie: We’re still learning about the implications. What are the consequences of technology on our lives? How is this changing our values, our ways of living, ecology, etc? There are so many things around us that are being influenced by technology today.

We need to discuss the philosophy behind this so that we know what we want to optimize as a society — for the economy overall, not just one actor serving a specific individual with a specific technology, but the whole data ecosystem. We need to broaden the scope. This means we need collaboration and that collaboration does not currently exist.

FWD50: A lot of people who are collecting data are doing so for their own needs and purposes. They are not necessarily thinking about how there’s more than a business need behind this.

Aurélie: Yes, absolutely. Which also means that there’s a lack of standards. To be honest, if we’re talking about a European perspective — we are also to blame. For example, the European Commission only started asking questions at the beginning of this year in terms of certifications and standards. You’re like: “Hey, the GDPR is in five months!” But we’ve been asking these questions for the past three years.

It’s a bit like how accounting standards for financial information now exist. When money passes from one company to another, it falls under the logic of accounting systems, and something similar should be built for data as well, so that we can go back and say: “here’s the problem, and we can fix it there.” Then the fix can be replicated through the entire data ecosystem, and we can make sure we learn lessons when something goes wrong.

FWD50: There needs to be a certain amount of integrity that’s lacking right now. A lot of organizations are not really looking out for the human beings that are giving up their data. How do you give a corporation that kind of integrity?

Aurélie: This is obviously a risk exercise for each company. I worked with the European Data Protection Supervisor on a white paper on digital ethics. We compared the more Anglo-Saxon idea of “move fast & break things” with the adage “Festina lente” (make haste slowly). We have to make haste, slowly. This is what I’m seeing today: companies move very fast, do scary things and reap short-term benefits, but then pull back because they understand that it’s not sustainable. Companies that go slower but build more carefully make sure they go in the right direction.

Which one of the two are you? How you position yourself depends on your company’s ethics, but also on your financing. Financing is what usually breeds the ethos of a company. Money still runs a lot of the world, and these are some of the consequences. It has effects on data, because the technology is there, and then it will have consequences on our lives and digital identities. So there are a lot of challenges.

FWD50: What are your hopes for the future of data integrity and personal digital identity?

Aurélie: I look forward to more understanding by individuals, both in terms of data flows and information. I hope the mechanisms put in place by legislation like the GDPR will allow companies in the data sphere to be more aware and have further discussions about what it means to be compliant and what it means to be ethical — moving away from “let’s collect everything and see what we do with it” toward “let’s collect what we need, because we need to justify it.”

It’s not just data management. It’s making sure that the information we have is used in a way that is beneficial to everybody and not just the company. I love data. I believe in data. I think it should drive a lot of things, within respectful limits. I think government has a role to play here because it’s a societal issue; it’s not just about data optimization and things like that. What worries me the most is: if we don’t get this right, what are the consequences for the next generations? We can already feel it.

FWD50: Absolutely. Who is doing it right in 2018, who is the model?

Aurélie: Ha. Who do you trust? I think the model is not there yet; we still need to invent it, and to do that we need to work together. We also shouldn’t trust blindly. I wouldn’t say that I trust one company, one entity, or one government more than another. It’s a complex problem. That said, I would tend to say that I trust Facebook more than Google.

FWD50: Oh, that’s really interesting.

Aurélie: Because of the shareholder structure. They don’t have the same incentives. Whether they get it right or not is something else, but my money would be on Facebook. But take something like differential privacy: you can’t audit that. It’s interesting, because they promise to do the right thing. But because they put noise into the data flows, how do you make sure? You can’t. There’s a problem of checks and balances there. I like the theory, but I can’t verify the practice.

What I usually pass on to data scientists is: don’t read the GDPR or the California Consumer Privacy Act; read the Charter of Fundamental Rights of the European Union, which talks about non-discrimination, the prohibition of torture, the right to dignity… That, not compliance with some specific law, is what should drive digital identities. It’s the high-level human rights of individuals that determine how we build in the right direction, toward the society that we want. I think that’s a good baseline for digital identity.