
The Personal Data Conundrum

For a long time, if you had asked me what one thing I would change about most people’s relationships to their data, my answer would have been awareness.

What I meant by that was simple: I wanted people to know more about the roles data plays in their lives and in the world.

In my mind, that meant loads of things: understanding that digital footprints have long memories and lifespans, and that access to them brings the ability to infer things about people and relationships. It included knowing that companies like Facebook can make money off your data even while you still technically own all your content; understanding that data isn’t truth; knowing that photos can be, and are, photoshopped (even by institutions like governments); believing that the Department of Homeland Security has tracked and does track people who haven’t done anything wrong (though now it has greater means and a longer reach to do so).

It also meant knowing that personal data is the valuable currency that many corporations deal in. It involved accepting the fact that the things you knowingly and unknowingly produce have the ability to reach dizzying levels of dissemination, and the potential to live longer lives than you.

Banksy's Dismaland included a gigantic chart of surveillance in the UK that was both impressive and overwhelming.

That’s just a (small) sample of the sorts of things that I think everyone should know. My thought process was that if everyone knew and understood the data landscape (not just data literacy, but the whole picture), the potential for our data to be used against us would be significantly minimized.

Obviously, that was a rosy, idealized point of view. And lately I haven’t been feeling the same way.

Lately I’ve been getting stuck on something I like to call the Personal Data Conundrum. Now that you know that you leave data trails everywhere, what are you to do with that information? So you know that companies make money off of you. Now what? Your data is valuable in conjunction with others’; for most of us, it carries less value when considered individually. So opting out as one person doesn’t necessarily change the system itself. And a company like AT&T may be handing over data to the US government, but it’s not like switching to T-Mobile solves the problem. Either way, your data is still out of your hands (and now your cell phone service is even spottier).

Lots of activists, artists, and privacy enthusiasts have one answer: that you should protect yourself. Use Tor! Download Telegram! Get a VPN! Use masked email addresses! But those are all individual responses to a systemic issue, and on an uneven playing field, it’s inevitable that only those with the time, resources, and interest will adopt those measures. And even if that weren’t the case, the larger problems remain. Should you quit using a cell phone? Stop accessing public wifi? Refuse to use Google Maps, because even though it gives you directions it also combines your location data with the hundreds of other data points Google has on you? Our very ways of communicating are ingrained in these compromised systems. For a lot of people, opting out of being a data machine means opting out of feeling like you’re fully participating in society.

So now my sense is that though awareness is important, demanding (or even hoping for) everyone to be completely knowledgeable about and connected to their personal data is shortsighted and overly optimistic. Ignorance and indifference are completely reasonable responses to the realization of your lack of agency. Some people may think that if there’s nothing they can do about a situation, they’d rather not engage at all, and I can’t even blame them for that line of thought.

So there’s the dilemma. I think it’s important for people to understand and think about their data, but once they do, they run directly into questions of efficacy that I don’t have practical answers for.