Civil rights implications of Big Data

An excellent editorial by Alistair Croll on the civil rights implications of Big Data contains a number of points I hadn't considered before, as well as great analysis of how the Big Data situation came about:

“Personalization” is another word for discrimination. We’re not discriminating if we tailor things to you based on what we know about you — right? That’s just better service.

In one case, American Express used purchase history to adjust credit limits based on where a customer shopped, despite his excellent credit history:

Johnson says his jaw dropped when he read one of the reasons American Express gave for lowering his credit limit: “Other customers who have used their card at establishments where you recently shopped have a poor repayment history with American Express.”

We’re seeing the start of this slippery slope everywhere from tailored credit-card limits like this one to car insurance based on driver profiles. In this regard, big data is a civil rights issue, but it’s one that society in general is ill-equipped to deal with.

We’re great at using taste to predict things about people. OkCupid’s 2010 blog post “The Real Stuff White People Like” showed just how easily we can use information to guess at race. It’s a real eye-opener (and the guys who wrote it didn’t include everything they learned — some of it was a bit too controversial). They simply looked at the words one group used which others didn’t often use. The result was a list of “trigger” words for a particular race or gender.
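OkCupid never published its exact methodology, but the basic idea — rank words by how much more often one group uses them than everyone else — is easy to sketch. The function below is a toy illustration of that frequency-ratio approach; the smoothing and thresholds are my own assumptions, not OkCupid’s.

```python
from collections import Counter

def distinctive_words(group_docs, other_docs, min_count=2, top_n=5):
    """Rank words by how much more often one group uses them
    than everyone else. A toy frequency-ratio sketch, not
    OkCupid's actual (unpublished) method."""
    group = Counter(w for doc in group_docs for w in doc.lower().split())
    other = Counter(w for doc in other_docs for w in doc.lower().split())
    g_total = sum(group.values()) or 1
    o_total = sum(other.values()) or 1
    scores = {}
    for word, count in group.items():
        if count < min_count:  # ignore rare words; too noisy to rank
            continue
        # Add-one smoothing so words the other group never uses
        # don't cause a division by zero.
        ratio = (count / g_total) / ((other[word] + 1) / (o_total + 1))
        scores[word] = ratio
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

On real profile text, the top of this ranking is exactly the list of “trigger” words the post describes — terms that strongly signal membership in one group even though no one typed their race into a form.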

19 Responses to “Civil rights implications of Big Data”

We live in a “post-privacy” society. We’ve known that for a little while now.
We also live in an unjust and oppressive society, where combinations of class, race, gender, nationality, etc. cause some people to be treated worse than others. We’ve known that for a long time now.

But the intersection of these observations is important. As they used to say “On the internet, nobody knows you’re a dog”…what will oppression and injustice look like in a society where any reasonably powerful institution knows everything about you everywhere all the time?

I don’t know; we live in a post-anonymity world, but privacy is a contract. Privacy isn’t “no one knows,” privacy is “no one looks.” Like– when I take a shower, I’m naked in the shower. That isn’t a secret. You could just open the door. Even if I lock it, come on, a bobby pin can open that lock, it hardly counts. But you DON’T open the door, because…privacy! I think there is room for privacy in the modern world, even if anonymity is gone.

Interesting—”privacy isn’t what they know, it’s how they act on it” is certainly true in this world. It’s like in the village millennia ago: everyone knew you, and likely your secrets, through the thin walls of the hut. But if they didn’t act differently, it didn’t matter.

I was listening to NPR yesterday in Boston, and they were talking about Logan Airport’s screening practices, and involuntary discrimination. It’s something innate in humans to behave differently if we have subconscious biases.

But as Jonathan Haidt so eloquently explains in The Righteous Mind, our conscious brain is like a lawyer for our moral reasoning, grabbing hold of any cue or clue to defend our reactions after the fact. The issue here is that Big Data might give that internal lawyer a whole bunch of “case law”—seemingly just, reasonable, scientific explanations that are based on predictions about a person, but aren’t accurate.

Oh, I’m not trying to let anyone off the hook here– I think the case for institutional racism is clear & present on basically every stratum of society– but rather pointing out what I hope is a path for reconciliation, at a broader level. I think people are culpable for their bias, even when it isn’t “on purpose,” yes. I think that privacy, at a top-down level, can be a tool used to alleviate that.

Good article. Progressive Insurance, linked in Croll’s article, already asks for one’s gender, so obviously this is just about race. I don’t begrudge certain businesses for engaging in risk management – provided it’s not capricious. The service I’d pay for is personal privacy management – as long as I don’t have to submit an application.

I’ve always done my best to avoid all that personalization crap. Yes, partly because I avoid giving out my info, but there’s another reason that is important to me. I like getting odd things. I don’t like some other person or computer deciding what I should be seeing. Surprise me! There’s a ton of really cool and wild stuff out there and I don’t want to miss it because some algorithm decides I might not like it.

I happened to click on an Indochino ad once. Now I see it every goddamn place I go on the internet. I also happened once to look at security cameras after my neighbor had his house burglarized. Now Google thinks I should see security camera ads everywhere I go. This is making advertising stupider and less effective.

The story is about a private financial institution adjusting someone’s credit limit. Amex has the right to use whatever data they have to adjust credit limits. Consumers have the right to not do business with them if their products are shitty. This is hardly a civil rights issue.

Re credit ratings, the current system is busted. The fact that you have to repeatedly borrow and repay to become creditworthy encourages risky behavior and keeps many otherwise deserving people from getting access to credit (they can’t borrow in the first place, or don’t want to borrow often). If diverse data were rolled up to create credit ratings such that ratings were better reflections of risk, more people would be able to access housing and other products that require loans.