Archive

All they have to do is write to journalists and ask questions. And what they do is they ask a journalist a question and be like, “What’s going on with this thing?” And journalists, under pressure to find stories to report, go looking around. They immediately search something in Google. And that becomes the tool of exploitation.

One of the things that I think is really important is that we’re paying attention to how we might be able to recuperate and recover from these kinds of practices. So rather than thinking of this as just a temporary kind of glitch, in fact I’m going to show you several of these glitches and maybe we might see a pattern.

I’ve experienced firsthand the challenges of trying to correct misinformation, and in part my academic research builds on that experience and tries to understand why it was that so much of what we did at Spinsanity antagonized even those people who were interested enough to go to a fact-checking web site.

The question is what are we doing in the industry, or what is the machine learning research community doing, to combat instances of algorithmic bias? So I think there is a certain amount of good news, and it’s the good news that I wanted to focus on in my talk today.

In 2011, the cultural critic Emily Nussbaum reflected on the flowering of online feminism through new publications, social media conversations, and digital organizing. But Nussbaum worried, even if you can expand the supply of who’s writing, will that actually change the influence of women’s voices in society? What if online feminism was just an echo chamber?

We have to ask who’s creating this technology and who benefits from it. Who should have the right to collect and use information about our faces and our bodies? What are the mechanisms of control? We have government control on the one hand, capitalism on the other hand, and this murky grey zone between who’s building the technology, who’s capturing, and who’s benefiting from it.

We have increasingly smart, surveillant persuasion architectures. Architectures aimed at persuading us to do something. At the moment it’s clicking on an ad. And that seems like a waste. We’re just clicking on an ad. You know. It’s kind of a waste of our energy. But increasingly it is going to be persuading us to support something, to think of something, to imagine something.

It’s paramount that our society recognize the role of anti-black structural racism in the US. And that our 21st century multiracial social movements uplift and centralize the issues of those community members who are impacted and are living at the margins. We know that if we do, we’ll get closer to real justice for all of us. Moreover, it’s been widely documented that the gains made by and with the black community have always led to better standards of living for all of us.

I wouldn’t be surprised to find out that many of us here today like to see our work as a continuation of, say, the Tech Model Railroad Club or the Homebrew Computer Club, and certainly the terminology and the values of this conference, like open source for example, have their roots in that era. As a consequence it’s easy to interpret any criticism of the hacker ethic (which is what I’m about to do) as a kind of assault.