Event overview

Machine learning algorithms in public policy have become the latest and fastest-growing iteration of instrumental reason for governmentality. While the Enlightenment aimed to dispel myths via the pursuit of truth through reason, instrumental reason produced new myths in place of the old (Horkheimer and Adorno, 2002/1944). More importantly, Enlightenment reason was formed by the racial logics of colonialism, whereby the human subject of the Enlightenment was constituted as European, Christian, male, heterosexual, and able-bodied, rendering all others as inferior, primitive, partially human, or nonhuman (Césaire, 1955; Chakrabarty, 2000; da Silva, 2007; Fanon, 1967/1952; Weheliye, 2014; Wynter, 2007). Such logics of discrimination and racial hierarchy have also been traced to the advent of the political project of democracy and its materiality in Western nation-states (Hanchard, 2018). In what ways might the racial logics of colonial historicity be haunting the instrumental reason of algorithmic governmentality? By engaging (materialist) Derridean hauntology (Barad, 2012; Derrida, 1994), this paper examines the ways in which the sociotechnical assemblages of algorithmic governmentality may become heir to an instrumental reason that hierarchizes and differentiates bodies, demarcating which bodies have the capacity to regenerate and designating others as nonhuman and debilitated. Examples will be drawn from case studies on predictive policing and learning analytics in the United States.

Ezekiel Dixon-Román is an Associate Professor in the School of Social Policy & Practice at the University of Pennsylvania. His work focuses on the cultural studies of quantification and critical theories of difference. He is the author of Inheriting Possibility: Social Reproduction and Quantification in Education (2017, University of Minnesota Press) and is currently working on a book project tentatively titled Haunting Algorithms: Algorithmic Governmentality and the Racial Logics of Colonialism.