Technology has drastically shaped our society and our lives, with equal potential for both incredible good and devastating harm. Join us for a conversation with Brad Smith, President of Microsoft, to discuss his newly released book, Tools and Weapons: The Promise and the Peril of the Digital Age, in the context of policing technology, with special emphasis on the ethics of AI, privacy legislation and the need for regulation on facial recognition.

Face recognition as a technology has been the topic of much debate among both policymakers and AI practitioners recently. And justifiably so. Here, we present a conversation, in the form of questions and answers, between a policy analyst and a technologist.

In a report produced with the Policing Project, Axon’s AI and Policing Technology Ethics Board concluded that face recognition technology is not yet reliable enough to justify its use on body-worn cameras, and expressed particular concern regarding evidence of unequal and unreliable performance across races, ethnicities, genders and other identity groups.

Policing Project Director Barry Friedman moderated a panel at NYU Law exploring the use of emerging technologies like artificial intelligence, predictive analytics and face recognition in policing, and how we evaluate the true financial and social costs of this tech.

Friedman recently spoke with MuckRock regarding the move by safety tech company Axon to shift its production focus from its Taser stun guns to providing increased artificial intelligence services for police departments around the country.

Over the past year, employees at tech companies made headlines for publicly urging that their facial recognition work not be used for government surveillance—a phenomenon that shows the unique ethical issues posed by this policing tech.

Last week, Policing Project Deputy Director Maria Ponomarenko participated in a panel at the Privacy Localism conference, hosted by the Information Law Institute at NYU Law. The panel, “Local Governance of Policing, Surveillance, and Data,” highlighted the lack of public input into decisions about police use of surveillance technology, and discussed possible solutions.
