Chapter 3: Criminal Justice

Predictive Policing: From Neighborhoods to Individuals

In February 2014, the Chicago Police Department (CPD) made national headlines for sending its officers to make personal visits to residents considered most likely to be involved in a violent crime. The selected individuals were not necessarily under investigation, but had histories that implied that they were among the city’s residents most likely to be either a victim or perpetrator of violence. The officers’ visits were guided in part by a computer-generated “Heat List”: the result of an algorithm that attempts to predict involvement in violent crime. City officials have described some of the inputs used in this calculation—it includes some types of arrest records, for example—but there is no public, comprehensive description of the algorithm’s input.

The visits were part of a new “Custom Notification Program,” which sends police (or sometimes mails letters) to people’s homes to offer social services and a tailored warning.[52] For example, officers might offer information about a job training program or inform a person that federal law provides heightened sentences for people with certain prior felonies.[53] The city reports that the contents of a notification letter are based on an analysis of “prior arrests, impact of known associates, and potential sentencing outcomes for future criminal acts.”[54] Although some of these visits have been poorly received,[55] the department argues that the outreach efforts may already have deterred crime.[56] Mayor Emanuel recently claimed that, of the 60 interventions that have already taken place, “none of the notified individuals have been involved in any new felony arrests.”[57]

The Heat List is a rank-order list of people judged most likely to be involved in a violent crime, and is among the factors used to single people out for these new notifications. The CPD reports that the Heat List is “based on empirical data compared with known associates of the identified person.”[58] However, little is known about what factors put people on the Heat List, and a FOIA request to see the names on the list was denied on the grounds that the information could “endanger the life or physical safety of law enforcement personnel or [some] other person.”[59] Media outlets have reported that various types of data are used to generate the list, including arrests, warrants, parole status, weapons and drug-related charges, acquaintances’ records, having been a victim of a shooting or having known a victim,[60] prison records, open court cases, and victims’ social networks.[61] The program’s designer, Illinois Institute of Technology (IIT) Professor Miles Wernick, has denied that the “algorithm uses ‘any racial, neighborhood, or other such information’ in compiling the list.”[62]
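Because the actual algorithm has never been publicly described, any concrete rendering is necessarily speculative. The sketch below is purely illustrative: it shows how a rank-order list of the Heat List’s general kind could be produced from the sorts of inputs media outlets have reported, using factor names and weights that are entirely assumed and bear no known relation to the CPD’s real model.

```python
# Purely hypothetical weights -- the CPD's actual inputs and formula
# are not public; these names and numbers are assumptions for illustration.
HYPOTHETICAL_WEIGHTS = {
    "violent_arrests": 3.0,
    "weapons_charges": 2.5,
    "open_warrants": 2.0,
    "shooting_victim": 1.5,
    "associates_with_records": 1.0,  # the reported "known associates" factor
}

def risk_score(person):
    """Weighted sum over hypothetical risk factors (assumed, not actual)."""
    return sum(HYPOTHETICAL_WEIGHTS[f] * person.get(f, 0)
               for f in HYPOTHETICAL_WEIGHTS)

def heat_list(people):
    """A rank-order list: highest hypothetical score first."""
    return sorted(people, key=risk_score, reverse=True)

# Toy records, not real data.
people = [
    {"name": "A", "violent_arrests": 2, "open_warrants": 1},
    {"name": "B", "shooting_victim": 1},
    {"name": "C", "weapons_charges": 1, "associates_with_records": 3},
]
ranked = heat_list(people)
```

Even this toy version makes the transparency concern concrete: a person’s position on the list depends entirely on which factors are included and how they are weighted, and none of that is visible to the people being ranked.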

Cities across the country are expanding their use of data in law enforcement. The most common applications of predictive technology are to assist in parole board decisions[63] and to create heat maps of the most likely locations of future criminal activity in order to more effectively distribute police manpower. Such systems have proven highly effective in reducing crime, but they may also create an echo chamber effect as crimes in heavily policed areas are more likely to be detected than the same offenses committed elsewhere. This effect may lead to statistics that overstate the concentration of crime, which can in turn bias allocations of future resources.
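The echo-chamber effect described above can be made concrete with a minimal simulation, using assumed numbers: two areas with identical underlying crime, where the share of crime that gets detected rises with patrol presence (a linear relationship assumed purely for illustration).

```python
# Minimal sketch of the feedback effect, with assumed numbers.
# Both areas commit crime at the same true rate; only patrol levels differ.
TRUE_CRIMES = {"area_1": 100, "area_2": 100}
patrol_share = {"area_1": 0.7, "area_2": 0.3}  # patrols start skewed

def detected(true_crimes, patrol_share):
    """Detected crime = true crime x detection rate, where the detection
    rate grows with patrol presence (assumed linear, for illustration)."""
    return {a: true_crimes[a] * (0.2 + 0.6 * patrol_share[a])
            for a in true_crimes}

stats = detected(TRUE_CRIMES, patrol_share)

# The recorded statistics overstate area_1's share of crime,
# even though the true crime rates are identical.
total = sum(stats.values())
apparent_share = {a: stats[a] / total for a in stats}
```

Here the recorded data show area_1 accounting for well over half of detected crime despite equal true rates; if the next period’s patrol allocation is set from `apparent_share`, the skew perpetuates itself.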

Chicago’s experiment is one of several of a new type, in which police departments move beyond traditional geographic “crime mapping” to instead map the relationships among city residents. Identifying individuals for tailored intervention is the trend most likely to expand as predictive policing grows, and it raises important questions about how to ensure that justice remains protected when machine systems inform police decisions. Other districts are already working with academics to develop similarly styled programs, including one in Maryland that aims to “predict which of the families known to social services are likely to inflict the worst abuses on their children.”[64] In projects like these, automated predictions of future bad behavior may arise—and may be acted upon—even without direct evidence of wrongdoing. Such systems will sometimes make inaccurate predictions, and when they do, their mistakes may create unjustified guilt-by-association, which has historically been anathema to our justice system.

Even as they expand their efforts to collect data, city governments often lack the analytical resources to make sense of the vast amounts of data they are aggregating, and they are increasingly partnering with private or academic institutions to assist in the process. In Chicago, the city is working with the MacArthur-backed Crime Lab to analyze the effectiveness of various programs, such as “Becoming A Man,” a program that focuses on violence prevention among at-risk youth.[65] These partnerships allow the city to expand the ways it uses the data it collects, and may unlock significant benefits (by, for example, demonstrating the effectiveness of non-punitive crime reduction programs). At the same time, the private actors conducting these and other analyses should be held to at least the same standards of accountability and transparency that would apply if the city were analyzing its data internally.

[54] Chicago Police Department, supra note 52 (“The Custom Notification is predicated upon national research that concluded certain actions and associations within an individual’s environment are a precursor to certain outcomes should the individual decide to or continue to engage in criminal behavior.”).