An EGR 277 Project

Created by Rishi Kaneriya and Kylee Pierce

Relevant Social Groups

One way to get a better grasp on some of the socio-technical issues surrounding predictive policing is to consider the major stakeholders for whom these issues are, in fact, issues. The “Social Construction of Technology” (or, “SCOT”) framework, developed by Trevor Pinch and Wiebe Bijker, does just that. It looks at how a particular technology is shaped in its development through the interactions of relevant social groups and their flexible interpretations of the technology. Usually, this leads to some sort of closure in which one form of the technology dominates others.

Applying the SCOT framework to predictive policing, we can identify a number of important relevant social groups. Again, these are groups of people who not only are impacted by the technology, but who also actively shape its development:

(1) Law enforcement

It’s no surprise that police departments have a significant stake in the development of predictive policing technology. In general, law enforcement views it as a 21st-century policing tool with the potential to reduce crime rates and improve relations within communities. P. Jeffrey Brantingham, an anthropology professor who helped develop PredPol, one of the branded technologies used in a number of police departments around the country, told The Guardian that police officers have always tried to map “crime hotspots” using data and analytics. He calls his software the “next iteration of that advancement.”

Commander Jonathan Lewin, who’s in charge of information technology for the Chicago PD, hopes that predictive policing will become a national best practice and stresses that, to him, it’s all about saving more lives (Verge). In fact, the Chicago PD is so committed to predictive policing that it actually uses historical crime information and other data to compile a “heat list” of suspicious persons who are most likely to engage in criminal activity.

Cynthia Rudin from MIT Sloan, co-author of “Learning to Detect Patterns of Crime”, extolls the benefits of the technology. In an article for Wired, she writes: “Machine learning can be a tremendous tool for crime pattern detection, and for predictive policing in general. If crime patterns are automatically identified, then the police can immediately try to stop them. Without such tools, it could take weeks or years of sifting through a database to discover a pattern, or it might be missed altogether.”

These are all optimistic accounts of how predictive policing technologies could be used by law enforcement to streamline its activities in an age rich with data and algorithmic efficiency.

To learn more about how predictive policing fits into existing infrastructures of policing, click here.

(2) Urban residents

But police officers form only one side of the story. You can probably imagine that urban citizens might have different interpretations of the technology. Indeed, residents in cities like Santa Cruz, LA, and Chicago, where predictive policing technology is currently deployed, form another extremely relevant social group, because they are the ones directly affected by the technology’s outcomes. They actually live in those little red boxes in which PredPol directs officers to increase surveillance. Of course, citizens share law enforcement’s concern with crime; they, too, want crime rates to go down. But they tend to view predictive policing as a closed, opaque technology with significant potential for abuse.

Andrew Ferguson, a law professor in Washington, DC, fears that judges and juries could come to place too much credence in the accuracy of crime prediction tools (The Economist). A report by the RAND Corporation supports this fear by pointing out that there are already more relaxed rules for reasonable suspicion in “high-crime areas.” Does reliance on PredPol give too much power to police officers and lead to abuses? Some fear that predictive policing creates an environment in which police can knock on anyone’s door because an algorithm told them it was okay, changing the way reasonable suspicion has traditionally operated.

To explore these issues of privacy and intrusion in greater detail, click here.

(3) Software developers, lawmakers, etc.

While law enforcement and urban residents arguably form the two most prominent relevant social groups in the construction of predictive policing technology, other groups are also worth mentioning. For example, the developers of technology like PredPol also interpret the technology, as do lawmakers and legal experts. Some of these interpretations overlap with those of urban residents and law enforcement, while others stand on their own.

In order to keep our website focused and organized, we have chosen to mainly focus on the interactions between the former two social groups and their flexible interpretations of the technology. In doing so, however, engineering and legal interests will also emerge, particularly in relation to Big Data concerns and the Fourth Amendment’s reasonable suspicion clause.