A Crystal Ball for Cops?

Depending on your perspective, “predictive policing” is either the Holy Grail of policing or its dystopian nightmare. The idea is to analyze data and come up with predictions about when and where crime is likely to occur… and who the perp will be. It isn’t based on individualized suspicion; it’s profiling, pure and simple, but based on “science” and “data” rather than prejudice or bigotry. Either way, it’s unconstitutional. So when a $150,000 expense for PredPol software showed up in Oakland, California’s proposed budget, the Oakland Privacy Working Group sent an open letter to the City Council recommending that the item be deleted. Not only is the software ineffective at reducing crime, the coalition argued, but it also raises serious civil liberties concerns:

The act of sending police to a designated small area to watch for suspicious activity will inevitably lead those police to be more suspicious than usual of everyone they encounter. This will lead to more “reasonable suspicion” stops that are in fact not reasonable, leading to civil rights violations, all the more problematic because hotspots are so likely to be in minority neighborhoods. Any attempt to mitigate this effect is akin to telling someone not to think of an elephant: it cannot be done; minds simply do not work that way. We also refer you to the research paper Predictive Policing and Reasonable Suspicion (Andrew Ferguson, 2012, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2050001), which concludes that, under current Fourth Amendment doctrine, predictive policing will have a significant effect on reasonable suspicion analysis, a reality that necessitates a careful understanding of the technology.

Read more about Suspicious Activity Reporting here. The full letter is reproduced below:

AN OPEN LETTER TO THE MEMBERS OF THE OAKLAND CITY COUNCIL FROM THE OAKLAND PRIVACY WORKING GROUP

We are deeply concerned that the Oakland City Council is, as it was with the Domain Awareness Center, voting to allocate funds for a system that it has not debated, has not gotten public input on, that is not what it is represented to be, and that is troubling in its civil liberties implications. We are referring to the $150,000 line item for ‘Predictive Policing’ software that is included in the Mayor’s budget proposal.

It should go without saying that the City Council should understand and debate this technology before it is purchased. The City Council should also consider the usefulness of this technology relative to other expenditures it might make to reduce future crime. There are many programs in need of funding or additional funding that are known to reduce future crime, including restorative justice programs, re-entry programs, jobs training programs, and redirection programs, to mention but some. As a general rule, social justice is likely a better investment in future crime prevention than software.

Beyond this, we are extremely skeptical – as the City Council should be – about the claims made by the company selling this software that its system reduces crime. As reported in the East Bay Express, “There have been no independent analyses of PredPol’s software.” When the only people claiming a system works are those with a financial interest in it working, critical examination of their claims is particularly important. In two instances where PredPol has claimed significant reductions in crime, independent analysis of the data shows that PredPol appears to be cherry-picking its start and end points.

Darwin BondGraham, investigative reporter for the East Bay Express, created this graph of crime data in Santa Cruz. As he notes and as you can see, PredPol’s claim of crime reduction in Santa Cruz is based on an arbitrary selection of start and end months (in red).
Using other start and end points (e.g., the green ones), equally distant, would yield far different conclusions. In fact, independent analysis comparing the overall crime numbers before PredPol was instituted to the overall crime numbers after PredPol use began shows that the average crime rate for the two years before PredPol was installed is approximately the same as (and in fact slightly lower than) the average crime rate after PredPol began being used.

In the case of Alhambra, similarly cherry-picked data points were used to support PredPol’s claims of reduced crime, again illustrated by BondGraham.

Others have questioned this technology as well. A Forbes article dated 2/11/15 on PredPol-like technology notes: “An independent RAND Corp. study of a non-PredPol predictive policing effort in Shreveport, La. found it had no effect on crime reduction.” And a research paper studying the effects of predictive crime algorithms concludes (translated from the French): “50% of crimes occur in 7.5% of the city… simply always predicting the same places ‘at risk’ turns out to be as good as PredPol… The results obtained with retrospective analysis allow us to have serious doubts about the effectiveness of PredPol in real conditions.”

On a more practical plane, we note that, as the East Bay Express reported, while Richmond, California has been using PredPol for a couple of years, the Chief of Police there wants to discontinue it: “[Chief] Magnus said… he isn’t convinced the software helped reduce crime. ‘We’re not going to continue it. Our plan going forward is to rely less on predictive policing and more on what we learn through our crime analysis process and through the beat officers’ familiarity with the areas they’re assigned.’” And in a 2013 article in SF Weekly (http://www.sfweekly.com/sanfrancisco/all-tomorrows-crimes-the-future-of-policing-looks-a-lot-like-good-branding/Content?oid=2827968), criminologist Ed Schmidt, reported to have just completed a review of predictive policing efforts across 156 cities, says there is little actual data showing that predictive policing works.

Beyond concerns of effectiveness, the City Council must also consider the civil liberties and police practice concerns that would come to the fore should PredPol be used. Basically, PredPol designates “hot spots” – small areas to which police units are sent to patrol. The act of sending police to a designated small area to watch for suspicious activity will inevitably lead those police to be more suspicious than usual of everyone they encounter. This will lead to more “reasonable suspicion” stops that are in fact not reasonable, leading to civil rights violations, all the more problematic because hotspots are so likely to be in minority neighborhoods. Any attempt to mitigate this effect is akin to telling someone not to think of an elephant: it cannot be done; minds simply do not work that way. We also refer you to the research paper Predictive Policing and Reasonable Suspicion (Andrew Ferguson, 2012, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2050001), which concludes that, under current Fourth Amendment doctrine, predictive policing will have a significant effect on reasonable suspicion analysis, a reality that necessitates a careful understanding of the technology.

For these reasons, the Oakland Privacy Working Group is strongly against adopting “PredPol” technology. We recommend that the City Council eliminate predictive policing software from the budget.
We also strongly urge that, should this item be considered at any time in the future, the City Council conduct a serious investigation into the effectiveness and civil liberties implications of this technology, and that it conduct a rigorous, open debate with public participation before deciding whether to move forward with this type of software.

Thank you.

Oakland Privacy Working Group
6/25/15