Researchers Want To Use AI To 'Predict' When Crimes Are Gang-Related

Jeffrey Brantingham, a University of California at Los Angeles anthropology professor and pioneer in the field of predictive policing, presented research earlier this year that uses a neural network to predict if crimes are gang-related. The ultimate goal, Brantingham's team writes in their paper, is "to automatically classify gang-related crimes where some crucial pieces of crime information are not currently available or are missing".

Titled "Partially Generative Neural Networks for Gang Crime Classification", the paper is the first from a research team Brantingham leads at the University of Southern California's Center for Artificial Intelligence and Society (CAIS). Brantingham's team is studying "Spatio-Temporal Game Theory & Real-Time Machine Learning for Adversarial Groups", with a focus on countering extremism. The research is federally funded by the US Department of Defence via the Minerva grant, which awarded it approximately $US1.2 million ($1.6 million) over three years.

As The Verge notes, once police have classified someone as a gang member or charged them with gang-related crimes, that person can face longer prison sentences and additional charges, and can be barred from entering certain areas or associating with certain other people.

Advanced technologies are being used around the US to better address gang-related issues. Primary schools in New Mexico are using face recognition to bar suspected gang members from entry, for example, while in Chicago police use AI to "predict" someone's likelihood of either committing or being a victim of gang-related gun violence. AI is also used to predict crime more generally, led by PredPol, a predictive policing company Brantingham co-founded.

To conduct their research, Brantingham's team fed data the Los Angeles Police Department collected between 2014 and 2016 into a neural network, a form of AI loosely modelled on the brain that learns to classify data. The AI is given police reports stripped of certain qualitative information (the most time-intensive part of a report for police to complete), then generates that missing information itself. It uses the algorithmically generated material as part of its overall prediction of whether the crime was gang-related.
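The "partially generative" idea can be sketched as two connected components: a generator head that infers a stand-in for the missing narrative information from the structured fields of a report, and a classifier that consumes both. The sketch below is a toy forward pass only, with random untrained weights; the field names, layer sizes, and fallback behaviour are illustrative assumptions, not the paper's actual architecture.

```python
# Minimal sketch of a "partially generative" classifier: when the
# free-text narrative portion of a crime report is missing, a generator
# head produces a stand-in embedding from the structured fields, and the
# classifier uses both. All names and dimensions here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class PartiallyGenerativeClassifier:
    """Toy forward pass with random weights; training is omitted."""

    def __init__(self, n_structured=4, n_narrative=8, n_hidden=16):
        # Generator head: structured fields -> narrative embedding
        self.W_gen = rng.normal(size=(n_structured, n_narrative)) * 0.1
        # Classifier: [structured, narrative] -> P(gang-related)
        self.W1 = rng.normal(size=(n_structured + n_narrative, n_hidden)) * 0.1
        self.W2 = rng.normal(size=(n_hidden, 1)) * 0.1

    def generate_narrative(self, structured):
        # Infer a stand-in narrative embedding from the structured fields
        return relu(structured @ self.W_gen)

    def predict(self, structured, narrative=None):
        # Use the real narrative embedding when available; otherwise
        # fall back to the generated one.
        if narrative is None:
            narrative = self.generate_narrative(structured)
        x = np.concatenate([structured, narrative])
        out = sigmoid(relu(x @ self.W1) @ self.W2)
        return float(out[0])

# Hypothetical structured fields: hour of day, weapon, premise, area codes
report = np.array([23.0, 1.0, 3.0, 12.0])
model = PartiallyGenerativeClassifier()
p = model.predict(report)  # narrative missing: generated internally
print(f"P(gang-related) = {p:.2f}")
```

In the real system the generated component would be learned jointly with the classifier, so that filling the gap and classifying the crime reinforce each other; the sketch only shows the data flow.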

While more and more resources are being devoted to using AI to predict, prevent or classify gang violence, some in the field are pushing back against the practice. Christo Wilson, assistant professor in computer and information science at Northeastern University, notes to The Verge that an AI's predictions are only as good as the data used to train it. Activists have long claimed the LAPD is overzealous in applying the gang classification, so the AI may simply reinforce those same biases.

"Now, maybe the LAPD is 100 per cent objective in their determinations of what is and is not gang-related," Wilson told The Verge. "But if they are not, then the algorithm is going to reproduce their errors and biases."

While this particular research is still nascent, policing work has become increasingly automated. "Hot spot" policing uses computational methods to send officers to locations where crimes are anticipated, replacing simple patrols. Licence plate readers provide a host of information on drivers, and police have looked into autonomous vehicles that ticket speeding drivers.

Like Brantingham's research, the technology built by Axon, the maker of Tasers and police body cameras, focuses on reducing work for police officers. In a statement announcing its newly formed Ethics Board, Axon said its "ultimate goal in developing AI technology is to remove the need for police officers to do manual paperwork entirely".
