Can Facial Recognition Stop School Shooters?

July 18, 2018

With each new school shooting, the debate around gun control typically resurfaces and subsequently goes nowhere. This has led many to look for solutions elsewhere. For some, that answer lies in tech, a sector that chronically believes it can help, even when help isn't asked for. While a few efforts seem promising, one solution has grown in popularity even as it has come under fire from privacy rights activists and parents. The Lockport City School District in New York spent $1.4 million on facial recognition software and a slew of cameras. The Magnolia School District in Arkansas made a similar move earlier this year, as did an after-school center in Bloomington, IN, the city of Springfield, MA, the University of Calgary, and many others. And this week, RealNetworks founder Rob Glaser announced that he had developed facial recognition software that he will make available to schools for free. There's only one issue: many say it has no power to stop school shooters.

Stopping School Shooters with Facial Recognition

Advocates and facial recognition software vendors say the technology can detect the likenesses of criminals, gang members, blacklisted community members, sex offenders, expelled students, non-custodial parents, and others. Once someone on a watchlist is detected, the system can alert administrators and local law enforcement to their presence. Many of these products are also programmed to detect guns.

Here’s the problem: almost every school shooter since 1998 has been a current student at the school they attacked. And when they walked into school, they tended to have their weapon concealed. As Rachel Levinson-Waldman, a security and policing expert at the Brennan Center for Justice, told Wired, “These are students for whom the school wouldn’t have a reason to have their face entered into the face recognition system’s blacklist.”

Photo by Steve Harvey on Unsplash

A Quagmire of Concerns

And for all its merits—or lack thereof—facial recognition software comes with some serious tradeoffs. The expensive price tags notwithstanding, scaling this solution throughout entire school districts presents a security and privacy nightmare.

Shortly after the Magnolia School District announced its plans this spring, the Arkansas chapter of the ACLU responded with concern.

“All of us want schools to be safe, but subjecting students to an unproven, costly, and intrusive biometric surveillance system is not the answer,” said Rita Sklar, ACLU of Arkansas executive director, in a statement. “These kinds of facial recognition systems are also vulnerable to hacking and abuse – compromising students’ privacy and diverting money away from other pressing educational needs. Communities and schools need to think hard about what type of message they are sending to our kids when they monitor them in school like they were prisoners in a detention facility. We urge the Magnolia School Board, and all Arkansas school districts, to avoid these expensive, harmful gimmicks and consider more sensible approaches to keeping schools safe.”

Beyond concerns over efficacy and student privacy, some believe that facial recognition companies are simply capitalizing on the fears of parents, teachers, and administrators.

As Andrew Ferguson, a law professor at the University of the District of Columbia, told the Washington Post, “These companies are taking advantage of the genuine fear and almost impotence of parents who want to protect their kids, and they’re selling them surveillance technology at a cost that will do very little to protect them.”

In a separate interview with the Intercept, Ferguson described these efforts as lacking perspective.

“In a time when we cannot afford to pay our teachers a decent wage,” he said, “I cannot fathom any school district paying money for this type of security theater.”