Red Flag On Biometrics: Iris Scanners Can Be Tricked

At the Black Hat security conference in Las Vegas this week, Javier Galbally revealed that it’s possible to spoof a biometric iris scanning system using synthetic images derived from real irises. The Madrid-based security researcher’s talk is timely, coming on the heels of a July 23 Israeli Supreme Court hearing where the potential vulnerabilities of a proposed governmental biometric database drove the debate. Consider the week’s events a reminder that if the adoption of biometric identification systems continues apace without serious contemplation of the pitfalls, we’re headed for trouble.

When it comes to the collection and storage of individuals’ digital fingerprints, iris scans, or facial photographs, system vulnerability is a chief concern. A social security number can always be cancelled and reissued if it’s compromised, but it’s impossible for someone to get a new eyeball if an attacker succeeds in seizing control of his or her digital biometric information.

Among the various biometric traits that can be measured for machine identification--fingerprints, face, voice, or keystroke dynamics--the iris is generally regarded as the most reliable. Yet Galbally’s team of researchers has shown that even the method traditionally presumed to be foolproof is actually quite susceptible to being hacked.

The project, unveiled for the first time at the security researchers’ conference, made use of synthetic images that match digital iris codes linked to real irises. The codes, which are derived from the unique measurements of an individual’s iris and contain about 5,000 pieces of information, are stored in biometric databases and used to positively identify people when they position their eyes in front of the scanners. By printing out the replica images on commercial printers, the researchers found they could trick the iris-scanning systems into confirming a match.

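To see why a reconstructed code is enough to impersonate someone, it helps to know how iris matching typically works: the stored code is compared bit by bit against a freshly captured one, and the two are declared a match when the fraction of differing bits falls below a decision threshold. The following toy sketch illustrates that comparison; the code length, noise rate, and 0.32 threshold are illustrative assumptions drawn from commonly described iris-recognition schemes, not the parameters of any particular commercial system.

```python
import random

CODE_LEN = 5000  # an iris template holds roughly 5,000 pieces of information

def hamming_distance(code_a, code_b):
    """Fraction of bits that differ between two iris codes."""
    diff = sum(a != b for a, b in zip(code_a, code_b))
    return diff / len(code_a)

def is_match(code_a, code_b, threshold=0.32):
    """Accept when the codes are closer than the decision threshold."""
    return hamming_distance(code_a, code_b) < threshold

random.seed(0)
enrolled = [random.randint(0, 1) for _ in range(CODE_LEN)]

# A fresh scan of the same eye differs slightly (noise, eyelid occlusion).
same_eye = [b ^ (random.random() < 0.10) for b in enrolled]
# An unrelated eye yields a statistically independent code (~50% disagreement).
other_eye = [random.randint(0, 1) for _ in range(CODE_LEN)]

print(is_match(enrolled, same_eye))   # True
print(is_match(enrolled, other_eye))  # False
```

The asymmetry is the point: anything that can reproduce a code within the tolerance band, including a printed synthetic image, is indistinguishable from the genuine eye as far as the matcher is concerned.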
The tests were carried out against a commercial system called VeriEye, made by Neurotechnology. The synthetic images were produced using a genetic algorithm. With the replicas, Galbally found that an imposter could spoof the system at a rate of 50 percent or higher. A Wired article hit on the significance of this discovery:

“This is the first time anyone has essentially reverse-engineered iris codes to create iris images that closely match the eye images of real subjects, creating the possibility of stealing someone’s identity through their iris.”

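To give a feel for how a genetic algorithm can home in on a stored template, here is a miniature sketch that evolves random bit strings toward a target "iris code." It is a simplified stand-in, not the researchers' method: Galbally's attack evolved synthetic iris images and scored them against the commercial matcher, whereas this sketch scores candidate codes directly against the target, and every parameter below is an illustrative assumption.

```python
import random

random.seed(1)
CODE_LEN = 200            # shortened from ~5,000 for the example
POP, GENS, MUT = 60, 500, 0.01

# Stands in for a code lifted from a biometric database.
target = [random.randint(0, 1) for _ in range(CODE_LEN)]

def fitness(candidate):
    # In the real attack the only feedback is the matcher's similarity
    # score; here we use agreement with the stolen code directly.
    return sum(a == b for a, b in zip(candidate, target)) / CODE_LEN

def crossover(p1, p2):
    cut = random.randrange(CODE_LEN)
    return p1[:cut] + p2[cut:]

def mutate(code):
    return [b ^ (random.random() < MUT) for b in code]

pop = [[random.randint(0, 1) for _ in range(CODE_LEN)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    if fitness(pop[0]) > 0.95:          # close enough to pass the matcher
        break
    parents = pop[:POP // 2]            # keep the fittest half
    pop = parents + [mutate(crossover(*random.sample(parents, 2)))
                     for _ in range(POP - len(parents))]

best = max(pop, key=fitness)
print(round(fitness(best), 2))
```

The sketch illustrates the underlying weakness: any matcher that reports a similarity score hands an attacker a gradient to climb, generation by generation, until a fabricated input scores as a genuine one.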
This revelation not only exposes a security hole in a commercial iris-recognition system, but also proves wrong the claim of prominent tech firm and FBI contractor BI2 Technologies--which is building a database of iris scans for the Next Generation Identification System--that biometric templates “cannot be reconstructed, decrypted, reverse-engineered or otherwise manipulated to reveal a person’s identity.”

Any new detection of biometric system flaws is relevant in the context of the massive governmental identification programs moving forward at the global level. There’s India’s bid to create the world’s largest database of irises, fingerprints and facial photos, for example, and Argentina’s creation of a nationwide biometric database containing millions of digital fingerprints. Just this week in Israel, High Court justices criticized a planned biometric database as a “harmful” and “extreme” measure. Lawmakers who approve such identification schemes should give serious consideration to any new information surfacing about biometric system vulnerabilities.
