Researchers Matched Images on Tattoo Websites to a German Police Database

For the last year, EFF has been battling to free records from the National Institute of Standards and Technology (NIST) regarding an ethically dubious research program to promote the development of automated tattoo recognition technology. The agency is months delinquent in providing a variety of information, most notably the list of 19 research entities who received a giant set of tattoo images obtained from prisoners in custody. This delay is particularly alarming as NIST is currently recruiting institutional participants for the next stage of its expanded research, scheduled to begin on Dec. 1.

What we’ve discovered so far about NIST’s approach to tattoo identification raises major concerns for privacy, free speech, the freedom to associate, and the rights of research subjects. We’ve also learned that similar tattoo recognition experiments are being conducted in Germany, a country that is usually sensitive to personal privacy.

One of our chief concerns is how automated tattoo recognition technology—algorithms that can match tattoos—can be used to identify and track people in a similar way to how facial recognition systems are being deployed by law enforcement entities. We foresee a future where this technology is used to scan tattoo images on the Internet as a form of surveillance.

A presentation recently released in response to our Freedom of Information Act request confirms that this is not only possible but that researchers in Germany have already used tattoo recognition technology to compare online tattoo images to law enforcement records.

Although we still don’t have the full list of 19 organizations who participated in the FBI-sponsored Tattoo Recognition Technology Challenge (Tatt-C), we do know that the Fraunhofer Institute of Optronics, System Technologies and Image Exploitation, a German research body, was one of them.

As part of Tatt-C, NIST and the FBI provided research entities with more than 15,000 images “operationally collected” by law enforcement and asked researchers to run a series of trials using their proprietary tattoo-analysis algorithms. Fraunhofer reported that it was able to match two images of the same tattoo taken over a period of time with 96.8% accuracy.

But that wasn’t the only tattoo research Fraunhofer has conducted.

According to a presentation delivered at a NIST event, these researchers scraped images from at least two commercial tattoo websites and then used automated algorithms to match those tattoos against a much larger set of images obtained from the German police.

The presentation shows that researchers grabbed 8,400 images from Tattoodesign.com, an online resource for people looking for tattoo ideas, and another 848 images from Wildcat.de, an online store for the body art community. Those images were then combined with 330,000 images from the Federal Criminal Police Office of Germany.

One particular slide showed that the researchers had not fully thought through the propriety of scraping private databases.

The researchers acknowledge that there may have been privacy issues with scraping commercial databases. They claim that the German police images “were not considered personal data, however…” But the “however” was never followed up with an answer in the slide deck. A video of the various Tatt-C presentations that may have shown the follow-up has been removed from the NIST website.

Fraunhofer’s position on police images reflects a troubling and contradictory assumption that was also adopted by NIST. On one hand, researchers claim that tattoos are not considered personally identifiable information. On the other, they argue that tattoos are a useful biometric for identifying people. As a hypothetical example in the presentation, Fraunhofer used a tattoo on a woman’s foot as a way to identify a missing person.

This development illustrates why it’s in the public interest for NIST to release the list of 19 entities who participated in Tatt-C: the companies and research institutions involved may be engaged in activities that deserve greater scrutiny.

EFF’s investigation into NIST’s program revealed that the research did not go through proper ethical review until after it had been completed. Our report also found that personally identifiable information had been inappropriately released and published, a fact NIST admitted in response to our report. NIST subsequently redacted many of the tattoo images from the presentations it published on its website.

EFF raised First Amendment concerns, since the experiments, often using religious imagery, were designed to show how technology could use tattoos to establish associations between subjects. NIST also scrubbed claims in its documentation that tattoo recognition would be useful in identifying people’s religious and ritualistic beliefs.

As a public entity, NIST has a responsibility to be transparent and follow ethical guidelines. We call upon the U.S. Department of Commerce, which has jurisdiction over NIST, to release documents in response to our FOIA request immediately. German citizens may want to explore what precautions Fraunhofer and the German federal police are taking with their own tattoo experiments.
