
Should the Facial Recognition Code Apply to the Gov’t? Could It?

Yesterday, stakeholders met for the sixth in a series of meetings organized by the National Telecommunications and Information Administration (NTIA) in hopes of creating a voluntary code of conduct on facial recognition technology. This meeting examined the risks and issues participants had identified since last month’s meeting, as well as a list of draft definitions the not-yet-existent code could include.

The meeting followed a familiar narrative in this process: The technology is so new that it’s sometimes difficult to imagine the ways in which it is currently being used, let alone the ways it will be used. That makes creating rules to govern those real and imagined uses quite difficult.

The most passionate debate yesterday centered on what the code should say about government access to raw images and what standards should apply to government requests for such information. The stakeholders—a group of representatives from government, the ACLU, the Consumer Federation of America, NetChoice and the Application Developers Alliance (ADA), among others—were divided on whether to even address the issue.

Tim Sparapani, representing the ADA, suggested the group not even go there.

“I’m loath to take on the federal government unless you can do something significant and meaningful,” he said. However, he argued, the code needs to address what to do when the government is a customer of a commercial entity.

The NTIA’s John Morris said he thought the code would be for consumer-facing products and services. If the government were covered, the code would face an uphill battle.

“I don’t think those with government customers would participate,” he said. “My view, the NTIA’s view, is the same as it was six or eight months ago, which is to say that government use of this technology, we view as out of scope. We don’t think the FTC is going to be exerting jurisdiction over how the government uses this technology.”

That’s a nightmare for developers, Sparapani said. Whatever the code says, it shouldn’t apply differently to those with government customers and those without.

“A startup is going to want to do it once and well and have the same set of rules apply to all of their customers they can anticipate coming through the door,” he said. “I don’t think it’s a small question. I think people would prefer one unified system.”

Joni Lupovitz of Common Sense Media said looking at how to handle the government-as-customer now would halt progress. She suggested pushing forward with the code as it relates to consumer-facing and commercial uses of data, then perhaps circling back later.

That subject exhausted, discussion turned to semantics. At what point should notice and consent happen?

The ACLU’s Chris Calabrese said “enrollment should be the lynchpin,” but NetChoice’s Steve DelBianco said that doesn’t work. Notice should be given when metadata is added to a facial recognition template, making it identifiable. And it’s at that point that the individual should be given the opportunity to opt out of enrollment in a facial recognition database.

Say, for example, you want to monitor entrance to a building and use facial recognition templates to control access. Identification and verification are different from database enrollment and sharing.

“Enrollment could be to ensure the same delivery guy comes in every day at 2 p.m. so enrollment is like saving (an image),” DelBianco said. “The user of the system took a template and enrolled it. That’s different from saving. They might have compared it with other residents of the building to see if they should be able to get in. What level of transparency do we give to the subject about those two activities?”

Susan Grant of the Consumer Federation of America said she thought “storage” was the same as “enrollment,” and that the group had decided that as soon as storage happened, consent should be required.

Calabrese said his concerns revolve around when a person is enrolled into a database—regardless of whether the image is shared.

“For my mind, if I’m taking an image, I’m turning it into a template,” he said. “That raises all these issues at that point. That’s the logical use for notice, consent and transparency.”

NetChoice’s DelBianco said it may be logistically impossible to be completely transparent at the time of “enrollment.” How do you notify every image subject that their image has been taken at the time of the capture?

“Practically speaking, that’s going to be very challenging,” he said.

The group discussed defining terms, including “personally identifiable information,” “encryption” and “authentication.”

Bill Baker of Wiley Rein said he’s nervous about using the term “personally identifiable information” in the code because it’s defined differently in every state statute across the U.S.

Finally, there was something of an end-around: Walter Hamilton said the International Biometrics Industry Association, the group he was there to represent, is two weeks away from publishing a best practices code for the stakeholder group to review.

That was good news to Bill Long, who said this code-drafting process needs more voices.

“I think it’s great that biometrics are involved,” he said. “We need users. We need Home Depot, Sears and the International Association of Shopping Malls” for the business perspective.

Carl Szabo of NetChoice proposed the group spend one more meeting fleshing out details and then get down to writing a code.

Written by Angelique Carson, CIPP/US



