I'm a privacy pragmatist, writing about the intersection of law, technology, social media and our personal information. If you have story ideas or tips, e-mail me at khill@forbes.com. PGP key here.
These days, I'm a senior online editor at Forbes. I was previously an editor at Above the Law, a legal blog, relying on the legal knowledge gained from two years working for corporate law firm Covington & Burling -- a Cliff's Notes version of law school.
In the past, I've been found slaving away as an intern in midtown Manhattan at The Week Magazine, in Hong Kong at the International Herald Tribune, and in D.C. at the Washington Examiner. I also spent a few years traveling the world managing educational programs for international journalists for the National Press Foundation.
I have few illusions about privacy -- feel free to follow me on Twitter: kashhill, subscribe to me on Facebook, Circle me on Google+, or use Google Maps to figure out where the Forbes San Francisco bureau is, and come a-knockin'.

The 7 Dos And Don'ts Of Facial Recognition

Facial recognition technology has moved out of the world of science fiction and into reality. There are billboards that scan your face to decide which ads to show you, and there’s talk of Kinect doing the same at home based on your emotional state. There’s an app for detecting people’s faces at bars to help you decide where you want to get your drink on. There’s a Facebook app that lets you use your face to collect discounts at stores. There’s a dating service for the narcissistic that will set you up with someone whose face looks like yours. And of course, Facebook, Google and Apple employ the technology to help you more quickly tag friends in photos.

Facebook recently had to suspend its facial recognition feature in Europe after privacy regulators freaked out about the social network’s growing biometric database of European citizens. Here in the U.S., we’re approaching the technology more gingerly, regulation-wise. Senator Al Franken has been vocal about his concerns about the growing use of facial recognition technology by everyone from Facebook to the F.B.I. Now, the Federal Trade Commission is stepping in with some recommendations for best practices for businesses getting into faceprint collection. They are “intended to provide guidance to commercial entities” but are not binding. Here’s a breakdown of the dos and don’ts:

Don’t let people steal your customers’ faces. If you’re making biometric faceprints of people, do make sure that data is secure and can’t be scraped. (After all, have you seen what 3D printers can do these days?)

Don’t hold onto people’s faces forever. From the FTC: “For example, if a consumer creates an account on a website that allows her to virtually “try on” eyeglasses, uploads photos to that website, and then later deletes her account on the website, the photos are no longer necessary and should be discarded.”

Don’t be trying to identify people where they don’t want to be identified. The FTC recommends against putting facial recognition cameras “in bathrooms, locker rooms, health care facilities, or places where children congregate.” Let’s add strip clubs, XXX stores, and Justin Bieber concerts to that list.

Do include a sign letting people know their faces are being scanned. Says the FTC: “Companies using digital signs capable of demographic detection – which often look no different than digital signs that do not contain cameras – should provide clear notice to consumers that the technologies are in use, before consumers come into contact with the signs.” That one is a little tough. Rule of thumb is usually that if you can see the camera, it can see you.

Do make it easy for people to turn facial recognition off. The FTC says “social networks” here, but we all know they’re talking about Facebook: “Social networks should also provide consumers with (1) an easy to find, meaningful choice not to have their biometric data collected and used for facial recognition; and (2) the ability to turn off the feature at any time and delete any biometric data previously collected from their tagged photos.” Many a critic has complained that this is currently too hard to do on F-book. If you go to the Facebook help center and ask how to “turn off facial recognition,” you’re out of luck.

Do get people’s permission before using their faceprints for something new: say, if you collected them to tell people how pretty they are, but now want to use them to help event organizers invite the prettiest people to their events. From the FTC: “[Companies] should obtain a consumer’s affirmative express consent before using a consumer’s image or any biometric data derived from that image in a materially different manner than they represented when they collected the data.”

Don’t invent the recognize-strangers-using-facial-recognition app unless you have a way to get permission from those who would be identified. Says the FTC:

“[C]ompanies should not use facial recognition to identify anonymous images of a consumer to someone who could not otherwise identify him or her, without obtaining the consumer’s affirmative express consent. Consider the example of a mobile app that allows users to identify strangers in public places, such as on the street or in a bar. If such an app were to exist, a stranger could surreptitiously use the camera on his mobile phone to take a photo of an individual who is walking to work or meeting a friend for a drink and learn that individual’s identity – and possibly more information, such as her address – without the individual even being aware that her photo was taken. Given the significant privacy and safety risks that such an app would raise, only consumers who have affirmatively chosen to participate in such a system should be identified.”

Well, when you describe it like that, it sounds awful! The version of this that might actually appeal to users would allow someone in the bar to take a photo of another someone and do a background check to make sure he’s not a total creep (or the stalker type the FTC describes).

Google has said it already has the technology to do this, but it’s holding back out of privacy concerns. In the meantime, I’m still waiting for the Google alert product that will scan the web and let you know when photos or videos of you have been uploaded, similar to the alerts it currently offers for mentions of your name. That’s the irony here: facial recognition tech could be used both to invade your privacy and to help you better protect it.

FTC Commissioner Thomas Rosch was not on board with the non-binding best practices, issuing a dissenting opinion in which he suggests that the report “goes too far, too soon.” For one, he thinks the FTC is wrong to suggest best practices when the tech hasn’t been misused yet.

Comments

The FTC’s Rosch says that about every regulation; it’s like he has a macro that automatically generates it. So instead, let’s wait for all the data to get out there and be irretrievable, and then try to do something feeble about it.

We’ve seen that happen with spam and identity theft. How’s that working for everyone?

Basically, the FTC’s rules are sound and boil down to a few basic privacy guidelines: don’t retain data for longer than you need it, don’t use it for purposes beyond the one it was collected for, and let users control how it’s used.

The FTC deserves a shout-out for being proactive with guidelines. Non-binding guidelines open up some legitimate debate before the pro-stalking lobby gets to interfere. Then, when someone does step over the line, binding regulation will be more defensible.

It’s better to have established best practices before the technology is abused. Society is so desensitized to this kind of passive intrusion that 1) the abuse will have been happening long before it’s brought to light (if it isn’t already), and 2) the general attitude will have become one of acceptance.

I do think these best practices should remain non-binding, however; the potential for abusive enforcement outweighs the potential for abuse of the technology.

Kashmir, great article as usual. There’s nothing like reading a piece like this before a 12.5-hour day of classes and homework to get my brain pumping. A couple of comments:

1. Do you (or anyone else) have a timeline for when facial recognition software is going to start appearing more widely in the world?

2. Just because the FTC “is stepping in,” what does that mean for countries abroad? I think it is safe to say that a person or a company could deploy this type of software and technology in a country like, say, Afghanistan, to find a person they believe has WMD.

3. For everyone who has read this article, I also recommend the following article:

Kashmir, after reading this second article, I was wondering if you are familiar with Wal-Mart’s check-cashing arrangement. Pretty much, you bring in your check, show the clerk a valid ID, and then, on a little machine (I believe the same one used for swiping credit and debit cards), you input your Social Security number. As I was doing this one day at a Midwest Wal-Mart, I noticed that a camera was aimed overhead, directly at the machine. That got me thinking: whoever watches those cameras now has the customer’s Social Security number, in addition to, most of the time, the individual’s driver’s license. When I asked the clerk why there was a camera overhead, looking directly at the machine, she informed me it was for the protection of the store and its customers, though she, being the person on the other side of the counter, could not see my number. At that point I realized that thousands of people a day across America use Wal-Mart to cash their checks, which means a lot of people are being put at risk in a way they may not think about. All it takes is one disgruntled employee and a day’s worth of footage, and now they have a lot of information on innocent people. I was just wondering if you had ever thought about this or come across it in the past.

Thanks, Joel. There are unfortunately many companies out there with bad internal information protection practices. Rogue employees can bring a world of pain. This happens with hospitals, for example, when an employee leaks a celeb’s medical file to the press. At other companies, bad apples have taken advantage of their access to customer accounts to commit ID fraud. Keep your fingers crossed that the people you deal with are trustworthy!