Last summer, the Article 29 Working Party—an advisory body formed under the EU Data Protection Directive—opened an inquiry into this issue in response to Facebook’s European launch of its face recognition technology. Given the many face recognition applications launched since then, the resulting opinion wisely does not focus on Facebook and instead offers general recommendations on how the EU Data Protection Directive applies to automatic face recognition in online and mobile applications.

The Directive requires EU countries to adopt privacy protections for the automatic processing of personal data, which according to the opinion includes both photos from which individuals can be identified and the measurements of their facial features. Because face recognition technology automatically processes photos and measurements to identify or categorize individuals, it is subject to the Directive. A provider using automatic face recognition in its online service or mobile application must therefore notify individuals that they will be identified by this technology and obtain their permission.

The opinion clarifies that “informed consent” cannot be obtained simply by providing opt-out settings, although those settings remain important so that individuals can easily retract their consent. Terms and conditions that discuss the face recognition process are likewise insufficient unless the main purpose of the application is face recognition. Even then, a face recognition app may still need specific permission from the individuals in the photos if it uses photos or facial measurements from another application, such as a general-purpose social network. Notably, a person cannot consent to face recognition simply by uploading a photo to an application, because the person may not anticipate that the photo will be used for this purpose and the photo could contain personal data of other individuals.

But the opinion also recognizes that strictly requiring informed consent would effectively prohibit many novel uses of the technology. For example, a social network provider may not know whether an individual in a photo has consented to automatic recognition until it identifies her. In that situation, the provider may initially process the photo to determine whether the individual has consented, but must delete all the resulting data if it turns out that the individual has not. The provider may also have to encrypt its data where necessary to keep it secure.

While clarifying how the Directive applies to novel uses of face recognition in online and mobile applications, the opinion seeks to provide some flexibility to avoid banning certain applications outright. Because it does not focus on any particular application, it may also serve as guidance for future innovation in this area. Stay tuned as EFF continues covering this issue!