
When Apple introduced the iPhone X on September 12th, 2017 at the Steve Jobs Theater, the company revealed a facial recognition technology it calls 'Face ID'.

Face ID was initially created for the iPhone X because the device's bezel-free design leaves no room for the familiar round Home button. Since Apple has yet to build Touch ID into a touchscreen display, Face ID replaces Touch ID on the iPhone X.

Just like Touch ID, Face ID allows users to unlock Apple devices, make purchases in the various Apple digital media stores, and authenticate Apple Pay online or in apps.

Face ID works by projecting more than 30,000 invisible infrared dots onto a face and producing a 3D mesh. To do this, the technology uses the "TrueDepth" camera system, which includes an infrared camera, a flood illuminator, a dot projector, and a proximity sensor.

To set Face ID up, Apple showed, the user needs to present all angles of his or her face during enrollment. This ensures that Face ID can create the 3D face mesh it needs.

The invisible infrared light helps Face ID identify the user's face even when the surroundings are dimly lit. The dot projector then casts more than 30,000 invisible infrared dots onto the user's face to build a unique but identifiable facial map from the shape and contours of the face.

The infrared camera then reads that dot pattern by capturing an infrared image.
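Conceptually, the depth of each dot can be recovered from how far it appears to shift between the projector and the camera, as in classic structured-light systems. A toy Python sketch of that relation (the baseline and focal-length values are made up for illustration; Apple has not published TrueDepth's actual geometry):

```python
def depth_from_disparity(disparity_px, baseline_mm=10.0, focal_px=1000.0):
    """Toy structured-light relation: a projected dot's depth is inversely
    proportional to its observed shift (disparity) between projector and
    camera. Baseline and focal length are illustrative values only."""
    return baseline_mm * focal_px / disparity_px

# A dot that shifts less is farther away: a 50-pixel shift maps to 200 mm,
# a 100-pixel shift to 100 mm, under these made-up parameters.
```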

Neural networks, designed loosely after the human brain, then use the dot pattern to create a mathematical model of the user's face, which is stored in the Secure Enclave on the iPhone X itself. For security reasons, the facial recognition data is kept locally on the device and is not stored in the cloud.
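The stored "mathematical model" can be pictured as a fixed-length vector of numbers, with unlocking reduced to a similarity check between the enrolled vector and a freshly captured one. A minimal Python sketch (the cosine-similarity measure and the threshold are illustrative assumptions, not Apple's actual matching method):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two equal-length numeric vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def matches(enrolled, candidate, threshold=0.95):
    """Unlock only if the new capture is close enough to the stored model.
    The threshold is a made-up value for illustration."""
    return cosine_similarity(enrolled, candidate) >= threshold
```

An identical vector scores 1.0 and passes; a very different face vector falls below the threshold and is rejected.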

Face ID relies on a built-in neural engine and the TrueDepth system to perform the computational heavy lifting required to process users' faces. This allows the technology to learn the user's face over time and improve security: the more the TrueDepth system is used, the better it understands the user's face. For example, it can recognize the same face despite changes in skin tone or hairstyle, a newly grown beard, or the user wearing glasses or a hat.
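One way to picture this kind of continual learning is a stored template that is gradually blended with each newly matched capture, so the model drifts along with slow changes such as a growing beard. A hypothetical Python sketch (the blending scheme and rate are assumptions, not Apple's algorithm):

```python
def adapt_template(enrolled, new_sample, rate=0.1):
    """Blend a freshly matched sample into the stored template so the
    model tracks gradual appearance changes. The exponential-moving-average
    scheme and the 0.1 rate are illustrative assumptions only."""
    return [(1 - rate) * e + rate * n for e, n in zip(enrolled, new_sample)]
```

Each successful unlock nudges the template a little toward the latest capture while keeping most of the enrolled data.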

Face ID also requires "user attention": the user's eyes must be open and directed at the phone for Face ID to unlock it.

To protect against spoofing and hardware hacks, Apple's biometric systems are automatically disabled after a predetermined number of unsuccessful attempts. Touch ID, for example, gives users five attempts to authenticate with a registered finger before requiring a passcode; Face ID allows only two tries before being disabled.
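The lockout behavior described above amounts to a simple failure counter. A small Python sketch using the attempt limits quoted in this article (the class and its API are hypothetical, not a real iOS interface):

```python
class BiometricLock:
    """Illustrative lockout logic: after max_attempts consecutive failed
    matches, biometric unlock is disabled until the passcode is entered."""

    def __init__(self, max_attempts):
        self.max_attempts = max_attempts
        self.failures = 0
        self.disabled = False

    def attempt(self, matched):
        if self.disabled:
            return "passcode required"
        if matched:
            self.failures = 0       # success resets the counter
            return "unlocked"
        self.failures += 1
        if self.failures >= self.max_attempts:
            self.disabled = True    # biometric path shut off
            return "passcode required"
        return "try again"
```

With the article's figures, Touch ID would be modeled as `BiometricLock(5)` and Face ID as `BiometricLock(2)`: after two failed face matches, even a correct face no longer unlocks the device.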

As of 2016, facial recognition was still not effective enough for most applications, even though accuracy had improved. Although such systems are often advertised as having accuracy near 100 percent (as Apple does with Face ID), this can be misleading: the studies behind these claims usually use smaller sample sizes than would be necessary for large-scale applications.

Because facial recognition is not completely accurate, such systems typically produce a list of potential matches rather than a single definitive identification.

Apple acknowledges this. While Apple says the probability of someone else unlocking an iPhone with Face ID is 1 in 1,000,000, compared with 1 in 50,000 for Touch ID, it advises users who have an identical twin to use the traditional passcode instead.
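Put numerically, Apple's quoted odds translate to a 20-fold improvement over Touch ID. Simple arithmetic on the figures above (the odds themselves are Apple's claims, not independent measurements):

```python
# False-accept odds quoted by Apple: the chance that a random other
# person can unlock the device.
FACE_ID_ODDS = 1_000_000   # Face ID: 1 in 1,000,000
TOUCH_ID_ODDS = 50_000     # Touch ID: 1 in 50,000

# Face ID's quoted false-accept rate is 20x lower than Touch ID's.
improvement = FACE_ID_ODDS / TOUCH_ID_ODDS  # 20.0
```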

So again, while Face ID and Touch ID are great on their own, there is still no replacement for passwords, at least not yet.

The future of biometrics is multimodal. The market for biometric identity verification will certainly grow, but no single technology will beat all of the others. Instead, biometric technologies and other security measures will be integrated with one another depending on the application and specific use case.