Category: Biometrics

A legislative approach that defines any biometric measurement as ‘sensitive’ shows a lack of common sense and of understanding of the science.

A better approach would be to protect those aspects of sensitive personal information (eg sexuality, political opinion, racial/ethnic origin) collected by any means, making the legislation independent of the technology.

An interesting paper was published in the most recent International Journal of Biometrics. Finnish scientists have developed a biometric measure based on saccades, the rapid, involuntary eye movements in which both eyes move quickly in the same direction. Using a video camera to record these movements, the measure can be correlated strongly with an individual.

What is important is that large numbers of these life (bio) measurements (metrics) are being discovered as scientists look more closely at human physiology and behaviour.

Biometric identification technologies convert biometric information (eg eye movement) into a series of digits (a hash), which can be statistically compared against another series of digits collected earlier, when the individual enrolled to use a system (eg building access control). A biometric ‘match’ is a comparison between the number derived from the biometric collected at enrolment and the number elicited during verification. In the real world, these ‘numbers’ are nearly always slightly different. The challenge is to tune the system so that a genuine individual achieves a match when he/she seeks verification, while the bad guy is repelled.
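The matching process described above can be sketched in a few lines of Python. This is a toy model, not any real biometric system: templates are simulated as random bit strings, captures of the same person are modelled by flipping a small fraction of bits, and the 25% threshold is an illustrative assumption.

```python
# Toy sketch of biometric verification. All names, seeds and the
# threshold are illustrative assumptions, not a real system's values.
import random

TEMPLATE_BITS = 256
MATCH_THRESHOLD = 0.25  # max fraction of differing bits for a 'match'

def capture_template(identity_seed, capture_seed=0, noise=0.0):
    """Simulate capturing a biometric as a bit string.
    `noise` models the natural variation between captures of one person."""
    base = random.Random(identity_seed)
    bits = [base.randint(0, 1) for _ in range(TEMPLATE_BITS)]
    flip = random.Random(capture_seed)
    return [b ^ (1 if flip.random() < noise else 0) for b in bits]

def hamming_fraction(a, b):
    """Fraction of positions where the two templates disagree."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def verify(enrolled, presented):
    """Statistical comparison: close enough counts as a match."""
    return hamming_fraction(enrolled, presented) <= MATCH_THRESHOLD

enrolled = capture_template("alice", capture_seed=1)
same_person = capture_template("alice", capture_seed=2, noise=0.05)
impostor = capture_template("bob", capture_seed=3)

print(verify(enrolled, same_person))  # genuine attempt: slightly different digits, still a match
print(verify(enrolled, impostor))     # impostor: roughly half the bits differ, rejected
```

Raising the threshold lets more genuine users through but also admits more impostors; the whole engineering trade-off in a biometric system lives in that one number.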

Generally speaking, biometric identity systems are not designed to capture information that could be used to infer sensitive personal information. Nor is it practical to reverse-engineer the biometric, because of the deliberate use of one-way mathematical functions and the reduction of the quantity of data collected. This means that someone with access to this series of digits would be hard pressed to elicit any information that might be used to discriminate against another person.
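The one-way property is easy to demonstrate. In this hypothetical sketch, only a cryptographic digest of a quantised template is stored; the digest can be compared but not feasibly inverted. (A bare hash is a simplification: because hashes only match on identical input, real systems that tolerate capture-to-capture variation use dedicated template-protection schemes rather than this.)

```python
# Hypothetical sketch of one-way template protection.
# A bare SHA-256 digest stands in for the one-way function;
# real biometric systems use error-tolerant protection schemes.
import hashlib

def protect(template_bits):
    """Reduce the template to bytes and apply a one-way hash.
    The stored digest cannot feasibly be inverted to recover the
    raw measurement, let alone anything sensitive about the person."""
    return hashlib.sha256(bytes(template_bits)).hexdigest()

stored = protect([1, 0, 1, 1, 0, 0, 1, 0])      # kept by the system
candidate = protect([1, 0, 1, 1, 0, 0, 1, 0])   # presented later

print(stored == candidate)  # identical input gives an identical digest
print(len(stored))          # 64 hex characters that reveal nothing biometric
```

Whoever steals the stored digits gets a string of hexadecimal characters, not an eye movement, a face or a medical condition.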

The word ‘biometric’ seems to send shivers down the spines of some privacy advocates. I suggest this is because most, if not all, are not scientists but lawyers. Yet these biometric systems are just the current technology. Many critics of biometrics forget that, like any tool, their effect depends on how they are used. The old saying that fire is ‘a good servant, but a bad master’ is equally true of biometrics.

What seems lacking in common sense is that legislation in several countries (including Australia) puts up barriers to the use of biometrics for purposes that protect the privacy and safety of people and organisations.

The information that a biometric collects is not necessarily sensitive: I don’t really care if you know how often I blink. In fact, a photo of me is more likely to reveal information that I am sensitive about.

The danger with this approach is that people focus on the technology being ‘bad’ and not on the fact that it is the sensitive information which is potentially harmful. Biometrics can be privacy enhancing, particularly as they can add additional layers to securing claims about identity and be used to protect individuals and organisations from becoming victims of identity fraud.

Disaggregating biometrics from ‘sensitive information’, and assessing technology on the basis of what sensitive information (gender, medical information, religious affiliation etc) it collects about an individual, would provide a more appropriate course for protecting personal information. It would also avoid stifling the practical application of the technology.