Fundamentals of Fingerprint Recognition

Fingerprints are the patterns of tiny ridges, whorls, and valleys on the tip of each finger. They form in the womb, shaped by pressure on a baby's developing fingers. No two people have ever been found to share the same fingerprints; the chance that your fingerprint exactly matches someone else's has been estimated at one in 64 billion. In this respect, fingerprints are even more distinctive than DNA, the genetic material in each of our cells: identical twins can share the same DNA, or at least most of it, but they cannot have the same fingerprints.

What is Fingerprint Recognition?

Fingerprint identification is one of the most well-known and widely publicized biometrics. Because of their uniqueness and consistency over time, fingerprints have been used for identification for over a century, and more recently the process has been automated (i.e. made a biometric) thanks to advances in computing. Fingerprint identification is popular because of the inherent ease of acquisition, the numerous sources available for collection (ten fingers), and its established use and collection by law enforcement and immigration agencies.

History

There are records of fingerprints being taken many centuries ago, although the methods were nowhere near as sophisticated as they are today. The ancient Babylonians pressed their fingertips into clay to record business transactions, and the Chinese used ink-on-paper finger impressions both for business and to help identify their children.

Today, digital scanners capture an image of the fingerprint. To create a digital fingerprint, a person places a finger on an optical or silicon reader surface and holds it there for a few seconds. The reader converts the information from the scan into digital data patterns. The computer then maps points on the fingerprint and uses those points to search for similar patterns in the database.
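To make the point-matching idea concrete, here is a minimal sketch in Python. It assumes, for illustration only, that the mapped points (often called minutiae) are stored as positions with a ridge angle, and that two fingerprints are compared by counting points that line up within small tolerances. The `Minutia` type, the `match_score` function, and its tolerance parameters are all invented for this example; real fingerprint matchers use far more robust alignment and scoring.

```python
# Illustrative sketch only: names and tolerances here are assumptions,
# not a real fingerprint-matching API.
from dataclasses import dataclass
from math import hypot


@dataclass(frozen=True)
class Minutia:
    x: float      # position on the scanned image (e.g. pixels)
    y: float
    angle: float  # ridge direction at the point, in degrees


def match_score(template, candidate, dist_tol=10.0, angle_tol=15.0):
    """Return the fraction of template points with a close counterpart
    in the candidate (close in both position and ridge angle)."""
    matched = 0
    used = set()  # each candidate point may match at most one template point
    for t in template:
        for i, c in enumerate(candidate):
            if i in used:
                continue
            close = hypot(t.x - c.x, t.y - c.y) <= dist_tol
            # compare angles on a circle so 359 deg and 1 deg count as close
            delta = abs(t.angle - c.angle) % 360
            aligned = min(delta, 360 - delta) <= angle_tol
            if close and aligned:
                matched += 1
                used.add(i)
                break
    return matched / len(template)


# A stored template and a fresh scan of (roughly) the same finger:
stored = [Minutia(10, 10, 90), Minutia(40, 25, 180), Minutia(70, 60, 45)]
fresh = [Minutia(12, 11, 92), Minutia(41, 24, 178), Minutia(300, 300, 0)]
score = match_score(stored, fresh)  # 2 of 3 template points line up
```

In a real system the database search would compute such a score against many stored templates and accept a match only above a chosen threshold, trading off false accepts against false rejects.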