May 1, 2011

Minutiae reduction

Fingerprint use is nearly as old as recorded history.  Fingerprints have been found on Babylonian clay tablets, seals, and pottery dating back to the second millennium BCE.  By around 246 BCE, Chinese officials were using fingerprints to seal documents.  Whether they yet knew that no two fingerprints are alike is not fully known, but the records show that they did use fingerprints as a means of identification.  By 650 CE a Chinese historian mentioned that fingerprints could be used as a means of identification.  For more information, see Wikipedia.

The first modern fingerprint database was created in 1892 in Argentina, where fingerprint evidence led a murderer to confess to her crimes.  Fingerprints were then used extensively by imprinting them on paper, a method still used today in some places.

Fingerprint recognition begins with matching three major pattern types: the whorl, the arch, and the loop.  These patterns are distinguishable with the human eye and provide a general sense of fingerprint identification.  To really get into the detail, however, you need to observe the minutiae.
(Images: whorl pattern; loop pattern)

The most important characteristics of fingerprints are the minutiae, which is just a fancy word for local (small-scale) ridge features such as ridge endings and bifurcations.  When fingerprints are imprinted on paper and then scanned, inaccuracies can arise at every step of the procedure.

One thing that can reduce the accuracy of fingerprint scanning is the accidental addition or removal of minutiae.  This problem can occur for several different reasons, including smudges, noisy regions, poor image contrast, or low overall image quality.

Research has been done into creating an algorithm (a sequence of steps) to eliminate false minutiae and to add back minutiae that were lost in the scan.  This research was very effective, reducing the number of false minutiae by about 40% compared to the basic crossing-number method of extraction.
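For the curious, the crossing-number extraction that the researchers improve upon is a standard technique: on a thinned (one-pixel-wide) ridge skeleton, half the number of 0-to-1 transitions around a pixel's eight neighbors tells you whether that pixel is a ridge ending or a bifurcation.  Here is a minimal sketch of that baseline (the function name and toy image are my own; this is the basic method, not the paper's improved algorithm):

```python
def crossing_number_minutiae(skeleton):
    """Find minutiae in a thinned binary ridge image (a list of lists
    of 0s and 1s) using the classic crossing-number test.  Returns
    (endings, bifurcations) as lists of (row, col) coordinates."""
    rows, cols = len(skeleton), len(skeleton[0])
    # Eight neighbors listed in circular order around a pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    endings, bifurcations = [], []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            if skeleton[r][c] != 1:
                continue
            neigh = [skeleton[r + dr][c + dc] for dr, dc in offsets]
            # Crossing number = half the transitions around the pixel
            cn = sum(abs(neigh[i] - neigh[(i + 1) % 8]) for i in range(8)) // 2
            if cn == 1:          # one ridge leaves the pixel: ridge ending
                endings.append((r, c))
            elif cn == 3:        # three ridges meet: bifurcation
                bifurcations.append((r, c))
    return endings, bifurcations
```

Running this on a tiny skeleton containing a single short ridge picks out the ridge's two endpoints as endings, and you can see how a single stray noise pixel next to a ridge would create a false bifurcation.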

This improvement comes from the algorithm that the researchers developed, which applies several post-processing steps after the scan; those steps are what reduce the error.
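The paper's exact steps aren't reproduced here, but to give a flavor of what post-processing can look like: spurious minutiae from smudges and noise tend to appear in tight clusters, so one common style of cleanup rule is to prune minutiae that sit too close together.  A sketch of that idea (the function name and distance threshold are my own choices, not the authors'):

```python
import math

def remove_close_minutiae(minutiae, min_dist=8):
    """Drop any minutia lying within `min_dist` pixels of another one.
    Genuine minutiae are usually well separated, while smudge artifacts
    come in clusters.  Illustrative filter only, not the exact
    post-processing algorithm from the Akram et al. paper."""
    keep = []
    for i, (r1, c1) in enumerate(minutiae):
        too_close = any(
            math.dist((r1, c1), (r2, c2)) < min_dist
            for j, (r2, c2) in enumerate(minutiae) if i != j
        )
        if not too_close:
            keep.append((r1, c1))
    return keep
```

Given a pair of points two pixels apart plus one isolated point, the filter discards the close pair and keeps the isolated, presumably genuine, minutia.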

You may be wondering: why should I care about an algorithm that scans fingerprints from paper when fingerprints are now being scanned digitally? Good question, and the answer is that while most scanning now occurs digitally, there are still lots of fingerprints that exist only on paper.  The article is also relatively young, having been released in 2008, so it is still relevant.
Source: M.U. Akram et al., "Fingerprint image: pre- and post-processing," International Journal of Biometrics, vol. 1 (2008).


  1. Even with digital fingerprint recognition I still ran into problems. My job in Afghanistan required use of a digital print reader, and every day I'd run into people whose prints would take a dozen tries to match.

  2. Carl's comment is interesting--it suggests there may be a gap between the promise of these technologies and their implementation.

    And you're right that they used biometrics to identify Osama, but it was the DNA that sealed the deal. It will be interesting to see if we ever really "trust" biometrics to be the final word.

  3. @ Carl- I did not know it was annoying; that is interesting. Since for the Army biometrics post I used the Army's website as the source, it only highlighted the positives.

    @ Jen- Good point, and in my opinion even as a fan of biometrics they will never truly replace DNA. Especially with how easy it is becoming to sequence DNA.