Benchmark area: Fingerprint Verification

This benchmark area contains fingerprint verification benchmarks. Fingerprint verification consists of comparing two fingerprints to determine whether or not they are impressions of the same finger (one-to-one comparison). Algorithms submitted to these benchmarks are required to enroll fingerprints into proprietary or standard templates and to compare such templates to produce a similarity score.


Currently, this benchmark area contains the following benchmarks:

  • FV-TEST: A simple dataset useful for testing an algorithm's compliance with the testing protocol (results obtained on this benchmark are only visible in the participant's private area and cannot be published).
  • FV-STD-1.0: Contains fingerprint images acquired in operational conditions using high-quality optical scanners. Results should reflect the expected accuracy in large-scale fingerprint-based applications.
  • FV-HARD-1.0: Contains a significant number of difficult cases (noisy images, distorted impressions, etc.) that make fingerprint verification more challenging. Results do not necessarily reflect the expected accuracy in real applications, but they allow better discrimination among the performance of different fingerprint recognition algorithms.

The table below reports the main characteristics of each benchmark:

Benchmark     Scanner Type  Resolution  Min. Image Size  Max. Image Size  Genuine Attempts  Impostor Attempts
FV-TEST       Optical       500 dpi     440x500          440x500          280               45
FV-STD-1.0    Optical       500 dpi     440x500          440x500          27720             87990
FV-HARD-1.0   Optical       500 dpi     260x374          448x500          19320             20850

Some sample images with the same format used in the benchmarks of this area are available in the download page.

The following sections report the testing protocol and the performance indicators common to all benchmarks in this area.


Testing Protocol

Each participant is required to submit, for each algorithm, two executables in the form of Win32 console applications.

  • Both executables take their input from command-line arguments and append their output to a text file.
  1. The first executable (enroll.exe) enrolls a fingerprint image and produces a template file; the command-line syntax is:
    enroll.exe <imagefile> <templatefile> <outputfile>
    imagefile the input image pathname
    templatefile the output template pathname
    outputfile the output text file, to which a log string of the form "imagefile templatefile result" must be appended; result is "OK" if the enrollment can be performed or "FAIL" if the input image cannot be processed by the algorithm

  2. The second executable (match.exe) matches two fingerprint templates and produces a similarity score; the command-line syntax is:
    match.exe <templatefile1> <templatefile2> <outputfile>
    templatefile1 the first input template pathname
    templatefile2 the second input template pathname
    outputfile the output text file, to which a log string of the form "templatefile1 templatefile2 result similarity" must be appended; result is "OK" if the matching can be performed or "FAIL" if the matching cannot be executed by the algorithm; similarity is a floating-point value in the range [0, 1] indicating the similarity between the two templates (0 means no similarity, 1 maximum similarity)
  • Both executables must operate only on the explicitly given inputs, without exploiting any learning technique or template consolidation/update based on previous enrollments/matches.
  • C, C# and Matlab language skeletons for enroll.exe and match.exe are available in the download page to reduce the participants' implementation effort.
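The protocol above can be sketched as follows. This is a minimal illustration in Python, not a valid submission (the benchmarks require Win32 console executables); extract_template and compare_templates are hypothetical placeholders for the participant's actual algorithm. What matters is the log-line format appended to the output file.

```python
def extract_template(imagefile):
    """Hypothetical placeholder for the participant's feature extractor."""
    with open(imagefile, "rb") as f:
        return f.read()[:64]  # dummy "template"

def compare_templates(t1, t2):
    """Hypothetical placeholder matcher; must return a similarity in [0, 1]."""
    return 1.0 if t1 == t2 else 0.0

def append_log(outputfile, *fields):
    # Log strings are APPENDED to the output file, space-separated, one per attempt.
    with open(outputfile, "a") as f:
        f.write(" ".join(str(x) for x in fields) + "\n")

def enroll(imagefile, templatefile, outputfile):
    """Mimics: enroll.exe <imagefile> <templatefile> <outputfile>"""
    try:
        tpl = extract_template(imagefile)
        with open(templatefile, "wb") as f:
            f.write(tpl)
        append_log(outputfile, imagefile, templatefile, "OK")
    except Exception:
        # Image could not be processed: log FAIL, as the protocol requires.
        append_log(outputfile, imagefile, templatefile, "FAIL")

def match(templatefile1, templatefile2, outputfile):
    """Mimics: match.exe <templatefile1> <templatefile2> <outputfile>"""
    try:
        with open(templatefile1, "rb") as f1, open(templatefile2, "rb") as f2:
            score = compare_templates(f1.read(), f2.read())
        append_log(outputfile, templatefile1, templatefile2, "OK", score)
    except Exception:
        append_log(outputfile, templatefile1, templatefile2, "FAIL")
```

In a real submission the same logic would sit behind the command-line entry points of enroll.exe and match.exe, reading the pathnames from the arguments in the order given above.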


During test execution the following constraints will be enforced:

Benchmark     Max. time per enroll  Max. time per match  Max. template size  Memory limit (enroll/match)
FV-TEST       5 seconds             3 seconds            No limit            No limit
FV-STD-1.0    5 seconds             3 seconds            No limit            No limit
FV-HARD-1.0   5 seconds             3 seconds            No limit            No limit

Each enrollment or matching attempt that violates one of the above constraints results in a failure to enroll or failure to match, respectively.
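How such a time limit maps onto a failure can be sketched with Python's standard subprocess module; this is only an illustration of the rule (a timed-out attempt counts as "FAIL"), not the evaluation system's actual harness, and it does not model the memory limit.

```python
import subprocess

def run_with_timeout(cmd, seconds):
    """Run an executable; a timeout or non-zero exit counts as a failed attempt."""
    try:
        # timeout= kills the process if it exceeds the limit and raises TimeoutExpired;
        # check= raises CalledProcessError on a non-zero exit code.
        subprocess.run(cmd, timeout=seconds, check=True)
        return "OK"
    except (subprocess.TimeoutExpired, subprocess.CalledProcessError):
        return "FAIL"
```

For example, an enroll attempt would be run as run_with_timeout(["enroll.exe", image, template, log], 5) and treated as a failure to enroll if it returns "FAIL".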

The following minimum breaks are enforced between two consecutive submissions to the same benchmark by the same participant:

Benchmark Minimum break
FV-TEST 12 hour(s)
FV-STD-1.0 30 day(s)
FV-HARD-1.0 30 day(s)

Performance Evaluation

For each algorithm, genuine attempts (matching two fingerprints of the same finger) and impostor attempts (matching two fingerprints originating from different fingers) are performed to compute the False Non-Match Rate FNMR (also referred to as False Rejection Rate, FRR) and the False Match Rate FMR (also referred to as False Acceptance Rate, FAR).

Although it is possible to reject images at enrollment, this is strongly discouraged: rejections at enrollment are combined with the other error rates when measuring the final accuracy. In particular, each rejection at enrollment produces a "ghost" template that does not match (matching score 0) any of the remaining templates.

For each algorithm the following performance indicators are reported:

  • REJENROLL (number of fingerprints rejected during enrollment)
  • REJNGRA (number of match attempts rejected during genuine attempts)
  • REJNIRA (number of match attempts rejected during impostor attempts)
  • EER (equal-error-rate)
  • FMR100 (the lowest FNMR for FMR ≤ 1%)
  • FMR1000 (the lowest FNMR for FMR ≤ 0.1%)
  • FMR10000 (the lowest FNMR for FMR ≤ 0.01%)
  • ZeroFMR (the lowest FNMR for FMR = 0%)
  • ZeroFNMR (the lowest FMR for FNMR = 0%)
  • Average enrollment time
  • Average matching time
  • Average and maximum template size
  • Maximum amount of memory allocated
  • Impostor and Genuine score distributions
  • FMR(t)/FNMR(t) curves, where t is the acceptance threshold
  • DET(t) curve
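The threshold-based indicators above can be sketched in Python. This is a minimal sketch assuming the standard definitions (FMR(t) = fraction of impostor scores ≥ t, FNMR(t) = fraction of genuine scores < t), not the evaluation system's actual code; ghost templates from rejected enrollments would simply appear as score-0 genuine attempts in these lists.

```python
def fnmr_at_fmr(genuine, impostor, fmr_target):
    """Lowest FNMR over all thresholds t where FMR(t) <= fmr_target.

    FMR100 = fnmr_at_fmr(g, i, 0.01), FMR1000 = fnmr_at_fmr(g, i, 0.001),
    ZeroFMR = fnmr_at_fmr(g, i, 0.0).
    """
    thresholds = sorted(set(genuine) | set(impostor) | {0.0, 1.0})
    best = 1.0
    for t in thresholds:
        fmr = sum(s >= t for s in impostor) / len(impostor)
        fnmr = sum(s < t for s in genuine) / len(genuine)
        if fmr <= fmr_target:
            best = min(best, fnmr)
    return best

def eer(genuine, impostor):
    """Approximate EER: mean of FMR and FNMR at the threshold where they are closest."""
    thresholds = sorted(set(genuine) | set(impostor) | {0.0, 1.0})
    t = min(thresholds, key=lambda t: abs(
        sum(s >= t for s in impostor) / len(impostor) -
        sum(s < t for s in genuine) / len(genuine)))
    fmr = sum(s >= t for s in impostor) / len(impostor)
    fnmr = sum(s < t for s in genuine) / len(genuine)
    return (fmr + fnmr) / 2
```

The FMR(t)/FNMR(t) curves and the DET curve are obtained from the same two inner sums, evaluated over all thresholds t in [0, 1].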
For information or suggestions: fvcongoing@csr.unibo.it

Copyright © 2024 Biometric System Laboratory