Benchmark area: Single-image Morph Attack Detection

This benchmark area contains face morphing detection benchmarks. Morphing detection consists of analyzing a face image to determine whether or not it is the result of a morphing process (mixing the faces of two subjects). Algorithms submitted to these benchmarks are required to analyze a suspected morph image and produce a score representing the probability that the image is morphed.

Benchmarks

Currently, this benchmark area contains the following benchmarks:

  • SMAD-TEST: A simple dataset useful to test algorithm compliance with the testing protocol (results obtained on this benchmark are only visible in the participant private area and cannot be published).
  • SMAD-BIOLAB-1.0: A dataset containing high-resolution face images with frontal pose, neutral expression and good illumination. The morphed images have been generated from manually selected landmarks and finally manually retouched. The dataset contains the morphed images used in [1] and the genuine images used to generate them (some images contribute to multiple morphs).
  • SMAD-MORPHDB_D-1.0: A dataset of high-quality images, with frontal pose, natural expression and good illumination. The morphed images have been generated from automatically detected landmarks, and finally manually retouched. The dataset is described in [2].
  • SMAD-MORPHDB_P&S-1.0: A dataset containing the same images as SMAD-MORPHDB_D-1.0, printed on high-quality photographic paper by a professional photographer and scanned at 300 DPI. The dataset is described in [2].
  • SMAD-SOTAMD_D-1.0: A dataset containing high-resolution digital face images with frontal pose, natural expression and good illumination collected in the SOTAMD project [3].
  • SMAD-SOTAMD_P&S-1.0: A dataset containing high-resolution printed and scanned face images with frontal pose, natural expression and good illumination collected in the SOTAMD project [3].
  • SMAD-SOTAMD_PM_D-1.0: A dataset containing a subset of the high-resolution digital face images with frontal pose, natural expression and good illumination collected in the SOTAMD project [3]. In particular, it contains only manually post-processed morphed images.
  • SMAD-SOTAMD_UC_P&S-1.0: A dataset containing a subset of the high-resolution printed and scanned face images with frontal pose, natural expression and good illumination collected in the SOTAMD project [3]. In particular, it contains only uncompressed images.

The table below reports the main characteristics of each benchmark:

Benchmark                Format             Morphing Factor  Minimum Eye Distance  Maximum Eye Distance  Bona Fide Attempts  Morphing Attempts
SMAD-TEST                Digital            ~[0.4;0.5]       70                    160                   10                  10
SMAD-BIOLAB-1.0          Digital            ~[0.4;0.5]       90                    140                   88                  80
SMAD-MORPHDB_D-1.0       Digital            ~[0.3;0.4]       80                    310                   130                 100
SMAD-MORPHDB_P&S-1.0     Printed & Scanned  ~[0.3;0.4]       90                    150                   130                 100
SMAD-SOTAMD_D-1.0        Digital            0.3 and 0.5      90                    1020                  300                 2045
SMAD-SOTAMD_P&S-1.0      Printed & Scanned  0.3 and 0.5      80                    170                   1096                3703
SMAD-SOTAMD_PM_D-1.0     Digital            0.3 and 0.5      90                    1020                  300                 470
SMAD-SOTAMD_UC_P&S-1.0   Printed & Scanned  0.3 and 0.5      90                    120                   200                 380

The following sections report the testing protocol and the performance indicators common to all benchmarks in this area.

Protocol

Participants can provide each algorithm in two different forms:

    Win32 console application

  • The executable (detectMorph.exe) will take the input from command-line arguments and will append the output to a text file. It evaluates a single face image and produces a morph score; the command-line syntax is:
    detectMorph.exe <suspectedmorphfile> <label> <outputfile>
    where:
    suspectedmorphfile the suspected morph face image pathname. The supported image formats are: .BMP, .JPG, .PNG, .JP2, .JPF
    label an integer describing the format of the input image:
    - unknown photo format/origin (0),
    - non-scanned digital photo (1),
    - a photo that is printed, then scanned (2).
    outputfile the output text-file, where a log string (of the form suspectedmorphfile result isMorph score) must be appended.
    - result is "OK" if the detection can be performed or "FAIL" if the detection cannot be executed by the algorithm;
    - isMorph is "TRUE" if the image contains a morph or "FALSE" otherwise;
    - score is a floating-point value ranging from 0 to 1 representing how confident the algorithm is that the image contains a morph: 0 means certainty that the image does not contain a morph and 1 means certainty that it does.

  • The executable has to operate only on the explicitly-given inputs, without exploiting any learning technique or template consolidation/update based on previous comparisons.
  • C, C# and Python language skeletons for detectMorph.exe are available on the download page to reduce participants' implementation effort.
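The expected input/output contract of detectMorph can be sketched in Python. This is only an illustrative skeleton, not one of the official skeletons from the download page: detect_morph is a placeholder whose analysis logic is entirely hypothetical, while append_log follows the log-line format described above.

```python
import sys


def detect_morph(suspected_morph_file: str, label: int):
    """Placeholder detector returning (result, isMorph, score).

    A real algorithm would analyze the image here; this stub only
    illustrates the expected return values of the protocol.
    label: 0 = unknown origin, 1 = digital photo, 2 = printed & scanned.
    """
    try:
        # Hypothetical analysis step -- replace with an actual detector.
        score = 0.5
        return ("OK", score >= 0.5, score)
    except Exception:
        # If detection cannot be executed, report FAIL; the evaluation
        # will count it as a morph detection (isMorph=TRUE, score=1).
        return ("FAIL", True, 1.0)


def append_log(output_file: str, suspected_morph_file: str,
               result: str, is_morph: bool, score: float) -> None:
    # Append one log line of the required form:
    #   suspectedmorphfile result isMorph score
    with open(output_file, "a") as f:
        f.write(f"{suspected_morph_file} {result} "
                f"{'TRUE' if is_morph else 'FALSE'} {score:.6f}\n")


if __name__ == "__main__" and len(sys.argv) == 4:
    path, label, out = sys.argv[1], int(sys.argv[2]), sys.argv[3]
    result, is_morph, score = detect_morph(path, label)
    append_log(out, path, result, is_morph, score)
```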

    Linux dynamically-linked library

  • Specific guidelines describing how to submit a NIST FRVT MORPH algorithm to FVC-onGoing are available on the download page. Please read them carefully before submitting your algorithm.

Constraints

During test execution the following constraints will be enforced:

Benchmark                Maximum time for each detection  Memory allocation limit for detection process
SMAD-TEST                10 seconds                       No limit
SMAD-BIOLAB-1.0          10 seconds                       No limit
SMAD-MORPHDB_D-1.0       10 seconds                       No limit
SMAD-MORPHDB_P&S-1.0     10 seconds                       No limit
SMAD-SOTAMD_D-1.0        10 seconds                       No limit
SMAD-SOTAMD_P&S-1.0      10 seconds                       No limit
SMAD-SOTAMD_PM_D-1.0     10 seconds                       No limit
SMAD-SOTAMD_UC_P&S-1.0   10 seconds                       No limit

Each detection attempt that violates one of the above constraints results in a failure to detect.

The following time breaks are enforced between two consecutive submissions to the same benchmark by the same participant.

Benchmark                Minimum break
SMAD-TEST                12 hour(s)
SMAD-BIOLAB-1.0          30 day(s)
SMAD-MORPHDB_D-1.0       30 day(s)
SMAD-MORPHDB_P&S-1.0     30 day(s)
SMAD-SOTAMD_D-1.0        30 day(s)
SMAD-SOTAMD_P&S-1.0      30 day(s)
SMAD-SOTAMD_PM_D-1.0     30 day(s)
SMAD-SOTAMD_UC_P&S-1.0   30 day(s)

Performance Evaluation

For each algorithm, bona fide (evaluating bona fide face images) and morph (evaluating morphed face images) attempts are performed to compute the Bona fide Presentation Classification Error Rate (BPCER) and the Attack Presentation Classification Error Rate (APCER). As defined in [4], BPCER is the proportion of bona fide presentations falsely classified as morphing presentation attacks, while APCER is the proportion of morphing attack presentations falsely classified as bona fide presentations.

Although it is possible to reject images, this is strongly discouraged: rejections are fused with the other error rates when computing the performance indicators. In particular, each rejection is counted as a detection of a morphed image (isMorph="TRUE" and score=1).
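The rejection policy can be made concrete with a short parsing sketch in Python (parse_log_line is an illustrative helper name, not part of the protocol): a FAIL line is scored exactly like a certain morph detection, which is why rejections on bona fide images inflate BPCER.

```python
def parse_log_line(line: str):
    """Parse one 'suspectedmorphfile result isMorph score' log line,
    applying the rejection policy described above: a FAIL result is
    counted as a certain morph detection (isMorph=TRUE, score=1)."""
    path, result, is_morph, score = line.split()
    if result == "FAIL":
        # Rejection -> treated as a detection of a morphed image.
        return path, True, 1.0
    return path, is_morph == "TRUE", float(score)
```

For example, parse_log_line("img001.png FAIL FALSE 0.0") yields ("img001.png", True, 1.0), regardless of the isMorph and score fields the algorithm wrote.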

For each algorithm the following performance indicators are reported:

  • EER (detection Equal Error Rate: the error rate at which BPCER and APCER are equal)
  • BPCER10 (the lowest BPCER for APCER ≤ 10%)
  • BPCER20 (the lowest BPCER for APCER ≤ 5%)
  • BPCER100 (the lowest BPCER for APCER ≤ 1%)
  • REJNBFRA (Number of bona fide face images that cannot be processed)
  • REJNMRA (Number of morphed face images that cannot be processed)
  • Average detection time
  • Maximum amount of memory allocated
  • Bona fide and Morph detection score distributions
  • APCER(t)/BPCER(t) curves, where t is the detection threshold
  • DET curve (the plot of BPCER against APCER as the threshold t varies)
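How the error rates above follow from the detection scores can be sketched as follows. This is a minimal Python illustration under the definitions given in this section, not the evaluation code used by the platform; the function names and the threshold grid are my own choices.

```python
def bpcer(bona_fide_scores, t):
    # Bona fide images with score >= t are falsely classified as morphs.
    return sum(s >= t for s in bona_fide_scores) / len(bona_fide_scores)


def apcer(morph_scores, t):
    # Morphed images with score < t are falsely classified as bona fide.
    return sum(s < t for s in morph_scores) / len(morph_scores)


def eer(bona_fide_scores, morph_scores, steps=1000):
    # Scan thresholds in [0, 1] and return the rate where BPCER ~= APCER.
    _, t = min((abs(bpcer(bona_fide_scores, t) - apcer(morph_scores, t)), t)
               for t in (i / steps for i in range(steps + 1)))
    return (bpcer(bona_fide_scores, t) + apcer(morph_scores, t)) / 2


def bpcer_at_apcer(bona_fide_scores, morph_scores, max_apcer, steps=1000):
    # Lowest BPCER among thresholds where APCER <= max_apcer; e.g.
    # max_apcer=0.10 gives BPCER10, 0.05 gives BPCER20, 0.01 gives BPCER100.
    return min(bpcer(bona_fide_scores, t)
               for t in (i / steps for i in range(steps + 1))
               if apcer(morph_scores, t) <= max_apcer)
```

Sampling the pairs (APCER(t), BPCER(t)) over the same threshold grid also yields the points of the DET curve.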

Terms and Conditions

All publications and works that cite SMAD Benchmark Area must reference [3].

Bibliography

[1] M. Ferrara, A. Franco and D. Maltoni, "On the Effects of Image Alterations on Face Recognition Accuracy", in Thirimachos Bourlai, Face Recognition Across the Electromagnetic Spectrum, Springer, 2016.
[2] M. Ferrara, A. Franco and D. Maltoni, "Face Demorphing", IEEE Transactions on Information Forensics and Security, vol.13, no.4, pp.1008-1017, April 2018.
[3] K. Raja, M. Ferrara, A. Franco, L. Spreeuwers, I. Batskos, F. de Wit, M. Gomez-Barrero, U. Scherhag, D. Fischer, S. Venkatesh, J. M. Singh, G. Li, L. Bergeron, S. Isadskiy, R. Ramachandra, C. Rathgeb, D. Frings, U. Seidel, F. Knopjes, R. Veldhuis, D. Maltoni, C. Busch, "Morphing Attack Detection -- Database, Evaluation Platform and Benchmarking", arXiv:2006.06458, June 2020 (submitted to IEEE TIFS).
[4] ISO/IEC JTC1 SC37 Biometrics, ISO/IEC IS 30107-3:2017, IT – Biometric presentation attack detection – Part 3: Testing and Reporting, 2017.


For information or suggestions: fvcongoing@csr.unibo.it
Copyright © 2020 Biometric System Laboratory