Face Recognition Prize Challenge

About the Challenge
Do you have the most accurate unconstrained face recognition algorithm?

Posted By: Intelligence Advanced Research Projects Activity
Category: Software/Apps
Skill: Software/Apps
Interest: Science & Research
Partnership With: National Institute of Standards and Technology
Submission Dates: 12 a.m. ET, Apr 21, 2017 - 2 p.m. ET, Jun 15, 2017
Judging Dates: Sep 01, 2017 - Sep 30, 2017
Winners Announced: Oct 31, 2017

Have you developed software to identify faces in general web photographs?  Can your software verify that a face in one photograph is the same as in another?  The Intelligence Advanced Research Projects Activity (IARPA), within the Office of the Director of National Intelligence (ODNI), announces the launch of the Face Recognition Prize Challenge (FRPC).  The challenge aims to advance biometric face recognition by improving core face recognition accuracy.


Who We Are: IARPA focuses on high-risk, high-payoff research. The Face Recognition Prize Challenge will improve recognition of face images acquired without capture constraints (i.e., unconstrained images or images in the “wild”).

What We’re Doing: The goal of the Face Recognition Prize Challenge is to improve core face recognition accuracy and expand the breadth of capture conditions and environments suitable for successful face recognition.  The Challenge comes in two parts:  1) Face identification involves executing one-to-many search to return the correct entry from a gallery, if any; 2) Face verification requires the algorithm to match two faces of the same person while correctly rejecting faces of different persons.  Both tasks involve “non-cooperative” images where subjects were unaware of the camera or, at least, did not engage with, or pose for, the camera.

Why We’re Doing This:  Face recognition is hard.  Algorithms are known to commit both false negative and false positive errors, especially when factors such as head pose, illumination, and facial expression depart from formal portrait photograph standards.  IARPA is also aware that a great deal of research has been conducted in recent years with the advent of deep neural network technologies.  IARPA is interested in whether this rich vein of research has produced advances in face recognition accuracy.

Where and When We’re Doing This:  Registration to join the challenge takes place through this Challenge.gov website.  From here participants are directed to register with the National Institute of Standards and Technology (NIST) FRPC Support Website.  Registration closes on June 15, 2017.

  • When does the FRPC launch?  April 21, 2017
  • Where to learn about the challenge, including rules, criteria, and eligibility requirements?  FRPC rules document
  • When does the registration and submission period close?  June 15, 2017
  • Where do participants register?  NIST FRPC Support Website
  • When do the judges meet to determine winners?  September 2017
  • When will the winners be announced?  October 2017

The challenge proceeds with developers sending pre-compiled software libraries to NIST, which is the designated test laboratory for the FRPC.  At NIST the algorithms will be run on sequestered images.  This means the FRPC is not an “open-book” or “take-home” test: neither the test images nor any training images will be made available to developers.

Who should participate:  The FRPC is intended for prize participants who are eligible to compete for the challenge prizes.  IARPA encourages developers of automated face recognition algorithms, both domestic and international, from academia and industry, to participate.  Other U.S. Government agencies, Federally Funded Research and Development Centers (FFRDCs), University Affiliated Research Centers (UARCs), or any other similar organizations that have a special relationship with the Government that gives them access to privileged or proprietary information, or access to Government equipment or real property, are not eligible to participate in the prize challenge.  Entities affiliated with the IARPA Janus program are ineligible to participate.

Read the full rules and challenge eligibility document for the FRPC by downloading this document.

Why Participate: The developers of the most accurate algorithms will be eligible to win cash prizes from a total prize purse of $50,000. Prizes will be awarded for the following criteria:

  • One-to-many identification accuracy:  $25,000
  • One-to-many identification speed: $5,000
  • One-to-one verification accuracy:  $20,000

Related Links:

For general questions: frpc@iarpa.gov
For technical questions: frpc@nist.gov

Judges
Chris Boehnen, Ph.D.
Senior Program Manager / IARPA
Patricia Wolfhope
Program Manager / DHS S&T
Patrick Grother
Staff Scientist / NIST
Judging Criteria

Search Accuracy Prize (i.e., Challenge IDENT Primary Prize)

See the FRPC Rules document for full details on the evaluation. A summary of the prize judging is as follows:

The Primary Prize winner will be declared by considering measurements of identification accuracy. This will be stated as the False Negative Identification Rate (FNIR) measured at the lowest scalar threshold that gives a fixed False Positive Identification Rate (FPIR) no higher than 10^-3.
— FNIR will be measured over many mate searches. FNIR is defined as the proportion of mate searches for which a correct mate is not returned above a threshold, T. Mate searches are those for which the person in the search image has a face image in the enrolled dataset.
— FPIR will be measured over many non-mate searches. FPIR is defined as the proportion of non-mate searches that yield one or more non-mates at or above threshold, T. Non-mate searches are those for which the person in the search image does not have a face image in the enrolled dataset.

The conduct of both mate and non-mate searches defines an open-set, or open-universe, problem. In the case of a tie in identification accuracy between two participants, the algorithm with the lowest median search duration will be declared the Primary Prize winner.
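
To make the judging computation concrete, the following is a minimal sketch of how FNIR at a fixed FPIR might be computed, assuming each search can be summarized by the similarity score of its single best candidate (a simplification of the FRPC protocol, which scores full candidate lists returned from the enrolled gallery). The function and score model are illustrative only and are not the NIST evaluation code.

    import numpy as np

    def fnir_at_fpir(mate_scores, nonmate_scores, target_fpir=1e-3):
        """Illustrative FNIR at a fixed FPIR (hypothetical score model)."""
        mate = np.asarray(mate_scores, dtype=float)
        nonmate = np.asarray(nonmate_scores, dtype=float)

        # Candidate thresholds are the observed non-mate scores: FPIR(T), the
        # fraction of non-mate searches scoring at or above T, only changes at
        # these values and decreases as T grows.
        candidates = np.unique(nonmate)
        fpirs = np.array([(nonmate >= t).mean() for t in candidates])

        feasible = candidates[fpirs <= target_fpir]
        if feasible.size == 0:
            raise ValueError("Target FPIR is not reachable with these scores.")
        threshold = feasible.min()          # lowest T with FPIR <= target

        fnir = (mate < threshold).mean()    # mate searches whose correct mate falls below T
        fpir = (nonmate >= threshold).mean()
        return fnir, fpir, threshold

    # Toy usage with synthetic scores, for illustration only.
    rng = np.random.default_rng(0)
    fnir, fpir, t = fnir_at_fpir(rng.normal(2.0, 1.0, 5_000),
                                 rng.normal(0.0, 1.0, 50_000))
    print(f"FNIR={fnir:.3f} at FPIR={fpir:.4f} (threshold={t:.2f})")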

Search Speed Prize (i.e., Challenge IDENT Secondary Prize)

See the FRPC Rules document for full details on the evaluation. A summary of the prize judging is as follows:

The Secondary Prize will be awarded to the algorithm that a) has FNIR no larger than twice that of the Primary Prize award winner, and b) executes one-to-many template searches with the shortest duration. If the Primary Prize winner’s algorithm is also the fastest, then both prizes will be awarded to the same participant.
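
As a minimal sketch of this selection rule, assuming a per-algorithm summary of FNIR and median search duration is available (the dictionary layout below is hypothetical, not the NIST report format):

    def pick_speed_prize(results, primary_fnir):
        """results: {name: (fnir, median_search_seconds)} -- hypothetical layout.

        Eligible algorithms have FNIR no larger than twice the Primary Prize
        winner's FNIR; the fastest eligible algorithm wins the Secondary Prize.
        """
        eligible = {name: r for name, r in results.items() if r[0] <= 2 * primary_fnir}
        return min(eligible, key=lambda name: eligible[name][1])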

Verification Accuracy Prize (i.e., Challenge VERIF Prize)

See the FRPC Rules document for full details on the evaluation. A summary of the prize judging is as follows:

The winner will be declared by considering measurements of verification accuracy. This will be stated as the False Non-Match Rate (FNMR) measured at the lowest scalar threshold that gives a fixed False Match Rate (FMR) no higher than 10^-3.
— FNMR will be measured over many genuine comparisons. FNMR is defined as the proportion of genuine comparisons that yield a similarity score below a threshold, T.
— FMR will be measured over many impostor comparisons. FMR is defined as the proportion of impostor comparisons that yield a similarity score at or above threshold, T.

In the case of a tie in verification accuracy between two participants, the algorithm with the lowest median template generation duration will be declared the winner.
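
The verification criterion follows the same pattern as the identification sketch above, with genuine and impostor comparison scores in place of mate and non-mate search scores; again, the names and score model are illustrative, not the NIST evaluation code.

    import numpy as np

    def fnmr_at_fmr(genuine_scores, impostor_scores, target_fmr=1e-3):
        """Illustrative FNMR at a fixed FMR for one-to-one verification."""
        genuine = np.asarray(genuine_scores, dtype=float)
        impostor = np.asarray(impostor_scores, dtype=float)

        # Lowest threshold T at which the fraction of impostor comparisons
        # scoring at or above T (the FMR) does not exceed the target.
        candidates = np.unique(impostor)
        fmrs = np.array([(impostor >= t).mean() for t in candidates])
        feasible = candidates[fmrs <= target_fmr]
        if feasible.size == 0:
            raise ValueError("Target FMR is not reachable with these scores.")
        threshold = feasible.min()

        fnmr = (genuine < threshold).mean()   # genuine comparisons falling below T
        return fnmr, threshold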

How to Enter

[STEP 1] Participants read the FRPC rules:  Participants read the rules in full.  The FRPC rules document can be found here.

[STEP 2] Participants send documentation to NIST:  As specified in the FRPC Rules, participants must send a completed and signed Participation Agreement to NIST.  The participation agreement form is available at the NIST FRPC Support Website.

[STEP 3] Participants send software to NIST:  As specified in the FRPC Rules, participants must send algorithms to the National Institute of Standards and Technology (NIST) as a compiled library.  Software is to be compatible with the NIST Concept of Operations and API Specification.  Source code and other intellectual property must not be submitted to NIST.

[STEP 4] Software is run on sequestered images:  NIST runs the algorithm on images that are not made available to developers.  This creates a repeatable and fair test and impedes gaming strategies.  NIST will link submitted libraries to its test harness, which is used in three steps:

  • Validation: NIST confirms it can reproduce participant-provided outputs on a small common set of images, provided by NIST.
  • Timing: NIST confirms that the implementation meets limits on computation duration.
  • Evaluation: NIST runs the algorithm on the test images sequestered at NIST.

[STEP 5] NIST computes performance:  Performance refers to accuracy and speed, and their dependence on quantities such as enrolled population size, image properties, and subject demographics.

[STEP 6] NIST delivers performance report to IARPA and judging commences.

Prizes
  • Search Accuracy Prize (i.e., Challenge IDENT Primary Prize): $25,000, awarded to the most accurate search algorithm.
  • Search Speed Prize (i.e., Challenge IDENT Secondary Prize): $5,000, awarded to the fastest search algorithm.
  • Verification Prize (i.e., Challenge VERIF Prize): $20,000, awarded to the most accurate verification algorithm.

Rules

Read the full rules and challenge eligibility document for the Face Recognition Prize Challenge (FRPC) by downloading the PDF below:

FRPC Rules

The full NIST Concept of Operations and API Specifications can be found here.
