Differential Privacy Temporal Map Challenge

Help public safety agencies share data while protecting privacy.

Department of Commerce - National Institute of Standards and Technology, Public Safety Communications Research (PSCR) Division

Total Cash Prizes Offered: Up to $276,000
Type of Challenge: Ideas, analytics, visualizations, algorithms
Partner Agencies | Non-federal: DrivenData, HeroX, Knexus Research
Submission Start: 10/01/2020 08:00 AM ET
Submission End: 05/17/2021 07:59 PM ET


Challenge Launch planned for October 1, 2020!

The Public Safety Communications Research Division (PSCR) of the National Institute of Standards and Technology (NIST) invites members of the public to join the Differential Privacy Temporal Map Challenge. This challenge seeks new tools with which to push the boundaries of current technologies for de-identifying data sets relevant to public safety.

The Differential Privacy Temporal Map Challenge is a series of contests, with prize awards of up to $276,000 for excellence in formal data privacy. The contests include a metrics development contest in the form of a white paper, a series of algorithm sprints exploring new methods in differential privacy, and a contest designed to improve the usability of participants’ source code, open to participants in the algorithm sprints. There are no fees to enter any stage of the Challenge. Teams can participate in either or both of the metric and algorithm contests.

You can make a difference!  Continue reading to learn about the challenge contests and other details.  


Large data sets containing personally identifiable information (PII) are exceptionally valuable resources for research and policy analysis in a host of fields such as emergency planning and epidemiology. This project seeks to engage the public in developing algorithms to de-identify data sets containing PII such that the data sets remain valuable tools yet cannot compromise the privacy of individuals whose information is contained within the data set. 

Previous NIST PSCR differential privacy projects (NIST Differential Privacy Synthetic Data Challenge and The Unlinkable Data Challenge: Advancing Methods in Differential Privacy) demonstrated that crowdsourced challenges can make meaningful advancements in this difficult and complex field. Those previous contests raised awareness of the problem, brought in innovators from outside the privacy community, and demonstrated the value of head-to-head competitions for driving progress in data privacy.  This Differential Privacy Temporal Map Challenge hopes to build on these results by extending the reach and utility of differential privacy algorithms to new data types.

Temporal map data is of particular interest to the public safety community. Yet the ability to track a person’s location over time raises especially serious privacy concerns. The Differential Privacy Temporal Map Challenge invites participants to develop algorithms that preserve data utility while guaranteeing privacy.

Differential privacy is a formal standard that protects privacy regardless of what third-party data is available. It does so by strictly limiting what can be learned about any individual in the data set.

Project Objectives

The purpose of this Challenge is to advance differential privacy technologies by building algorithms that de-identify data sets containing temporal and spatial information with provable formal privacy, and by measuring the accuracy of those algorithms. Public safety agencies collect extensive data containing time, geographic, and potentially personally identifiable information. These data can be an invaluable tool for policy makers, researchers, and the public in general. However, tools do not yet exist that de-identify these data sets and preserve their utility while guaranteeing that the records cannot be re-linked to individuals. Thus, PSCR is inviting the public to explore new computational methods to de-identify these data sets and to assess the quality of the output data. Key features and capabilities sought in de-identification algorithms for this challenge include:

  • Producing output data that satisfies formal differential privacy.
  • Preserving the characteristics of the original data sets as much as possible, in particular sequential and geographic characteristics.
  • Processing a wide variety of temporal and spatial data robustly.

NIST PSCR also invites participants to develop metrics that best assess the accuracy of the data output by the algorithms that de-identify temporal map data. In particular, methods are sought that:

  • Measure the quality of data with respect to temporal or spatial accuracy, or both. 
  • Evaluate data quality in situations beyond this challenge.
  • Are straightforward to implement.
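As a simple illustration of what such a metric might look like, the sketch below scores a privatized data set against ground truth by the mean absolute error over (time, location) count cells. This is purely an assumed example of the kind of temporal/spatial accuracy measure described above, not the challenge’s actual scoring function.

```python
import numpy as np

def temporal_l1_error(true_counts, private_counts):
    """Mean absolute error between ground-truth and privatized counts.

    Both inputs are arrays of event counts indexed by (time window,
    map segment); a lower score means the privatized data better
    preserves the temporal-spatial structure of the original.
    """
    true_counts = np.asarray(true_counts, dtype=float)
    private_counts = np.asarray(private_counts, dtype=float)
    if true_counts.shape != private_counts.shape:
        raise ValueError("count grids must have the same shape")
    return float(np.abs(true_counts - private_counts).mean())

# Example: 2 time windows x 2 map segments; the privatized data
# differs from ground truth in one cell by one count.
score = temporal_l1_error([[1, 2], [3, 4]], [[1, 2], [3, 5]])
```

Real metrics would likely also weight cells by population, handle sparsity, and evaluate higher-order statistics, which is precisely the design space the Metric Paper Contest asks participants to explore.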

NIST PSCR seeks to incentivize participants to develop their algorithms into tools useful to public safety agencies and of benefit to public safety in general. To that end, NIST PSCR will consider solutions that are made open and accessible to the public for an Open Source Award. In the interest of refining and improving the usability of the algorithms, NIST PSCR will invite algorithm participants to create a Development Plan describing how they will improve their code and make it more usable. Finally, Development Execution Awards will be considered for participants who execute their Development Plans.

Summary of Important Dates

Timeline for Metric Paper Contest

Timeline for Algorithm Contest

Timeline for Open Source and Development Contest


Prize Breakdown


Please review the Official Challenge Rules document.

Judging Criteria

A. Metric Contest. The Judges will evaluate submitted papers describing proposed metrics based on a balance of clarity, utility, and robustness.

B. Algorithm Contest. Submissions will be assessed based on (a) their ability to prove they satisfy differential privacy, and (b) the accuracy of the output data as compared with ground truth, assessed by a scoring function that will be released at the opening of each sprint.

  • Progressive Prizes will also be awarded part-way through each sprint to the best performing algorithms according to the scoring function, with precedence given to algorithms that are pre-screened as satisfying differential privacy.

C. Open Source and Development Contest. Only solutions that are validated by the Judges as satisfying differential privacy in one or more of the sprints are eligible to compete in this portion of the Challenge.

  • Open Source Prizes are awarded to the top algorithms that are released to an open source repository.
  • Development Plan Prizes are made based on the quality of the Development Plan submitted for the participants’ highest scoring algorithm.
  • Development Execution Prizes are based on how well teams execute their improvements, as described in their Development Plan. 

How to Enter

For the Algorithm Contest, visit the Challenge on the DrivenData website to view details.

For the Metric Paper Contest, visit the Challenge on the HeroX website to view details.

For the Open Source and Development Contests, see the Official Rules for details on entering.

Point of Contact

Have feedback or questions about this challenge? Send the challenge manager an email.