Differential Privacy Temporal Map Challenge: Sprint 1 (Prescreened Arena)
CALLING PRESCREENED PARTICIPANTS! Help public safety agencies share data while protecting privacy. If you haven't been prescreened yet, head on over to the Open Arena to learn more and get started. Competition hosted by NIST PSCR. #privacy
1st Place Team: Minutemen
The goal of this challenge is to develop algorithms that preserve data utility as much as possible while guaranteeing individual privacy is protected. The challenge features a series of coding sprints to apply differential privacy methods to temporal map data, where each record is tied to a location and each individual may contribute to a sequence of events.
Large data sets containing personally identifiable information (PII) are exceptionally valuable resources for research and policy analysis in a host of fields supporting America's First Responders such as emergency planning and epidemiology.
Temporal map data is of particular interest to the public safety community in applications such as optimizing response time and personnel placement, natural disaster response, epidemic tracking, demographic analysis, and civic planning. Yet the ability to track a person's location over a period of time presents particularly serious privacy concerns.
Sprint 1 featured data on 911 calls in Baltimore made over the course of a year. Participants needed to build de-identification algorithms for generating privatized datasets that reported monthly incident counts for each type of incident by neighborhood.
The temporal sequence aspect of the problem is especially challenging because it allows one individual to contribute to many events (up to 20). This increases the sensitivity of the problem and the amount of added noise needed.
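To make the sensitivity point concrete: under the Laplace mechanism, the noise scale is sensitivity divided by epsilon, so letting each person touch up to 20 events multiplies the expected noise by 20 at the same privacy level. A minimal illustrative sketch (the function and variable names here are ours, not from any submission):

```python
import numpy as np

def laplace_count(true_count, sensitivity, epsilon, rng):
    """Release a count via the Laplace mechanism.

    The noise scale is sensitivity / epsilon, so higher sensitivity
    (more events per person) means proportionally more noise.
    """
    return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(0)
# If each person contributes at most 1 event, sensitivity is 1;
# if each person can contribute up to 20 events, sensitivity is 20,
# so the expected noise magnitude grows 20-fold at the same epsilon.
low_noise = laplace_count(100, sensitivity=1, epsilon=1.0, rng=rng)
high_noise = laplace_count(100, sensitivity=20, epsilon=1.0, rng=rng)
```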
Many techniques from DP literature are not designed to handle high sensitivity. To overcome this, winning competitors tried different creative approaches:
- Subsampling: use at most k records from each person, reducing the sensitivity to k.
- Preprocessing: shrink the data space by eliminating infrequent codes.
- Post-processing: Clean up noisy data by applying various optimizing, smoothing and denoising strategies (several clever approaches were used, see solution descriptions in the post below).
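As a hypothetical sketch of how these pieces can fit together (not taken from any winning solution; `records`, `cells`, and the parameter names are illustrative), the following caps each person's contribution at k records, releases Laplace-noised cell counts at sensitivity k, and then post-processes by rounding and clipping negatives, which is free under differential privacy:

```python
import numpy as np
from collections import Counter, defaultdict

def subsample(records, k, rng):
    """Keep at most k records per person, bounding each person's
    contribution (and hence the query sensitivity) to k."""
    by_person = defaultdict(list)
    for person_id, cell in records:
        by_person[person_id].append(cell)
    kept = []
    for cells in by_person.values():
        if len(cells) > k:
            cells = list(rng.choice(cells, size=k, replace=False))
        kept.extend(cells)
    return kept

def privatize_counts(records, cells, k, epsilon, rng):
    """Noisy histogram of incident counts per map/time cell."""
    counts = Counter(subsample(records, k, rng))
    noisy = {c: counts[c] + rng.laplace(scale=k / epsilon) for c in cells}
    # Post-processing (free under DP): round and clip negative counts.
    return {c: max(0, round(v)) for c, v in noisy.items()}

rng = np.random.default_rng(0)
# (person_id, cell) pairs; a cell might be a (neighborhood, month, type) key.
records = [(1, "A"), (1, "A"), (1, "B"), (2, "A"), (3, "B")]
out = privatize_counts(records, cells=["A", "B"], k=2, epsilon=1.0, rng=rng)
```

The design choice here mirrors the bullets above: subsampling trades away some signal from heavy contributors in exchange for a much smaller noise scale, and post-processing restores basic plausibility (non-negative integer counts) without spending any privacy budget.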
These solutions were evaluated using a "Pie Chart Evaluation Metric", designed to measure how faithfully each privatization algorithm preserves the most significant patterns in the data within each map/time segment. The first place winner combined several techniques and tailored their algorithm to the level of privacy required to ultimately achieve the greatest utility score from the privatized data.
Differential Privacy Temporal Map Challenge: Sprint 1 (Prescreened Arena): Rules and Terms of Data Use
The document below is a copy of the rules for the Differential Privacy Temporal Map Challenge provided for ease of reference. The definitive, official rules can be found at http://www.challenge.gov/challenge/differential-privacy-temporal-map-challenge/.
1. Official Rules
This document outlines the official rules for the Differential Privacy Temporal Map Challenge. Nothing within this document or in any supporting documents shall be construed as obligating the Department of Commerce, National Institute of Standards and Technology (NIST) or any other Federal agency or instrumentality to any expenditure of appropriated funds, or any obligation or expenditure of funds in excess of or in advance of available appropriations.
1.1. Summary of Challenge
The following is a summary of each contest. For more information, please review the full terms and conditions for each contest as provided throughout this document.
Metric Paper Contest
The Metric Paper Contest invites all eligible participants to submit a concept paper detailing metrics with which to assess the quality of outputs from algorithms that de-identify data sets containing temporal map data.
Public safety use-cases of temporal map data include emergency planning, epidemiologic analysis, and policy setting. High-quality data is required to perform sound analyses in these areas. Both time and space segments may be sparsely populated yet critically important. Further, these sparsely populated segments carry inherently greater risk of linkage attack. Although differential privacy has a formal guarantee against linkage attack, there is no guarantee of accuracy in output data. In this contest, NIST Public Safety Communications Research Division (PSCR) seeks novel metrics by which to assess the quality of differentially private algorithms on temporal map data and thus to better evaluate data outputs.
Submitted concept papers should provide robust metrics that can be applied to a wide variety of data sets involving temporal and spatial data. Participants are encouraged to provide examples of how their proposed metrics will improve use case outcomes.
The purpose of this contest is to advance differential privacy technologies by building algorithms that de-identify data sets containing time and spatial information with provable formal privacy. Public safety agencies collect extensive data containing time, geographic, and potentially personally identifiable information. These data sets can be an invaluable tool for policy makers, researchers, and the public in general. However, the tools do not yet exist to de-identify these data sets and preserve their utility while guaranteeing the records cannot be used to re-link to individuals. Thus, NIST PSCR is inviting the public to explore new computational methods to de-identify these data sets. Some of the key features and capabilities for de-identification algorithms sought in this challenge include:
- Output data that satisfies formal differential privacy.
- Preservation of the characteristics of original data sets as much as possible and, in particular, preservation of sequential data characteristics and geographic characteristics.
- Robust ability to process a wide variety of temporal and spatial data.
The Algorithm Contest will occur in three ‘sprints’, where a sprint is a timeframe in which software is developed, and participants compete for prizes. Submissions will be assessed based on a) their ability to prove they satisfy differential privacy, and b) the accuracy of output data as compared with ground truth as assessed by a scoring function that will be released at the opening of each sprint. Progressive Prizes will also be awarded part-way through each sprint to the best performing algorithms according to the scoring function, with precedence given to algorithms that are pre-screened as satisfying differential privacy.
Open Source and Development Contests
Open Source Contest
In addition to expanding the types of data that can be made differentially private, this Challenge seeks to have participants disseminate the software developed during this contest to the public. To that end, participants are incentivized to release their solution in an open source repository. Top participants in the algorithm sprints may opt-in to compete in this contest. Participation in the Open Source Contest is not mandatory to participate in the other Contests. Participants in the Open Source Contest will be asked to increase the public knowledge base of differential privacy and public safety research by contributing to or developing in an open source repository. Algorithms, supporting documentation, software development plans, and differentially private data set contributions that can support public safety research and development will be reviewed and qualitatively evaluated by a panel of differential privacy Subject Matter Experts. Final selection will be made by NIST-appointed judges. Participants whose solutions are ‘Validated as Differentially Private’ and are also developed in a manner allowing deposit in an open source repository will be eligible to receive a $4,000 Open Source Prize.
Development Plan and Development Execution Contests
Teams whose solutions are ‘Validated as Differentially Private’ during the Algorithm Contest are also invited to submit a software Development Plan describing how they will improve the quality and robustness of their code, improve documentation, and demonstrate the application of their solution to public safety data sets. A winning Development Plan will receive a prize of $1,000. Winners of the Development Plan Contest are invited to participate in the Development Execution Contest and will have 90 days to execute their plan. Execution of the plan will be assessed after 90 days, and teams will be eligible to receive up to $5,000 as a Development Execution Prize.
1.2. Prizes and Awards
Metric Paper Prizes (prize purse of $29,000)
Technical Merit Prize (up to $25,000 total)
Winners are selected by the Judges, based on evaluation of submissions against the Judging Criteria. Up to $25,000 will be awarded. Submissions that have similar scores may be given the same prize award with up to 10 winners total.
1st prize: Up to 2 winners of $5,000 each
2nd prize: Up to 2 winners of $3,000 each
3rd prize: Up to 3 winners of $2,000 each
4th prize: Up to 3 winners of $1,000 each
People’s Choice Prize (up to $4,000 total)
Winners are selected by public voting on submitted metrics that have been pre-vetted by NIST PSCR for compliance with minimum performance criteria. Up to a total of $4,000 will be awarded to up to four winners.
People’s Choice: 4 @ $1,000
Algorithm Prizes (prize purse of $147,000)
Sprint 1 (up to $29,000 total)
1st Place: $10,000
2nd Place: $7,000
3rd Place: $5,000
4th Place: $2,000
5th Place: $1,000
Progressive Prizes: 4 @ $1,000
Sprint 2 (up to $39,000 total)
1st Place: $15,000
2nd Place: $10,000
3rd Place: $5,000
4th Place: $3,000
5th Place: $2,000
Progressive Prizes: 4 @ $1,000
Sprint 3 (up to $79,000 total)
1st Place: $25,000
2nd Place: $20,000
3rd Place: $15,000
4th Place: $10,000
5th Place: $5,000
Progressive Prizes: 4 @ $1,000
Open Source and Development Prizes (prize purse of $100,000)
Open Source Prize (up to $40,000 total)
Open Source: up to 10 @ $4,000
Development Plan Prize (up to $10,000 total)
Development Plan: up to 10 @ $1,000
Development Execution Prize (up to $50,000 total)
Up to 10 Development Execution Prizes will be awarded based on performance ranking
Total Prize Purse for Differential Privacy Temporal Map Challenge: $276,000
1.3. Summary of Important Dates
Timeline for Metric Paper Contest
| Milestone | Date |
| --- | --- |
| Preregistration | August 24, 2020 |
| Open to submissions | October 1, 2020 - January 5, 2021 |
| NIST PSCR Compliance check (for public voting) | January 5-6, 2021 |
| Public voting | January 7-21, 2021 |
| Judging and Evaluation | January 5 - February 2, 2021 |
| Winners Announced | February 4, 2021 |
Timeline for Algorithm Contest
| Milestone | Date |
| --- | --- |
| Preregistration | August 24, 2020 |
| Sprint #1 - Participation | October 1 - November 15, 2020 |
| Sprint #1 - Evaluation | November 15 - December 11, 2020 |
| Sprint #1 - Winners announced | January 5, 2021 |
| Sprint #2 - Participation | January 6 - February 22, 2021 |
| Sprint #2 - Evaluation | February 22 - March 22, 2021 |
| Sprint #2 - Winners announced | March 23, 2021 |
| Sprint #3 - Participation | March 29 - May 17, 2021 |
| Sprint #3 - Evaluation | May 17 - June 15, 2021 |
| Sprint #3 - Winners announced | June 16, 2021 |
Timeline for Open Source and Development Contest
| Milestone | Date |
| --- | --- |
| Open Source Deposit - Submissions due | June 30, 2021 |
| Development Plan - Submissions due | June 30, 2021 |
| Development Plan - Evaluation | July 1-10, 2021 |
| Development Plan - Winners announced | July 14, 2021 |
| Development Execution - Submissions due | October 9, 2021 |
| Development Execution - Evaluation | October 9 - 23, 2021 |
| Development Execution - Winners announced | October 27, 2021 |
NOTE: NIST reserves the right to revise the dates at any time.
1.4. Rules for the Metric Paper Contest
NIST PSCR is interested in creative and novel approaches to evaluating the outputs from differential privacy algorithms, especially those involving temporal map data. The area of data privatization is growing rapidly, as is our understanding of the quality of privatized data. This Metric Paper Contest is implemented by HeroX (https://www.herox.com/bettermeterstick) on behalf of NIST PSCR.
NIST PSCR will provide all participants with four data sets:
- A ground truth data set,
- A privatized data set of poor quality,
- A privatized data set of moderate quality, and
- A supplementary data set with demographic characteristics of map segments.
As you propose your evaluation metrics, be prepared to explain their relevance and how they would be used. These metrics may be your original content, based on existing work, or any combination thereof. If your proposed metrics are based on existing work or techniques, please provide citations. Participants will be required to submit both a broad overview of proposed approaches and specific details about the metric definition and usage. Additionally, NIST PSCR is interested in how easily an approach can accommodate large data sets (scalability) and how well it can translate to different use cases (generalizability).
Metric Paper Contest Requirements
Metrics may be oriented towards map data, temporal sequence data, or combined temporal map data. Successful submissions to the Metric Paper Contest will include:
- Submission Title,
- A brief description of the proposed metric (note that this will be included with the title when identifying the metric during the public voting stage),
- An introduction of the participant or submitting team that includes a brief background with expertise, and optional explanation of the participant’s interest in the problem,
- Executive Summary
A summary of one to two pages of main ideas.
- A high-level explanation of the proposed metric, reasoning and rationale for the metric
- An example use case
- Metric Definition
A written definition of the metric, including English explanation and pseudocode that has been clearly written and annotated with comments, including:
- Any technical background information needed to understand the metric.
- Explanation of parameters and configurations.
- Walk-through examples of metric use.
- (optional) Computer code implementing the metric or portions of it.
- Metric Defense
A description of the rationale behind the metric, including:
- A description of the metric’s tuning properties that control the focus, breadth, and rigor of evaluation.
- A description of the discriminative power of the proposed metric: how well it identifies points of disparity between the ground truth and privatized data.
- A description of the coverage properties of the proposed metric: how well it abstracts/covers a breadth of uses for the data.
- A discussion of the feasibility of implementing the proposed metric.
- Examples of very different data applications where the metric can be used.
Evaluation Criteria and Judging for Technical Merit Prize
Submissions to the Metric Paper Contest will undergo initial filtering to ensure they meet minimum criteria before they are reviewed and evaluated by members of the expert judging panel. These minimum criteria include:
- Participant meets eligibility requirements,
- All required sections of the submission form are completed,
- Proposed metric is coherently presented and plausible.
Submissions that have passed the initial filtering step will be reviewed for Technical Merit by members of the expert judging panel, evaluated against the evaluation criteria listed below, and scored based on the relative weightings shown.
- Clarity (30/100 points)
- Metric explanation is clear and well written, defines jargon and does not assume any specific area of expertise. Pseudocode is clearly defined and easily understood.
- Participant clearly addresses whether the proposed metric provides snapshot evaluation (quickly computable summary score) and/or deep dive evaluation (reports locating significant points of disparity between the real and synthetic data distributions) and explains how to apply it.
- The participant responds to all submission requirements thoroughly, including clear guidance on metric limitations.
- Utility (40/100 points)
- The metric effectively distinguishes between real and synthetic data.
- The metric represents a breadth of use cases for the data.
- Motivating examples are clearly explained and fit the abstract problem definition.
- Metric is innovative, unique, and likely to lead to further improvements compared with other proposed metrics.
- Robustness (30/100 points)
- Metric is feasible to use for large volume use cases.
- The metric has flexible parameters that control the focus, breadth, and rigor of evaluation.
- The proposed metric handles the provided classes and types of data well and is relevant in many different data applications that fit the abstract problem definition.
Reviewers’ scores will be aggregated for each submission. The specific scores will not be released publicly or provided to the submitting participants. The submissions will be ranked based on their score and receive prize awards of 1st Prize (up to two), 2nd Prize (up to two), 3rd Prize (up to three), or 4th Prize (up to three). Submissions that have similar scores may be given the same prize award with up to 10 winners total. NIST PSCR may choose not to use one or more levels of prizes depending on the quality and number of submissions.
Evaluation Criteria and Judging for People’s Choice Prize
All submissions to the Metric Paper Contest will be considered for a People’s Choice Prize through public voting. Submissions that pass initial filtering will proceed to a public voting stage and will be made viewable to the public at the start of the public voting period. Voting will occur on the HeroX platform (https://www.herox.com/bettermeterstick) where submissions will be available for review. Each registered user on the HeroX platform will be able to make a single vote. Registered users will be asked to vote for the solution they find has the highest likelihood of being useful. Winners will be determined based on the number of votes received by each submission during the voting period.
Participants may win both a Technical Merit prize and a People’s Choice prize. Any participant determined to be using unfair methods to solicit votes will be automatically disqualified from the challenge. Unfair vote getting practices include, but are not limited to, vote buying and automated vote generation (e.g., bots). Up to 4 People’s Choice Prizes of $1,000 each will be awarded.
1.5. Rules of Algorithm Contest
Temporal map data is of particular interest to the public safety community. Yet the ability to track a person’s location over a period of time presents particularly serious privacy concerns. The Differential Privacy Temporal Map Algorithm Contest invites participants to develop algorithms that preserve data utility while guaranteeing privacy.
Algorithm Contest Participation
The Algorithm Contest submission form and specific dates are located on the DrivenData Competition Website (https://deid.drivendata.org/).
If you meet the eligibility requirements and would like to participate, then you must first complete the registration process through the Competition Website within the Competition Period. After you complete the registration process, you will receive access to the available Competition Data set(s) (each a “Data set”) (described on the Competition Website) that will enable you to develop and submit one or more Submissions. All Submissions must be received during the Competition Period. To register, visit the Competition Website and follow the onscreen instructions to complete and submit your registration. All of the registration information that you provide is collectively referred to as your "Account". (If you have already created an Account at https://www.drivendata.org/, enter your username and password and follow the on-screen instructions).
After you register individually, you may join a group of individuals with which to collaborate (each group, a "Team"), but you may register only once for the contest either as an Individual or as part of a Team. You may only compete using a single, unique DrivenData account. Competing using more than one DrivenData account per individual is a breach of these rules and DrivenData and/or NIST PSCR reserve the right to disqualify any individual (or Team including an individual) who is found to breach these rules. In any given sprint, an individual participant cannot join more than one Team and a participant who is part of a Team cannot also enter the contest on an individual basis.
Algorithm Contest Requirements
This contest will be divided into a sequence of three sprints. Each sprint will introduce new constraints to the scoring function described later.
Each sprint will include a Development Phase followed by Final Scoring.
A. Development Phase Process
At the start of each sprint, registered participants will receive access to a training ground truth data set and data dictionary. For purposes of privacy, this data set is considered previously released, publicly available data that can be used for algorithm design and testing without impact to the privacy loss budget. These data will have the same feature schema (but not necessarily the same distribution) as the testing data sets used for final scoring.
For each sprint, the values defined for delta and epsilon will vary, and participants will submit their synthetic data sets at specified values of epsilon. A similarity scoring metric will be used to score the participant solutions with different epsilon values as publicized on the Competition Website for each sprint. Participants with algorithms that produce high-quality privatized synthetic data across the different values of epsilon and that better preserve spatial and time sequential data will receive higher scores.
Differential Privacy Pre-screen Process: All participants will be required to submit a complete written explanation of their algorithm, any additional data sources used other than the provided data set(s), and a clear, correct mathematical proof that their solution satisfies differential privacy. This document will be reviewed and validated by NIST staff or their delegates. Participants whose written explanation demonstrates an essentially correct understanding of differential privacy as applied to their submission will receive “Pre-screened” status; otherwise, they will receive a brief explanation of why their algorithm or proof is incorrect. Participants who have had their approach Pre-screened will be indicated on the leaderboard.
- Code must be provided as often as requested by NIST staff or their delegates, who will review it for unintentional mistakes. This is a courtesy to ensure the participant still qualifies for prize eligibility.
- Pre-screen status will be used in determining the leaderboard position for entries. Pre-screened participants will also have access to a test harness where they can submit containerized code to test the functioning of their code submission on the data schema and receive a score. This will be the same test harness that will be used to run submissions during the Final Scoring Process.
- Only Pre-screened approaches are eligible for Final Scoring.
B. Final Scoring Process
Final scoring will be performed on data set(s) that have not been previously shared with the challenge participants.
Before awarding cash prizes to participants who place in prize-winning ranks at the end of each algorithm sprint, these participants must submit a report containing their complete algorithm description and mathematical privacy proof, along with their source code, and a code guide readme file which gives the mapping between the steps of the algorithm write-up and the analogous steps in the source code. They must also submit their code to the containerized test harness on DrivenData to ensure that the code runs according to provided specifications, so that it can be used in the Final Scoring Process. Participants will have 7 days from notification to provide this report and final code submission.
The report will be subject to a final validation review to verify that their algorithm correctly satisfies differential privacy at the stated values of delta and epsilon, and that the code implements the stated algorithm. The review will be performed by NIST staff or their delegates, which may include NIST certified Subject Matter Experts from outside NIST. NIST staff or their delegates may contact participants for clarifications. Algorithms that pass this verification process will be ‘Validated as Differentially Private’.
Once Validated as Differentially Private, the algorithm will be executed on a sequestered data set with the same schema as the Development Phase data. The similarity scoring metric released at the beginning of the sprint will be used to score the participant solutions with different epsilon values as publicized on the Competition Website for each sprint.
If a participant is placed in a prize winning rank but fails to provide the report, or the review determines that their solution does not satisfy differential privacy, then they will not receive a prize, and it will be awarded to the participant with the next best performance who successfully completes the final review. A tie between two or more valid and identically ranked submissions will be resolved in favor of the tied submission that was submitted first.
C. Additional Requirements
Each submission must be uploaded to the Competition Website during the Competition Period. Initial submissions will consist of data sets processed through de-identification algorithms, which will be evaluated according to the scoring function shared on the Competition Website. Participants can also submit materials for pre-screening as described in the Development Phase Process above. Once participants are Pre-screened, they will also be able to submit containerized code to the test harness for running and evaluating on the provided data set. Final submissions will include reports and containerized code as described in the Final Scoring Process above. The number of submissions a participant may submit during each calendar day of the Competition Period will be displayed on the Competition Website.
In order to be eligible to win a prize, participants must submit all source code used in their solution for review. If the solution includes licensed software (e.g. open source software), participants must include the full license agreements with the submission. Include licenses in a folder labeled “Licenses”. Within the same folder, include a text file labeled “README” that explains the purpose of each licensed software package as it is used in your solution. DrivenData/NIST are not required to contact submitters for additional instructions if the code does not run. If we are unable to run your solution due to license problems, including any requirement to download a license, your submission might be rejected. Be sure to contact us right away if you have concerns about this requirement.
Participating using more than one DrivenData account is deemed cheating and, if discovered, will result in disqualification from the Competition and any other affected Competitions and may result in banning or deactivation of affected DrivenData accounts. DrivenData reserves the right to request information associated with our investigation of suspected cheating. Failure to respond to these requests (including failure to furnish the requested information) within 10 days is grounds for disqualification.
During a sprint, participants are prohibited from privately sharing source or executable code developed in connection with or based upon the Data, and any such sharing is a breach of these Competition Rules and may result in disqualification.
External data sets and pre-trained models are allowed for use in the competition provided the following are satisfied:
- the external data and pre-trained models are freely and publicly available to all participants under a permissive open source license;
- the data does not have the same schema or source as the data set(s) provided in the contest; and
- their source and usage are defined in the algorithm description, and they have been approved during the differential privacy Pre-screening Process.
Algorithm Contest Evaluation Criteria and Judging
NIST staff or their delegates will review entries based on the Pre-screening Process and Final Scoring Process described above. A submission that fails to meet the compliance criteria will be ineligible to win prizes in this contest.
The scoring function used for scoring and ranking submissions will be released at the beginning of each sprint and be displayed on the Competition Website. Submissions that pass the final validation and compliance review will be given final scores based on the scoring function. Final Scoring will occur during a sequestered evaluation where the submitter’s code will be run against withheld data and similarity metrics will be computed between analytics results on the synthetic privatized set and the original data set. Awards will be based on Final Scoring.
Winners who are found to be ineligible for cash prizes may still be publicly recognized based on their placement on the leaderboard. The prize award, normally allotted to the placement of the ineligible team, will be given to the next ‘Validated’ team on the leaderboard. Throughout the challenge, the challenge’s online leaderboard will display rankings and accomplishments, giving participants various opportunities to have their work viewed and appreciated by stakeholders from industry, government and academic communities.
Challenge leaders will have the opportunity to win progressive prizes approximately halfway through each sprint, prior to the final round. Progressive prizes are awarded based on the participant’s placement on the provisional leaderboard at a fixed point in time and their provisional testing score (not the sequestered results). These prizes are awarded to the top four participants on the leaderboard; four progressive prizes of $1,000 each are available for each algorithm sprint.
1.6. Rules of the Open Source and Development Contests
Open Source Contest Rules
The Open Source Contest is open to Participants in the Algorithm Contest. The 10 teams scoring highest on the leaderboard at the end of the third sprint in the Algorithm Contest, that have been ‘Validated as Differentially Private’, and have developed their submission in a manner allowing deposit in an open source repository, are eligible for an Open Source Prize. In the event there are more than 10 eligible submissions, prizes will be awarded based on the leaderboard score. If there are not enough qualifying submissions from the third sprint, submissions from the second sprint and first sprint will be considered for an award. Within 20 days of the end of the third sprint, NIST PSCR will notify all eligible first and second sprint participants if they are invited to submit to the Open Source Contest. Only participants deemed eligible for cash prizes will be considered for an Open Source award.
Up to ten $4,000 Open Source prizes will be made to eligible teams who demonstrate they have deposited their solutions to an open source repository within 14 days of award announcements for the Algorithm Contest’s third sprint. Any participants invited from the first and second sprint to participate will have 14 days from their invitation to demonstrate their deposit. To participate, send an email to firstname.lastname@example.org with the following information:
- “DeID2 Open Source Submission” as the subject line.
- The name of your submission in the Algorithm Contest.
- A URL to an open source repository, which contains:
- Full solution source code,
- Complete documentation for the code,
- A complete written explanation of the algorithm,
- Any additional data sources used other than the provided data set(s),
- A correct mathematical proof that the solution satisfies differential privacy,
- An open source license.
Final awards and cash prizes will be subject to verification that the submission has been posted in full.
Development Plan Contest Rules
Teams who have been ‘Validated as Differentially Private’ in any sprint of the Algorithm Contest are eligible to participate in the Development Plan Contest. Teams are given the opportunity to create a plan for further developing their software to increase its utility and usability. Participants must also describe how their code will be used to benefit public safety once development is complete.
- Participants will submit a Development Plan of no more than 4 pages (as a PDF document) describing goals that are specific, measurable, attainable, relevant, and accomplishable within 90 days. The plan should contain milestones, with dates, tracing progress toward the goals the team intends to achieve within 90 days of application.
- Plans must be submitted within 14 days of final algorithm awards.
- The Development Plan must include the details of the software engineering standards that the team plans to achieve with citations to the relevant standards. The NIST team will propose a list of desired standards.
- The Development Plan must include details on how their solution will benefit the public safety community after development. For example, will it be developed into a commercial product? Will it be deposited in an open source repository?
- Plans must identify novel public safety data sets the team plans to de-identify using its algorithm and include with its documentation. Alternatively, teams can engage public safety agencies with whom they will work to select data to include in their final documentation.
Development Plan Evaluation and Judging
Criterion 1: Scope (20 / 100 points):
- Appropriateness of proposed goals to code and documentation
- Likelihood of accomplishing proposed goals within 90 days
Criterion 2: Standards (20 / 100 points):
- Identification of appropriate standards to achieve
- Plan for implementing identified standards
Criterion 3: Utility (30 / 100 points):
- Extent that proposed goals will increase code usability and robustness
- Likelihood that proposed goals will increase code usability and robustness
Criterion 4: Public safety benefit (30 / 100 points):
- The value of the identified public data set to be de-identified
- The likelihood of the project benefiting the public safety community
Development Plan Prizes and Awards
Submissions will be ranked according to their score, and up to 10 of the top-scoring submissions will receive development prizes of $1,000 each.
Development Execution Contest Rules
Participants who receive a Development Plan Award are eligible to participate in the Development Execution Contest. In order to be considered, participants must submit for review:
- Full source code and documentation for their software.
- A complete written explanation of their algorithm, and a clear, correct mathematical proof that their solution satisfies differential privacy.
- A code guide readme file which gives the mapping between the steps of the algorithm write-up and the analogous steps in the source code.
- A set of de-identified public safety data identified in the plan.
- A 3-page or less explanation of how their submission has satisfied their development plan.
- A statement of 1-page or less of how their solution will benefit public safety.
Development Execution Evaluation and Judging
NIST staff or their delegates will review submission materials and verify that the solution still satisfies differential privacy. Only solutions validated as satisfying differential privacy will be awarded prizes. Based on the weight of the entire submission, the Judges will award one of the following performance rankings:
- ‘Excellent’: The participant has met all or nearly all goals stated in their development plan and has exceeded those goals in a way that has significant positive impact on their code’s usability. The solution is markedly more usable and better documented, and meets specified standards. The solution has a high potential to benefit public safety.
- ‘Good’: The participant has met all or nearly all goals stated in their development plan. The solution is more usable and better documented, and approaches specified standards. The solution is likely to benefit public safety.
- ‘Satisfactory’: The participant has made significant progress on their development plan and is working toward specified standards. Usability and documentation are improved. The solution may benefit public safety.
- ‘Fair’: The participant has made some progress on their stated goals and has worked toward specified standards. Usability or documentation is improved. The solution may benefit public safety.
- ‘Incomplete’: The participant has not made significant progress toward meeting their goals or their solution no longer satisfies differential privacy.
Development Execution Contest Prizes and Awards
A performance ranking of ‘Excellent’ will receive $5,000, a rank of ‘Good’ will receive $4,000, a rank of ‘Satisfactory’ will receive $2,000, and a rank of ‘Fair’ will receive $500. Teams receiving a rank of ‘Incomplete’ will not receive a Development Execution prize. A maximum of 10 prizes will be awarded in this contest based on the submissions’ performance rankings with a maximum combined payout of $50,000 in prizes.
2. General Submission Requirements for All Contests
In order for submissions to be eligible for review, recognition and award, participants must meet the following requirements:
- Deadline - The submission must be available for evaluation by the end date noted in these rules.
- No NIST logo - Submissions must not use NIST's logo or official seal and must not claim NIST endorsement.
- Each submission must be original, the work of the participant, and must not infringe, misappropriate or otherwise violate any intellectual property rights, privacy rights, or any other rights of any person or entity.
- It is an express condition of submission and eligibility that each participant warrants and represents that the participant's submission is solely owned by the participant, that the submission is wholly original with the participant, and that no other party has any ownership rights or ownership interest in the submission. The participant must disclose if they are subject to any obligation to assign intellectual property rights to parties other than the participant, or if the participant is licensing or, through any other legal instrument, utilizing the intellectual property of another party.
- Each participant further represents and warrants to NIST that the submission, and any use thereof by NIST shall not: (i) be defamatory or libelous in any manner toward any person, (ii) constitute or result in any misappropriation or other violation of any person's publicity rights or right of privacy, or (iii) infringe, misappropriate or otherwise violate any intellectual property rights, privacy rights or any other rights of any person or entity.
- Each submission must be in English.
- Submissions will not be accepted if they contain any matter that, in the sole discretion of NIST, is indecent, obscene, defamatory, libelous, in bad taste, or demonstrates a lack of respect for public morals or conduct, promotes discrimination in any form, or adversely affects the reputation of NIST. NIST shall have the right to remove any content from the contest websites in its sole discretion at any time and for any reason, including, but not limited to, any online comment or posting related to the Challenge.
- If NIST, in its discretion, finds any submission to be unacceptable, then such submission shall be deemed disqualified.
3. Judging Panel
The submissions will be judged by a qualified panel of experts selected by the Director of NIST. The panel consists of Department of Commerce, National Institute of Standards and Technology experts and non-Department of Commerce, National Institute of Standards and Technology experts, who will judge the submissions according to the judging criteria identified above in order to select winners. Judges will not (A) have personal or financial interests in, or be an employee, officer, director, or agent of, any entity that is a registered participant in a contest; or (B) have a familial or financial relationship with an individual who is a registered participant.
The decisions of the Judging panel for the contest will be announced in accordance with the dates noted in these rules. NIST PSCR will not make participants’ evaluation results from the Judging panel available to participants or the public.
4. Terms and Conditions
4.1. Verification of Winners
ALL CONTEST WINNERS WILL BE SUBJECT TO VERIFICATION OF IDENTITY, QUALIFICATIONS AND ROLE IN THE CREATION OF THE SUBMISSION BY THE DEPARTMENT OF COMMERCE, NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY.
Participants must comply with all terms and conditions of the Official Rules. Winning a prize is contingent upon fulfilling all requirements contained herein. Potential winners will be notified by email, telephone, or mail after results are announced. Each potential winner of a monetary or non-monetary prize will be required to sign and return to the Department of Commerce, National Institute of Standards and Technology, within ten (10) calendar days of the date the notice is sent, an ACH Vendor/Miscellaneous Enrollment Form (OMB NO. 1510-0056) and a Contestant Eligibility Verification form in order to claim the prize.
In the sole discretion of the Department of Commerce, National Institute of Standards and Technology, a potential winner will be deemed ineligible to win if: (i) the person/entity cannot be contacted; (ii) the person/entity fails to sign and return an ACH Vendor/Miscellaneous Enrollment Form (OMB NO. 1510-0056) and a Contestant Eligibility Verification form within the required time period; (iii) the prize or prize notification is returned as undeliverable; or (iv) the submission or person/entity is disqualified for any other reason. In the event that a potential or announced winner is found to be ineligible or is disqualified for any reason, the Department of Commerce, National Institute of Standards and Technology, in their sole discretion, may award the prize to another Participant.
4.2. Eligibility Requirements
All participants 18 years or older are invited to register to participate except for individuals from entities or countries sanctioned by the United States Government.
A Participant (whether an individual, team, or legal entity) must have registered to participate in order to be an eligible Participant.
Cash prizes are restricted to eligible Participants who have complied with all of the requirements under section 3719 of title 15, United States Code as contained herein. At the time of entry, the Official Representative (individual or team lead, in the case of a group project) must be age 18 or older and a U.S. citizen or permanent resident of the United States or its territories. In the case of a private entity, the business shall be incorporated in and maintain a place of business in the United States or its territories.
Employees, contractors, directors and officers (including their spouses, parents, and/or children) of HeroX and DrivenData, Inc., and of each of their respective parent companies, subsidiaries, affiliated companies, distributors, and web design, advertising, fulfillment, and judging agencies involved in the administration, development, fulfillment and execution of this Challenge, will not be eligible to compete in this Challenge.
Participants may not be a Federal entity or Federal employee acting within the scope of their employment. Current and former NIST PSCR Federal employees or Associates are not eligible to compete in a prize challenge within one year from their exit date. Individuals currently receiving PSCR funding through a grant or cooperative agreement are eligible to compete but may not utilize the previous NIST funding for competing in this challenge. Previous and current PSCR prize challenge participants are eligible to compete. Non-NIST Federal employees acting in their personal capacities should consult with their respective agency ethics officials to determine whether their participation in this competition is permissible. A Participant shall not be deemed ineligible because the Participant consulted with Federal employees or used Federal facilities in preparing its entry to the Challenge if the Federal employees and facilities are made available to all Participants on an equitable basis.
Participants, including individuals and private entities, must not have been convicted of a felony criminal violation under any Federal law within the preceding 24 months and must not have any unpaid Federal tax liability that has been assessed, for which all judicial and administrative remedies have been exhausted or have lapsed, and that is not being paid in a timely manner pursuant to an agreement with the authority responsible for collecting the tax liability. Participants must not be suspended, debarred, or otherwise excluded from doing business with the Federal Government.
Multiple individuals and/or legal entities may collaborate as a group to submit a single entry and a single individual from the group must be designated as an Official Representative for each entry. That designated individual will be responsible for meeting all entry and evaluation requirements.
Challenge submissions can be from an individual, a team, or a group of teams who submit a solution to the Challenge. If a team of individuals, a corporation, or an organization is selected as a prize winner, NIST will award a single dollar amount to the Official Representative, who is solely responsible for allocating the prize among the team's member Participants as they deem appropriate. NIST will not arbitrate, intervene, advise on, or resolve any matters between team members.
4.4. Submission Rights
Any applicable intellectual property rights to a submission will remain with the Participant. By participating in the competition, the Participant is not granting any rights in any patents, pending patent applications, or copyrights related to the technology described in the entry. However, by submitting a challenge submission, the Participant is granting the Department of Commerce, National Institute of Standards and Technology, the National Aeronautics and Space Administration (NASA), and any parties acting on their behalf certain limited rights as set forth herein.
- The Participant grants to the Department of Commerce, National Institute of Standards and Technology, NASA, and any parties acting on their behalf the right to review the submission, to describe the submission in any materials created in connection with this competition, and to screen and evaluate the submission, and to have the Judges, Challenge administrators, and the designees of any of them, review the submission. The Department of Commerce, National Institute of Standards and Technology, NASA, any parties acting on their behalf, and any Challenge Co-Sponsors, will also have the right to publicize Participant’s name and, as applicable, the names of Participant’s team members and/or organization which participated in the submission following the conclusion of the competition.
- As part of its submission, the participant must provide written consent granting the Department of Commerce, National Institute of Standards and Technology, NASA, and any parties acting on their behalf, a royalty-free, non-exclusive, irrevocable, worldwide license to display publicly and use for promotional purposes the participant’s entry (“demonstration license”). This demonstration license includes posting or linking to the participant’s entry on the Department of Commerce, National Institute of Standards and Technology and NASA websites, including the competition website and partner websites, and inclusion of the participant’s submission in any other media, worldwide.
By submitting an entry, each Participant represents and warrants that the Participant is the sole author and copyright owner of the submission; that the submission is an original work of the Participant and that the Participant has acquired sufficient rights to use and to authorize others, including the Department of Commerce, National Institute of Standards and Technology, NASA, and any other parties acting on their behalf, to use the submission, as specified throughout the Official Rules, that the submission does not infringe upon any copyright or upon any other third party rights of which the Participant is aware; and that the submission is free of malware.
By submitting an entry, the Participant represents and warrants that all information submitted is true and complete to the best of the Participant’s knowledge, that the Participant has the right and authority to submit the entry on the Participant’s own behalf or on behalf of the persons and entities that the Participant specifies within the entry, and that the entry (both the information and materials submitted in the entry and the underlying technology/method/idea/treatment protocol/solution described in the entry):
- is the Participant’s own original work, or is submitted by permission with full and proper credit given within the entry;
- does not contain proprietary or confidential information or trade secrets (the Participant’s or anyone else’s);
- does not knowingly violate or infringe upon the patent rights, industrial design rights, copyrights, trademarks, rights in technical data, rights of privacy, publicity or other intellectual property or other rights of any person or entity;
- does not contain malicious code, such as viruses, malware, timebombs, cancelbots, worms, Trojan horses or other potentially harmful programs or other material or information;
- does not and will not violate any applicable law, statute, ordinance, rule or regulation, including, without limitation, United States export laws and regulations, including but not limited to, the International Traffic in Arms Regulations and the Department of Commerce Export Regulations; and
- does not trigger any reporting or royalty or other obligation to any third party.
By making a submission to this prize competition, each Participant agrees that no part of its submission includes any trade secret information, ideas or products, including but not limited to information, ideas or products within the scope of the Trade Secrets Act, 18 U.S.C. § 1905. All submissions to this prize competition are deemed non-proprietary. Since NIST does not wish to receive or hold any submitted materials “in confidence” it is agreed that, with respect to the Participant’s entry, no confidential or fiduciary relationship or obligation of secrecy is established between NIST and the Participant, the Participant’s team, or the company or institution the Participant represents when submitting an entry, or any other person or entity associated with any part of the Participant’s entry.
4.6. Additional Terms and Conditions
This document outlines the Official Rules for the 2020 Differential Privacy Temporal Map Challenge. Nothing within this document or in any documents supporting the 2020 Differential Privacy Temporal Map Challenge shall be construed as obligating the Department of Commerce, NIST or any other Federal agency or instrumentality to any expenditure of appropriated funds, or any obligation or expenditure of funds in excess of or in advance of available appropriations.
4.7. Challenge Subject to Applicable Law
All challenge phases are subject to all applicable federal laws and regulations. Participation constitutes each Participant's full and unconditional agreement to these Official Rules and administrative decisions, which are final and binding in all matters related to the challenge. Eligibility for a prize award is contingent upon fulfilling all requirements set forth herein. This notice is not an obligation of funds; the final award of prizes is contingent upon the availability of appropriations.
Participation is subject to all U.S. federal, state and local laws and regulations. Participants are responsible for checking applicable laws and regulations in their jurisdiction(s) before participating in the prize competition to ensure that their participation is legal. The Department of Commerce, National Institute of Standards and Technology shall not, by virtue of conducting this prize competition, be responsible for compliance by Participants in the prize competition with Federal Law including licensing, export control, and nonproliferation laws, and related regulations. Individuals entering on behalf of or representing a company, institution or other legal entity are responsible for confirming that their entry does not violate any policies of that company, institution or legal entity.
4.8. Resolution of Disputes
The Department of Commerce, National Institute of Standards and Technology is solely responsible for administrative decisions, which are final and binding in all matters related to the challenge.
In the event of a dispute as to any registration, the authorized account holder of the email address used to register will be deemed to be the Participant. The "authorized account holder" is the natural person or legal entity assigned an email address by an Internet access provider, online service provider or other organization responsible for assigning email addresses for the domain associated with the submitted address. Participants and potential winners may be required to show proof of being the authorized account holder.
The winners of these prizes (collectively, "Winners") will be featured on the websites, newsletters, social media, and other outreach materials of the Department of Commerce, National Institute of Standards and Technology, NASA, and any other parties acting on their behalf.
Except where prohibited, participation in the Challenge constitutes each winner's consent to the Department of Commerce, National Institute of Standards and Technology's, its agents', and any Challenge Co-Sponsors’ use of each winner's name, likeness, photograph, voice, opinions, and/or hometown and state information for promotional purposes through any form of media, worldwide, without further permission, payment or consideration.
Prize competition winners will be paid directly by the Department of Commerce, National Institute of Standards and Technology. Prior to payment, winners will be required to verify eligibility. The verification process includes providing the winner's full legal name, tax identification number or social security number, and the routing and account numbers of the bank account to which the prize money can be deposited directly.
All cash prizes awarded to Participants by the Department of Commerce, National Institute of Standards and Technology are subject to tax liabilities, and no withholding will be assessed by the Department of Commerce, National Institute of Standards and Technology on behalf of the Participant claiming a cash prize.
4.11. Liability and Insurance
Any and all information provided by or obtained from the Federal Government is without any warranty or representation whatsoever, including but not limited to its suitability for any particular purpose. Upon registration, all Participants agree to assume and, thereby, have assumed any and all risks of injury or loss in connection with or in any way arising from participation in this challenge, development of any application or the use of any application by the Participants or any third-party. Upon registration, except in the case of willful misconduct, all Participants agree to and, thereby, do waive and release any and all claims or causes of action against the Federal Government and its officers, employees and agents for any and all injury and damage of any nature whatsoever (whether existing or thereafter arising, whether direct, indirect, or consequential and whether foreseeable or not), arising from their participation in the challenge, whether the claim or cause of action arises under contract or tort. Upon registration, all Participants agree to and, thereby, shall indemnify and hold harmless the Federal Government and its officers, employees and agents for any and all injury and damage of any nature whatsoever (whether existing or thereafter arising, whether direct, indirect, or consequential and whether foreseeable or not), including but not limited to any damage that may result from a virus, malware, etc., to Government computer systems or data, or to the systems or data of end-users of the software and/or application(s) which results, in whole or in part, from the fault, negligence, or wrongful act or omission of the Participants or Participants' officers, employees or agents.
Participants are not required to obtain liability insurance for this Challenge.
4.12. Records Retention and FOIA
All materials submitted to the Department of Commerce, National Institute of Standards and Technology as part of a submission become official records and cannot be returned. Any confidential commercial information contained in a submission should be designated at the time of submission. Submitters will be notified of any Freedom of Information Act requests for their submissions in accordance with 29 C.F.R. § 70.26.
4.13. Privacy Advisory
The HeroX.com and the DrivenData, Inc. websites are hosted by private entities and are not services of NIST. The solicitation and collection of your personal or individually identifiable information is subject to the hosts’ privacy and security policies and will not be shared with NIST unless you win the Challenge. Challenge winners’ personally identifiable information must be made available to NIST in order to collect an award.
4.14. 508 Compliance
Participants should keep in mind that the Department of Commerce, National Institute of Standards and Technology considers universal accessibility to information a priority for all individuals, including individuals with disabilities. The Department is strongly committed to meeting its compliance obligations under Section 508 of the Rehabilitation Act of 1973, as amended, to ensure the accessibility of its programs and activities to individuals with disabilities. This obligation includes acquiring accessible electronic and information technology. When evaluating submissions for this challenge, the extent to which a submission complies with the requirements for accessible technology required by Section 508 will be considered.
4.15. General Conditions
This prize competition shall be performed in accordance with the America COMPETES Reauthorization Act of 2010, Pub. Law 111-358, title I, § 105(a), Jan. 4, 2011, codified at 15 U.S.C. § 3719 and amended by the American Innovation and Competitiveness Act of 2016 (Pub. L. No. 114-329) (hereinafter “America COMPETES Act”).
The Department of Commerce, National Institute of Standards and Technology reserves the right to cancel, suspend, and/or modify the challenge, or any part of it, if any fraud, technical failures, or any other factor beyond the Department of Commerce, National Institute of Standards and Technology's reasonable control impairs the integrity or proper functioning of the challenge, as determined by the Department of Commerce, National Institute of Standards and Technology in its sole discretion. The Department of Commerce, National Institute of Standards and Technology is not responsible for, nor is it required to count, incomplete, late, misdirected, damaged, unlawful, or illicit submissions, including those secured through payment or achieved through automated means.
NIST reserves the right in its sole discretion to extend or modify the dates of the Challenge, and to change the terms set forth herein governing any phases taking place after the effective date of any such change. By entering, you agree to the terms set forth herein and to all decisions of NIST and/or all of their respective agents, which are final and binding in all respects.
ALL DECISIONS BY THE DEPARTMENT OF COMMERCE, NATIONAL INSTITUTE OF STANDARDS AND TECHNOLOGY ARE FINAL AND BINDING IN ALL MATTERS RELATED TO THE CHALLENGE.