Informational Only

This challenge is no longer accepting new submissions.

The Healthy Behavior Data Challenge

The Healthy Behavior Data Challenge will harness the potential of innovative data sources and identify feasible alternative options for collecting data on health-related behaviors in new ways.

Centers for Disease Control and Prevention

Type of Challenge: Ideas
Submission Start: 04/28/2017 12:05 PM ET
Submission End: 01/15/2018 12:00 AM ET


The Centers for Disease Control and Prevention (CDC) located within the Department of Health and Human Services (HHS) announces the launch of the Healthy Behavior Data Challenge.

The Healthy Behavior Data Challenge responds to the call for new ways to address the challenges and limitations of self-reported health surveillance information and tap into the potential of innovative data sources and alternative methodologies for public health surveillance.

The Healthy Behavior Data (HBD) Challenge will support the development and implementation of prototypes to use these novel methodologies and data sources (e.g., wearable devices, mobile applications, and/or social media) to enhance traditional healthy behaviors surveillance systems in the areas of nutrition, physical activity, sedentary behaviors, and/or sleep among the adult population aged 18 years and older in the US and US territories.

The collection of health data through traditional surveillance modes, including telephone and in-person interviewing, is becoming increasingly challenging and costly as participation declines and personal communications change. In addition, the self-reported nature of responses, particularly in the areas of nutrition, physical activity, sedentary behaviors, and sleep, has been a major limitation of these surveillance systems, since self-reported data are subject to under- and over-reporting and recall bias. Meanwhile, the advent of new technologies and data sources, including wearable devices (such as smart watches, activity trackers, and sleep monitors), mobile health applications on smartphones or tablets, and data from social media, represents an opportunity to enhance the ability to monitor health-related information and potentially adjust for methodological limitations in traditional self-reported data.

The Healthy Behavior Data (HBD) Challenge will be conducted concurrently with a similar challenge run by the Public Health Agency of Canada. This will enable the two countries to learn from their respective challenges and leverage shared information, and we expect increased efficiency from the dual challenge.

The Healthy Behavior Data Challenge participants will propose data sources and approaches for aggregating data from wearable devices, mobile applications and/or social media in the areas of nutrition, physical activity, sedentary behaviors, and/or sleep. In Phase II (Prototype Implementation), a subset of submissions (up to 3) with promising concepts will be invited to test their proposed approaches for ongoing public health surveillance.


Winner: Catherine Bass

Solution: Onlife Health - Closing the Loop: Augmenting Mobile Data Sources for Public Health Surveillance

Description: The system was designed to simplify and improve the quality and quantity of health behavior data collection by aggregating data from personal wearable devices and mobile health apps, augmented by user validation and supplemented with survey responses. The mobile app was designed to collect and validate data from wearable devices and mobile apps and to collect survey responses on healthy behaviors. The data collected include nutrition, sleep, physical activity, and sedentary behaviors. The project included a series of short, just-in-time survey questions delivered via push notification to supplement the mobile data sources. The system cross-validated the collected healthy behavior data between the mobile and survey data sets, as well as against BRFSS survey data.

Winner: Robert Furberg - RTI International

Solution: Enhancing Public Health Surveillance Indicators using Wearable Devices

Description: The method was designed to augment any mode of data collection, from interviewer-administered in-person or telephone surveys to self-administration online or via mobile device, with minimal disruption to these essential data collection activities. The project demonstrated how the combination of self-generated data can enhance public health surveillance indicators to provide valid measures of physical activity, sedentary behaviors, and sleep. The approach implemented a three-step data collection: screening, survey, and data retrieval. Owners of Fitbit devices were invited to participate in the HBD Challenge, and a service called Fitabase was used for data retrieval of the three measures.


How to Enter: Participants may enter by visiting the Challenge website and following the instructions for submission. The U.S. and Canadian challenges are being run in parallel; U.S. entrants should submit to this contest and non-U.S. entrants to the Canadian contest.

Eligibility Rules for Participating in the Competition: To be eligible to win a prize under this challenge, an individual or entity—

  1. Shall have registered to participate in the competition under the rules promulgated by the Centers for Disease Control and Prevention;
  2. Shall have complied with all the requirements under this section;
  3. In the case of a private entity, shall be incorporated in and maintain a primary place of business in the United States, and in the case of an individual, whether participating singly or in a group, shall be a citizen or permanent resident of the United States;
  4. May not be a Federal entity or a Federal employee acting within the scope of their employment;
  5. Shall not be an HHS employee working on their application or submission during assigned duty hours;
  6. Shall be an individual, or a team whose members are each 18 years of age or older; and
  7. Shall not be listed on the Excluded Parties List System.


    1. Federal grantees may not use Federal funds to develop challenge applications unless consistent with the purpose of their grant award. Federal contractors may not use Federal funds from a contract to develop challenge applications or to fund efforts in support of a challenge submission.
    2. Employees of CDC, and/or any other individual or entity associated with the development, evaluation, or administration of the Challenge as well as members of such persons’ immediate families (spouses, children, siblings, parents), and persons living in the same household as such persons, whether or not related, are not eligible to participate in the Challenge.
    3. An individual or entity shall not be deemed ineligible because the individual or entity used Federal facilities or consulted with Federal employees during a competition if the facilities and employees are made available to all individuals and entities participating in the competition on an equitable basis.
    4. Applicants must agree to assume any and all risks and waive claims against the Federal Government and its related entities, except in the case of willful misconduct, for any injury, death, damage, or loss of property, revenue, or profits, whether direct, indirect, or consequential, arising from their participation in a competition, whether the injury, death, damage, or loss arises through negligence or otherwise.
    5. A solution may be disqualified if it fails to function as expressed in the description provided by the user, or if it provides inaccurate or incomplete information.

CDC reserves the right to disqualify participants from the Challenge for inappropriate, derogatory, defamatory, or threatening comments or communication through the Challenge website.

  1. Submissions must be free of security threats and/or malware. Applicants/Contestants agree that CDC may conduct testing on the product/submission to determine whether malware or other security threats may be present. CDC may disqualify the product if, in CDC’s judgment, the product may damage government or others’ equipment or operating environment.
  2. Applicants must obtain liability insurance or demonstrate financial responsibility in the amount of $0 for claims by: (1) a third party for death, bodily injury, or property damage, or loss resulting from an activity carried out in connection with participation in a competition, with the Federal Government named as an additional insured under the registered applicant’s insurance policy and registered applicant’s agreeing to indemnify the Federal Government against third party claims for damages arising from or related to competition activities; and (2) the Federal Government for damage or loss to Government property resulting from such an activity.  Applicants who are a group must obtain insurance or demonstrate financial responsibility for all members of the group.
  3. By participating in the Challenge, each Applicant agrees to comply with and abide by these Official Rules, Terms & Conditions and the decisions of the Federal Agency sponsors and/or the individual judges, which shall be final and binding in all respects.

Payment of the Prize: Prizes awarded under this competition will be paid by electronic funds transfer and may be subject to Federal income taxes. HHS will comply with the Internal Revenue Service withholding and reporting requirements, where applicable.

Basis upon Which Winner Will Be Selected:  A review panel composed of subject-matter experts will judge eligible HBD Challenge entries.  A judging panel will make final winner selections based upon the criteria outlined below and in compliance with the HHS Competition Judging Guidelines.

Judging Criteria

Phase I Scoring Criteria

All Criteria are scaled 1-5, with 1 being the lowest score on each dimension and 5 being the highest score on each dimension. Scores are weighted by the proportion of each dimension and then aggregated to create a final score.

  1. Efficacy of Prototype (20%): Where 1 = Prototype is unlikely to work in a statistically appropriate way; 5 = Prototype is likely to successfully collect and harmonize data in a statistically robust manner across multiple data sources to address common metrics.
  2. Promise of Comparability to BRFSS Findings (20%): Where 1 = Prototype does not consider stratification parameters, or applies to only a narrow population; 5 = Prototype holds promise for capturing data that are valid, reliable, and representative of a large population.
  3. Acceptability (15%): Where 1 = All parties expressed concerns with the data being used in terms of respondent privacy, feasibility, and utility; 5 = All parties involved are comfortable with the data being used in terms of respondent privacy, feasibility, and utility. NOTE: This means that federal and state restrictions on data collection and assurance of confidentiality are being respected (mandatory criterion; if not scored 5, the prototype may be disqualified).
  4. Innovation (15%): Where 1 = Prototype duplicates an existing approach; 5 = Prototype presents a novel approach.
  5. Feasibility of Prototype (15%): Where 1 = Prototype is not feasible due to factors such as cost, availability of data, etc.; 5 = Prototype is feasible and addresses potential implementation challenges by offering solutions.
  6. Generalizability (10%): Where 1 = Prototype is not generalizable to a range of data sources; 5 = Prototype is generalizable to a range of data sources.
  7. Breadth of Data Collected (Scope) (5%): Where 1 = Prototype does not address the required metrics across the identified content area(s); 5 = Prototype includes the required metrics.
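The weighted aggregation described above can be sketched as a small computation. This is an illustrative sketch only: the criterion keys and example scores are hypothetical, not from any actual submission or from CDC's scoring software.

```python
# Phase I criteria and their published weights (percentages as fractions).
WEIGHTS = {
    "efficacy": 0.20,
    "comparability": 0.20,
    "acceptability": 0.15,
    "innovation": 0.15,
    "feasibility": 0.15,
    "generalizability": 0.10,
    "scope": 0.05,
}

def final_score(scores: dict) -> float:
    """Weighted sum of 1-5 criterion scores; since the weights sum to 1,
    the aggregate also lands on the 1-5 scale."""
    assert set(scores) == set(WEIGHTS), "score every criterion exactly once"
    assert all(1 <= s <= 5 for s in scores.values()), "scores are 1-5"
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

# Hypothetical example entry: strong on acceptability and scope,
# middling on comparability and feasibility.
example = {
    "efficacy": 4, "comparability": 3, "acceptability": 5,
    "innovation": 4, "feasibility": 3, "generalizability": 4, "scope": 5,
}
print(round(final_score(example), 2))  # 3.85
```

Note that a mandatory criterion (Acceptability) can still disqualify a prototype outright regardless of its weighted aggregate, so the aggregate alone does not determine eligibility.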

Phase II Scoring Criteria

All Criteria are scaled 1-5, with 1 being the lowest score on each dimension and 5 being the highest score on each dimension. Scores are weighted by the proportion of each dimension and then aggregated to create a final score. Judging criteria for Phase II include:

  • Data quality (20%)

1= Prototype does not provide data that are likely to be valid or reliable or representative of a population; 5 = prototype provides data that demonstrate validity, reliability, and representativeness.

  • Ability to complement BRFSS Findings (20%)

1 = Prototype does not outline steps to complement BRFSS efforts; 5 = Prototype provides data which can complement and/or supplement measures collected by the BRFSS or other publicly available traditional surveillance systems.

  • Validation of or Enhancement of existing national public health surveillance data (20%)

1 = Prototype cannot be statistically aligned with currently available health data; 5 = Prototype statistically aligns with available data across population sub-groups.

  • Flexibility (10%)

1= Prototype does not demonstrate the ability to include additional types of data and data sources; 5= prototype demonstrates flexibility in the ability to add different data types and data from additional sources.

  • Simplicity (structure and ease of operation) (10%)

1 = Prototype’s structure and operation are complex; 5 = Prototype’s structure is clear and easy to implement, and it is not burdensome on current systems.

  • Resources for system operation (10%)

1 = Prototype requires a heavy resource burden in terms of cost, training, administration, and infrastructure; 5 = Prototype has a low resource burden in terms of cost, training, administration, and infrastructure.

  • Timeliness (5%)

1 = There is a significant gap in time between data collection and analysis; 5 = The collected data support real-time monitoring.

  • Stratification by Demographics (5%)

1 = Prototype is unable to stratify the data by key demographics; 5 = prototype is able to stratify the data by age, sex, education, and race/ethnicity.

How To Enter

Dates: Submissions will be accepted starting April 28, 2017. The submission period for Phase I will end on August 4, 2017. The Phase II (Prototype Implementation) submission period will begin October 2, 2017 and end January 15, 2018. The grand prize finalist is anticipated to be announced in March of 2018.

Information on the Behavioral Risk Factor Surveillance System can be found at

Participants may enter by visiting the Challenge website and following the instructions for submission. The U.S. and Canadian challenges are being run in parallel; U.S. entrants should submit to this contest and Canadian citizens to the Canadian contest.

Ideation Period: The Challenge will launch with an ideation/open submission period in which eligible participants (as outlined in the Eligibility Rules) may register and submit an entry on the Challenge website. The 13-week ideation period will be followed by a 16-week resubmission period held for those chosen by the judges as semifinalists to further refine their ideas. The Challenge website serves as the destination and submission portal; participants may find the Challenge rules, eligibility criteria, evaluation criteria, additional resources, and the Challenge timeline there.

Submission Requirements: Entries not in compliance with the submission requirements outlined below will be ineligible for further review and prize award. During the open submission period, participants must submit the following information to enter the HBD Challenge:

Phase I (Prototype Development)

  1. Submit/upload a completed HBD Challenge Submission Template describing the proposed project, project personnel and data sources.
  2. Provide a PowerPoint or other visual presentation of the proposed project including purpose, methods and anticipated outcomes of the proposed approach, which could be used to present the proposal to a judging panel.
  3. Provide a description of data that are anticipated to be captured by the proposed approach, and, if applicable, descriptions of online app(s), web-based tools or communication devices used to recruit or track subjects’ healthy behavior information.
  4. Propose (a) viable data source(s) from currently available or a feasible future source (such as a proposed app or online tool). HBD Challenge participants may propose the use of public and/or private data sources, as long as respondent confidentiality and privacy are maintained.
  5. Demonstrate how CDC would be able to access the data.
  6. Outline in a detailed manner what information will be obtained.
  7. Demonstrate how data will be extracted and collected, and present the format in which they will be stored.
  8. Show how the new data source(s) could be linked with other data sources, in a statistically robust manner that could result in useful public health insights, citing statistical approaches and evidence to support the proposal.
  9. Focus on one or more behavioral factors including physical activity, sleep, sedentary behaviors, and/or nutrition.
  10. Provide information about the population reached and generalizability of the approach.
  11. Describe how data could be stratified by demographic characteristics (e.g. age, sex, education, geographic jurisdiction).
  12. Show how information gathered addresses some or all of the following common metrics in one or more of the healthy behavior topics below:

A. Sleep:

  1. Hours of sleep per night (sleep duration)
  2. Amount of time awake (sleep quality)
  3. Number of times awake (sleep quality)
  4. Number of adults reporting trouble falling asleep and staying asleep
  5. Time to fall asleep
  6. Amount of time in REM vs. non-REM sleep (duration of sleep stage)
  7. Heart rate
  8. Respiration
  9. Sleep behaviors such as snoring, sleep talking, sleep movement

B. Sedentary Behaviors:

  1. Average number of hours per day spent sedentary, excluding sleep time
  2. Average number of hours per day spent on a computer/screen including watching TV, videos, playing computer games, emailing or using the internet
  3. Sedentary data with additional information on location (work, school, community, etc.) broken down by weekday and weekend day

C. Nutrition:

  1. Total calories consumed per day
  2. Consumption of fruit (not including juices) by day, week, or month
  3. Consumption of green leafy or lettuce salads, with or without other vegetables, by day, week, or month
  4. Consumption of vegetables (not including lettuce salads and potatoes) by day, week, or month
  5. Number of sugar-sweetened beverages consumed by day, week, or month
  6. Number of caffeinated drinks consumed by day, week, or month

D. Physical Activity

  1. Minutes of moderate-to-vigorous physical activity (MVPA) per day (ideally by location – work, school, in community)
  2. Daily number of steps
  3. Miles/km (Distance) on foot
  4. Number of days of physical activity/week or month (and established number of days in one month)
  5. Minutes of moderate-to-vigorous physical activity (MVPA) per day (ideally by location – work, school, in community) broken down by week day and weekend day.
  6. Calories burned
  7. Type of activity (aerobic, strength, etc.)
  8. Active minutes
  9. Duration of exercise
  10. Flights of stairs climbed
  11. Average and peak heart rate
  12. Occupational physical activity and active chores amount (by location of physical activity): number of hours of reported physical activity while at work or in and around the household
  13. Leisure-time physical activity amount: number of hours per week adult participants spent in sports, fitness, or recreational physical activities, organized or non-organized, lasting a minimum of 10 continuous minutes
  14. Number of adults reporting, and time spent, walking or cycling to work or school

Participants may also choose to suggest additional metrics in the areas of nutrition, physical activity, sedentary behaviors, and/or sleep. If additional metrics are included, the participant should include a short description of the data and how it might inform public health efforts.

Phase II (Prototype Implementation Phase)

During Phase II (Prototype Implementation), the six submissions selected under Phase I will test their solutions using data from 300 or more adults (aged 18 and above) residing in the US or its territories. During this phase, HBD Challenge participants will have the opportunity to incorporate data from existing surveys, including the Behavioral Risk Factor Surveillance System (BRFSS).

Phase II (Prototype Implementation) allows applicants to test proposals developed in Phase I. The prototype is a demonstration of possible methods for supplementing data from existing surveillance systems (such as the BRFSS). This prototype is not meant to be merged with existing surveillance systems, but rather to complement data collected with current infrastructures. At the end of implementation HBD Challenge participants should be able to:

  1. Compare data obtained by the prototype to data from the BRFSS in the areas of nutrition, physical activity, sedentary behaviors, and/or sleep.
  2. Demonstrate how data from the included participants could be stratified by demographics (age, sex, education, etc.).
  3. Demonstrate the ease of adding additional types of mobile applications and wearable devices to existing survey methodologies.
  4. Provide a report that describes the prototype/methodology and the prototype’s anticipated strengths and limitations for surveillance.
  5. Demonstrate the applicability of the non-traditional data source(s) for ongoing public health surveillance purposes.
  6. Describe the prototype in detail, including purpose, method, outcomes and comparability to data obtained from the Behavioral Risk Factor Surveillance System (BRFSS).
  7. Provide a working prototype including data (in Excel format) obtained using the prototype from 300 or more adult respondents residing in the US or its territories. The data must include the age, gender, location, and at least one of the measures associated with the HBD Challenge in the areas of nutrition, physical activity, sedentary behaviors and/or sleep.
  8. Provide a PowerPoint presentation to the judges and invited CDC personnel which includes information on the purpose, methods, outcomes and comparability to the BRFSS.
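The demographic stratification expected of Phase II deliverables (requirement 2 above) can be illustrated with a minimal sketch. The records and field names here (age, sex, state, daily_steps) are invented for illustration; they are not a CDC-specified schema, and a real submission would load its 300+ respondent rows from the Excel file rather than hard-code them.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical respondent records standing in for rows of the submitted
# Excel file; field names are illustrative, not a CDC specification.
records = [
    {"age": 34, "sex": "F", "state": "GA", "daily_steps": 8200},
    {"age": 61, "sex": "M", "state": "GA", "daily_steps": 4300},
    {"age": 45, "sex": "F", "state": "PR", "daily_steps": 6100},
    {"age": 29, "sex": "M", "state": "GA", "daily_steps": 9800},
]

def age_group(age: int) -> str:
    """Bucket adults (18+) into broad age strata for reporting."""
    if age < 45:
        return "18-44"
    if age < 65:
        return "45-64"
    return "65+"

def stratify(rows, measure):
    """Mean of `measure` within each (age group, sex) stratum."""
    strata = defaultdict(list)
    for r in rows:
        strata[(age_group(r["age"]), r["sex"])].append(r[measure])
    return {k: mean(v) for k, v in strata.items()}

print(stratify(records, "daily_steps"))
```

Stratum-level summaries in this shape could then be set alongside the corresponding BRFSS estimates for the same age/sex groups to support the comparability demonstrations in requirements 1 and 6.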