How to enter
Dates: Submissions will be accepted starting April 28, 2017. The submission period for Phase I will end on August 4, 2017. The Phase II (Prototype Implementation) submission period will begin October 2, 2017 and end January 15, 2018. The grand prize winner is anticipated to be announced in March 2018.
Information on the Behavioral Risk Factor Surveillance System can be found at www.cdc.gov/brfss.
Participants may enter by visiting healthdatachallenge.gov or challenge.gov and following the instructions for submission. The U.S. and Canadian challenges are being run in parallel: U.S. entrants should submit to this contest via challenge.gov, and Canadian entrants to the Canadian contest found at healthdatachallenge.gov.
Ideation Period: The Challenge will launch as an ideation/open submission period in which eligible participants (outlined in Eligibility Rules) may register and submit an entry onto the Challenge Website (https://www.challenge.gov/challenge/the-healthy-behavior-data-challenge/). Information about the Challenge and a link to the Challenge website can also be found at Challenge.gov. The 13-week ideation period will be followed by a 16-week resubmission period held for those who were chosen by the judges as semifinalists to further refine their idea. The Challenge website serves as the destination and submission portal. Participants may find the Challenge rules, eligibility criteria, evaluation criteria, additional resources, and the Challenge timeline on the Challenge website or at Challenge.gov.
Submission Requirements: Entries not in compliance with the submission requirements outlined below will be ineligible for further review and prize award. During the open submission period, participants must submit the following information to enter the HBD Challenge:
Phase I (Prototype Development)
Submit/upload a completed HBD Challenge Submission Template describing the proposed project, project personnel and data sources.
Provide a PowerPoint or other visual presentation of the proposed project including purpose, methods and anticipated outcomes of the proposed approach, which could be used to present the proposal to a judging panel.
Provide a description of data that are anticipated to be captured by the proposed approach, and, if applicable, descriptions of online app(s), web-based tools or communication devices used to recruit or track subjects’ healthy behavior information.
Propose one or more viable data sources, either currently available or feasible in the future (such as a proposed app or online tool). HBD Challenge participants may propose the use of public and/or private data sources, as long as respondent confidentiality and privacy are maintained.
Demonstrate how CDC would be able to access the data.
Outline in a detailed manner what information will be obtained.
Demonstrate how data will be extracted and collected: present the format in which it will be stored.
Show how the new data source(s) could be linked with other data sources, in a statistically robust manner that could result in useful public health insights, citing statistical approaches and evidence to support the proposal.
Focus on one or more behavioral factors including physical activity, sleep, sedentary behaviors, and/or nutrition.
Provide information about the population reached and generalizability of the approach.
Describe how data could be stratified by demographic characteristics (e.g. age, sex, education, geographic jurisdiction).
Show how information gathered addresses some or all of the following common metrics in one or more of the healthy behavior topics below:
A. Sleep:
Hours of sleep per night (sleep duration)
Amount of time awake (sleep quality)
Number of times awake (sleep quality)
Number of adults reporting trouble falling asleep or staying asleep
Time to fall asleep
Amount of time in REM vs. non-REM sleep (duration of sleep stage)
Sleep behaviors such as snoring, sleep talking, sleep movement
B. Sedentary Behaviors:
Average number of hours per day spent sedentary, excluding sleep time
Average number of hours per day spent on a computer/screen including watching TV, videos, playing computer games, emailing or using the internet
Sedentary data with additional information on location (work, school, community, etc.) broken down by weekday and weekend day
C. Nutrition:
Total calories consumed per day
Consumption of fruit (not including juices) by day, week, or month
Consumption of green leafy or lettuce salads, with or without other vegetables, by day, week, or month
Consumption of vegetables (not including lettuce salads and potatoes) by day, week, or month
Number of sugar-sweetened beverages consumed by day, week, or month
Number of caffeinated drinks consumed by day, week, or month
D. Physical Activity
Minutes of moderate-to-vigorous physical activity (MVPA) per day (ideally by location - work, school, in community)
Daily number of steps
Miles/km (Distance) on foot
Number of days of physical activity/week or month (and established number of days in one month)
Minutes of moderate-to-vigorous physical activity (MVPA) per day (ideally by location - work, school, in community) broken down by week day and weekend day.
Type of activity (aerobic, strength, etc.)
Duration of exercise
Flights of stairs climbed
Average and peak heart rate
Amount of occupational physical activity and active chores (with location of physical activity)
Number of hours of reported physical activities while at work, in or around household
Leisure time physical activity amount:
Number of hours per week adult participants spent in sports, fitness, or recreational physical activities, organized or non-organized, lasting at least 10 continuous minutes
Number of adults reporting and time spent walking or cycling to work or school
Participants may also choose to suggest additional metrics in the areas of nutrition, physical activity, sedentary behaviors, and/or sleep. If additional metrics are included, the participant should include a short description of the data and how it might inform public health efforts.
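To illustrate the stratification requirement above, here is a minimal sketch of how collected metric data could be broken down by demographic characteristics such as age and sex. The record fields, age buckets, and values are illustrative assumptions only, not a required schema for the HBD Challenge.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical respondent records; field names are illustrative only.
records = [
    {"age": 25, "sex": "F", "education": "college", "daily_steps": 9200},
    {"age": 31, "sex": "M", "education": "high school", "daily_steps": 6400},
    {"age": 58, "sex": "F", "education": "college", "daily_steps": 4800},
    {"age": 44, "sex": "M", "education": "college", "daily_steps": 7100},
]

def age_group(age):
    """Bucket ages into broad strata (cut points are an assumption)."""
    return "18-44" if age < 45 else "45+"

# Stratify mean daily steps by (age group, sex).
strata = defaultdict(list)
for r in records:
    strata[(age_group(r["age"]), r["sex"])].append(r["daily_steps"])

summary = {stratum: mean(steps) for stratum, steps in strata.items()}
```

The same grouping pattern extends to any of the metrics listed above (sleep duration, sedentary hours, servings of fruit, etc.) and to additional strata such as education or geographic jurisdiction.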
Phase II (Prototype Implementation Phase)
During Phase II (Prototype Implementation), the teams behind the six submissions selected under Phase I will test their solutions, utilizing data from 300 or more adults (aged 18 and above) residing in the US or its territories. During this phase, HBD Challenge participants will have an opportunity to incorporate data from existing surveys, including the Behavioral Risk Factor Surveillance System (BRFSS).
Phase II (Prototype Implementation) allows applicants to test proposals developed in Phase I. The prototype is a demonstration of possible methods for supplementing data from existing surveillance systems (such as the BRFSS). This prototype is not meant to be merged with existing surveillance systems, but rather to complement data collected with current infrastructures. At the end of implementation HBD Challenge participants should be able to:
Compare data obtained by the prototype to data from the BRFSS in the areas of nutrition, physical activity, sedentary behaviors, and/or sleep.
Demonstrate how data from the included participants could be stratified by demographics (age, sex, education, etc.).
Demonstrate the ease of adding additional types of mobile applications and wearable devices to existing survey methodologies.
Provide a report that describes the prototype/methodology and the prototype’s anticipated strengths and limitations for surveillance.
Demonstrate the applicability of the non-traditional data source(s) for ongoing public health surveillance purposes.
Describe the prototype in detail, including purpose, method, outcomes and comparability to data obtained from the Behavioral Risk Factor Surveillance System (BRFSS).
Provide a working prototype including data (in Excel format) obtained using the prototype from 300 or more adult respondents residing in the US or its territories. The data must include the age, gender, location, and at least one of the measures associated with the HBD Challenge in the areas of nutrition, physical activity, sedentary behaviors and/or sleep.
Provide a PowerPoint presentation to the judges and invited CDC personnel which includes information on the purpose, methods, outcomes and comparability to the BRFSS.
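As a sketch of the Phase II data deliverable, the snippet below writes respondent records containing the minimum fields named above (age, gender, location, and at least one challenge measure) to a CSV file, which Excel can open and save as a workbook. The column names, file name, and values are assumptions for illustration, not a mandated layout.

```python
import csv

# Hypothetical Phase II records; sleep_hours stands in for any one of
# the challenge measures (nutrition, physical activity, sedentary, sleep).
rows = [
    {"age": 29, "gender": "F", "location": "GA", "sleep_hours": 7.5},
    {"age": 41, "gender": "M", "location": "PR", "sleep_hours": 6.0},
]

with open("hbd_phase2_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["age", "gender", "location", "sleep_hours"]
    )
    writer.writeheader()
    writer.writerows(rows)
```

A real submission would contain 300 or more such records and would be converted to Excel format before upload, per the requirement above.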
Phase I Scoring Criteria
All criteria are scaled 1-5, with 1 being the lowest score on each dimension and 5 being the highest. Scores are weighted by each dimension's percentage and then aggregated into a final score.
Efficacy of Prototype (20%): Where 1 = Prototype is unlikely to work in a statistically appropriate way; 5 = Prototype is likely to successfully collect and harmonize data, in a statistically robust manner, across multiple data sources to address the common metrics.
Promise of Comparability to BRFSS Findings (20%): Where 1 = Prototype does not consider stratification parameters, or applies to only a narrow population; 5 = prototype holds promise for capturing data that is valid, reliable, and representative of a large population.
Acceptability (15%): Where 1 = All parties expressed concerns with the data being used in terms of respondent privacy, feasibility, and utility; 5 = all parties involved are comfortable with the data being used in terms of respondent privacy, feasibility, and utility. NOTE: This means that federal and state restrictions on data collection and assurance of confidentiality must be respected (mandatory criterion; if not scored 5, the prototype may be disqualified).
Innovation (15%): Where 1 = Prototype duplicates existing approach; 5 = prototype presents a novel approach.
Feasibility of Prototype (15%): Where 1 = Prototype is not feasible due to factors such as cost, availability of data, etc.; 5 = Prototype is feasible and addresses potential implementation challenges by offering solutions.
Generalizability (10%): Where 1 = Prototype is not generalizable to a range of data sources; 5 = prototype is generalizable to a range of data sources.
Breadth of Data Collected (Scope) (5%): Where 1 = Prototype does not address required metrics, across the identified content area(s); 5 = Prototype includes required metrics.
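The weighted aggregation described above can be made concrete with a short calculation. The weights below are the Phase I percentages listed above; the 1-5 scores are a hypothetical judge's ratings, included only to show the arithmetic.

```python
# Phase I dimension weights, as listed in the scoring criteria above.
weights = {
    "efficacy": 0.20,
    "comparability": 0.20,
    "acceptability": 0.15,
    "innovation": 0.15,
    "feasibility": 0.15,
    "generalizability": 0.10,
    "scope": 0.05,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights cover 100%

# Hypothetical 1-5 ratings from one judge (illustrative values).
scores = {
    "efficacy": 4,
    "comparability": 3,
    "acceptability": 5,
    "innovation": 4,
    "feasibility": 3,
    "generalizability": 2,
    "scope": 5,
}

# Weighted sum yields a final score on the same 1-5 scale.
final_score = sum(weights[k] * scores[k] for k in weights)
```

With these sample ratings the final score is 3.65; the Phase II criteria below aggregate in the same way with their own weights.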
Phase II Scoring Criteria
All criteria are scaled 1-5, with 1 being the lowest score on each dimension and 5 being the highest. Scores are weighted by each dimension's percentage and then aggregated into a final score. Judging criteria for Phase II include:
Data quality (20%)
1= Prototype does not provide data that are likely to be valid or reliable or representative of a population; 5 = prototype provides data that demonstrate validity, reliability, and representativeness.
Ability to complement BRFSS Findings (20%)
1 = Prototype does not outline steps to complement BRFSS efforts; 5 = prototype provides data that can complement and/or supplement measures collected by the BRFSS or other publicly available traditional surveillance systems.
Validation of or Enhancement of existing national public health surveillance data (20%)
1 = Prototype cannot be statistically aligned with currently available health data; 5 = prototype statistically aligns with available data across population sub-groups.
Flexibility (10%)
1 = Prototype does not demonstrate the ability to include additional types of data and data sources; 5 = prototype demonstrates flexibility in adding different data types and data from additional sources.
Simplicity (structure and ease of operation) (10%)
1 = Prototype’s structure and operation are complex; 5 = Prototype’s structure is clear and easy to implement, and it is not burdensome on current systems.
Resources for system operation (10%)
1 = Prototype requires a heavy resource burden in terms of cost, training, administration, and infrastructure; 5 = prototype has a low resource burden in terms of cost, training, administration, and infrastructure.
Timeliness (5%)
1 = There is a significant gap in time between data collection and analysis; 5 = the collected data enable near real-time monitoring.
Stratification by Demographics (5%)
1 = Prototype is unable to stratify the data by key demographics; 5 = prototype is able to stratify the data by age, sex, education, and race/ethnicity.