
EcoTox TARGET Challenge

Develop high-quality, low-cost tools that assess global gene expression in common aquatic toxicity test organisms

Environmental Protection Agency

Total Cash Prizes Offered: $300,000
Type of Challenge: Scientific
Partner Agencies | Federal: U.S. Army Engineer Research and Development Center
Partner Agencies | Non-federal: Dow, Environment and Climate Change Canada, European Commission Joint Research Centre, Syngenta
Submission Start: 03/19/2020 12:00 PM ET
Submission End: 06/15/2021 11:59 PM ET


This challenge calls for respondents to develop high-quality, low-cost technologies/platforms for evaluating global gene expression in samples from four common aquatic toxicity test organisms: Pimephales promelas (a fish), Daphnia magna (a crustacean), Chironomus dilutus (an insect; formerly Chironomus tentans), and Raphidocelis subcapitata (a green alga). These represent the species, and associated trophic levels, most frequently tested when evaluating the ecological hazards of chemicals. While many extant, viable technologies for measuring global gene expression exist, the significant technological leap, and the challenge to the solver community, is to provide these capabilities at a cost and scale of commercial throughput that can accommodate the analysis of thousands or tens of thousands of samples per year. A target price point is $50 per sample or less.

Key Dates:

  • Informational Webinar: 12:00 p.m. ET, February 18, 2020
  • Registration Closes: 11:59 p.m. ET, March 16, 2020
  • Kick-off Webinar for Registered Solvers: 12:00 p.m. ET, March 19, 2020
  • Submission Start: 12:00 p.m. ET, March 19, 2020
  • Submission End: 11:59 p.m. ET June 15, 2021


The prize for the challenge winner will be $300,000 (USD). One prize will be awarded.


To be eligible to compete for the award, prospective solvers must register no later than March 16, 2020.


  • Eligible: Individuals, or teams from private companies, academic institutions, non-governmental organizations, or independent research or technological institutes. The competition is open to both U.S. and foreign citizens/organizations.
  • Not eligible: U.S. or foreign government organizations.
  • Not eligible: Individuals involved in development of award selection criteria or reference sample generation.

For additional details and rules, visit:

Terms and Conditions

By registering, Solvers agree to the following terms of participation:

  • Solvers will not receive compensation for resources or time invested in addressing the challenge. Only the top-ranked Solution will receive the cash award.
  • Solvers retain their rights to all intellectual property (e.g., details and design of their technology) that may be disclosed to the sponsors over the course of the challenge. Technical details and designs will not be disclosed or published without permission from the technical point of contact named in the registration.
  • Sponsors retain the right to disclose reference sample data, performance criteria, and other evaluation criteria summarized in the technology description template to provide a transparent reporting of how the winning solution was selected.
  • Sponsors retain the right to publish, present, and/or otherwise publicize results of the challenge competition that do not involve disclosure of the Solvers' intellectual property. Solvers will be afforded the opportunity to review publications, presentations, or other publicity to protect against unwanted disclosure of intellectual property.
  • Solvers reserve the right to remove themselves from the competition at any time, up to final submission of results for evaluation, by notifying the sponsor in writing. The technical point of contact must make the request on behalf of their team.
  • Registration for the challenge does not confer any obligation to deliver results. However, any solvers removing themselves from the competition prior to evaluation forfeit the rights to publish results obtained for the reference samples supplied for the competition unless they obtain written consent from the challenge sponsors.
  • Solvers that do not submit their results and technology description template by the submission deadline will be automatically removed from the competition and subject to the same terms as if they had forfeited in writing. The submission deadline may be extended at the discretion of the sponsors, but any extension will apply to all registered solvers.

Judging Criteria

Judging Panel

The judging panel will consist of six subject matter experts selected by the Challenge Sponsors.

Judging Criteria

Submissions will be judged based on data generated for a common set of blinded reference samples provided to all Solvers and an accompanying Technology Description Template.

Scoring Overview

Scoring will be based on the weighted (% of total score) criteria provided in the tables below. Each criterion is scored either on a nominal or fractional scale of 0 to 5, with 0 the lowest and 5 the highest, or on a pass/fail basis, with 5 for pass and 0 for fail.
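As an illustration of the weighted-sum arithmetic described above, a minimal Python sketch; the category names and weights below are invented placeholders, not the challenge's official weights:

```python
# Illustrative weighted-sum scoring. The category names and weights are
# placeholders, not the official challenge weights.

def total_score(scores, weights):
    """Combine per-criterion scores (each 0-5) using fractional weights."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[c] * weights[c] for c in scores)

example_scores = {"quality": 4.0, "economics": 5.0, "coverage": 3.5}
example_weights = {"quality": 0.5, "economics": 0.3, "coverage": 0.2}
print(total_score(example_scores, example_weights))
```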

1. Quality and performance



Quality control: Does the platform contain a quality control system that addresses consistency within and between samples and is consistent with current standards used within various platforms for transcriptomic analyses?

o E.g., Microarray chip-based platforms should contain hybridization controls, redundant positional controls to evaluate edge effects, etc.

o E.g., RNA-seq platforms provide number of reads per sample, base quality score by cycle, nucleotide distribution by cycle, GC content, etc.

o Note – Scoring for this category will require subjective judgment from the judging panel.

Score: 0 to 5


Data collection/extraction: Are the data collection/extraction methods and expression normalization/quantification methods described in adequate detail? Are they compatible with the ToxCast high throughput transcriptomics data analysis pipeline?

Score: 0 to 5


Precision: Precision will be determined by evaluating 1) coefficients of variation of gene expression values across unblinded technical duplicates, 2) correlation analysis of fold-change profiles between selected reference samples, yielding a metric of concordance, and 3) clustering of unblinded reference samples when analyzed together with all conditions. Results from these three analyses will be normalized and merged into a multiplier between 0 and 1 that will be used to determine the total score between 0 and 5.

Score: multiplier × 5
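A sketch of how the three precision analyses might merge into the 0-1 multiplier, assuming each component is first normalized to a 0-1 scale; the announcement does not specify the normalization or the merge, so `precision_score` below is illustrative only:

```python
import statistics

# Sketch of the precision multiplier. Each of the three components is
# assumed to already be normalized to 0-1; merging them with a simple
# mean is an assumption, not the official method.

def cv(values):
    """Coefficient of variation across unblinded technical duplicates."""
    return statistics.stdev(values) / statistics.mean(values)

def fold_change_concordance(fc_a, fc_b):
    """Pearson correlation of two fold-change profiles, clipped to [0, 1]."""
    n = len(fc_a)
    mean_a, mean_b = sum(fc_a) / n, sum(fc_b) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(fc_a, fc_b))
    var_a = sum((a - mean_a) ** 2 for a in fc_a)
    var_b = sum((b - mean_b) ** 2 for b in fc_b)
    r = cov / (var_a * var_b) ** 0.5
    return max(r, 0.0)

def precision_score(cv_component, concordance_component, clustering_component):
    """Average the three 0-1 components; the mean scales the 5-point maximum."""
    multiplier = (cv_component + concordance_component + clustering_component) / 3
    return multiplier * 5
```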


Accuracy: Accuracy will be determined by evaluating the percent concordance between fold-change values determined from Solver data and fold-change values determined during pre-qualification. Results will be used to determine the total score between 0 and 5.

Score: 0 to 5
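The announcement does not define the concordance calculation; one plausible reading, direction-of-change agreement between Solver and reference fold-changes, can be sketched as follows (`percent_concordance` is illustrative, not the official metric):

```python
# Hypothetical percent-concordance metric: fraction of transcripts whose
# fold-change direction (up vs. down) agrees between Solver data and the
# pre-qualification reference values.

def sign(x):
    return (x > 0) - (x < 0)

def percent_concordance(solver_fc, reference_fc):
    """Percent of transcripts with matching fold-change direction."""
    agree = sum(sign(s) == sign(r) for s, r in zip(solver_fc, reference_fc))
    return 100.0 * agree / len(solver_fc)
```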


Quantity of RNA: Was the quantity of reference RNA used per analysis tracked and reported (Y = points awarded; N = 0 points)?

o Results generated using < 1 µg total RNA = 2 pts

o Results generated using < 0.25 µg total RNA = 1 pt

o Results generated using < 0.1 µg total RNA = 1 pt

o Results generated using < 0.01 µg total RNA = 1 pt

Score: 0 to 5
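Reading the four tiers as cumulative (a result generated with less RNA also satisfies the looser thresholds, so the tiers sum to the 5-point maximum), the point award can be sketched as:

```python
# Cumulative reading of the RNA-quantity tiers: 2 pts under 1 µg, plus
# 1 pt each under 0.25, 0.1, and 0.01 µg. No points if the quantity was
# not tracked and reported.

def rna_quantity_points(micrograms, reported=True):
    if not reported:
        return 0
    points = 0
    if micrograms < 1.0:
        points += 2
    if micrograms < 0.25:
        points += 1
    if micrograms < 0.1:
        points += 1
    if micrograms < 0.01:
        points += 1
    return points
```

Under this reading, a result generated with 5 ng (0.005 µg) of total RNA would earn the full 5 points.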


2. Economic and commercial viability



Economic viability (i.e., cost per sample, including downstream data analysis cost):

o Are the per-sample supply, reagent, and sample preparation costs for conducting the analysis and generating the data provided? If proprietary downstream data analysis software is required, include a per-sample adjustment to the overall sample cost based on the software license cost.

o Total per-sample cost:

  • i. $20 or less = 5 pts
  • ii. >$20-$30 = 4 pts.
  • iii. >$30-$50 = 3 pts.
  • iv. >$50-$75 = 2 pts.
  • v. >$75-$100 = 1 pt.
  • vi. >$100 = 0 pts.

Score: 0 to 5
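The cost tiers map directly to points; a small sketch of that mapping, with boundary values assigned to the higher-scoring tier per the ">" notation above:

```python
# Per-sample cost (USD) to points, following the tiers above: $20 or
# less scores 5, and each ">" boundary starts the next tier down.

def cost_points(cost_per_sample):
    if cost_per_sample <= 20:
        return 5
    if cost_per_sample <= 30:
        return 4
    if cost_per_sample <= 50:
        return 3
    if cost_per_sample <= 75:
        return 2
    if cost_per_sample <= 100:
        return 1
    return 0
```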


Commercial viability and throughput capability: Is there a reasonable demonstration/description of how and when the Solver would be able to meet the potential throughput requirements of high-throughput (HTP) sample generation?

Score: 0 to 5


3. Coverage



Approach and annotation:

o Is the approach taken for detection and quantification of transcript expression adequately described (e.g., whether the platform employs a targeted or non-targeted analysis, and the general means by which it detects and quantifies transcript presence and abundance)?

o Are annotation files provided with each platform/species that contain the required information and link to the data files?

Score: 0 to 5


Transcriptome coverage: What proportion of the transcriptome do the platforms cover relative to the pre-qualification standards? The mean percent coverage will be calculated across the four species’ platforms and used as a multiplier to determine the point value.

Score: multiplier × 5
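The coverage multiplier is the mean of the four per-species coverage fractions; a sketch with invented coverage values (the real values would come from comparison against the pre-qualification standards):

```python
# Mean percent transcriptome coverage across the four species' platforms,
# used as a 0-1 multiplier on the 5-point maximum. Coverage values here
# are invented for illustration.

coverage = {
    "Pimephales promelas": 0.92,
    "Daphnia magna": 0.88,
    "Chironomus dilutus": 0.80,
    "Raphidocelis subcapitata": 0.84,
}

multiplier = sum(coverage.values()) / len(coverage)
score = multiplier * 5
```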


Species coverage: Did the solvers provide a platform and reference sample data for all four species?

Score: Y or N (pass/fail)


*Eligible submissions will include platforms, data and associated required information for all four species.

How To Enter

To be eligible to compete for the award, prospective solvers must register no later than March 16, 2020.

Submission: Instructions on submission of entries will be provided to registered participants during the Kick-off Webinar on March 19, 2020.

Point of Contact

Have feedback or questions about this challenge? Send the challenge manager an email.