
Informational Only

This challenge is no longer accepting new submissions.

AI and Networks Advanced Naval Technology Exercise (AINet ANTX) - Artificial Intelligence Prize Challenge

Project Overmatch is the Department of Navy (DON) priority initiative to modernize Naval warfighting networks.

Department of Defense, Department of the Navy, Naval Information Warfare Systems Command (NAVWAR)

Total Cash Prizes Offered: $100,000
Type of Challenge: Software and apps, Ideas, Analytics, visualizations, algorithms, Scientific
Partner Agencies | Federal: Naval Information Warfare Centers Pacific, Naval Information Warfare Centers Atlantic
Submission Start: 06/11/2021 ET
Submission End: 08/13/2021 03:00 PM ET



Consider a scenario where developers, data scientists, machine learning (ML) engineers, etc., have access to training-quality datasets via the Overmatch Software Armory’s (OSA) secure, shore-based, Virtual Desktop Infrastructure (VDI) enabled environment. Pipelined training-quality datasets are stored in Apache Parquet format and readily available for query, discovery, and retrieval from RESTful Services via Swagger UI, Command Line Interface (CLI), and/or the Java software development kit (SDK). The OSA environment also provides access to pertinent artificial intelligence (AI) tooling, e.g. Anaconda, Amazon Web Services (AWS) SageMaker, etc., which is used to perform data engineering, build/train ML models, and/or develop AI-enabled capabilities (e.g. Natural Language Processing / Natural Language Understanding (NLP/NLU), reasoning, planning, etc.).
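As a hedged sketch of the query-and-retrieval workflow described above: the actual OSA REST endpoint paths, hostnames, and filter parameters are not published in this announcement, so every name below is a placeholder. A client helper for building dataset queries might look like this:

```python
from urllib.parse import urlencode

# Hypothetical base URL -- the real OSA service address is provided
# to invited participants and is not part of this announcement.
OSA_BASE = "https://osa.example.mil/api/v1"

def build_dataset_query(dataset: str, **filters) -> str:
    """Build a RESTful query URL for a pipelined Parquet dataset.

    The endpoint layout (/datasets/<name>/records) and the filter
    parameter names are illustrative placeholders only; the real
    routes are documented in the Swagger UI made available to
    invited participants.
    """
    query = urlencode(sorted(filters.items()))
    url = f"{OSA_BASE}/datasets/{dataset}/records"
    return f"{url}?{query}" if query else url

# Retrieval of the Parquet payload itself would then go through the
# Swagger-documented service, e.g. (not runnable without access):
#   import pandas as pd
#   df = pd.read_parquet(fetch(build_dataset_query("tracks", limit=100)))

print(build_dataset_query("tracks", limit=100, fmt="parquet"))
```

The same query could equally be issued through the CLI or the Java SDK mentioned above; the URL construction is shown only to make the dataset-discovery flow concrete.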

The AI Prize Challenge, hereinafter referred to as the “Challenge”, is focused on quickly identifying operationally relevant innovations and emerging AI-enabled technologies. Selected participants will be invited to demonstrate candidate technologies in the Overmatch Software Armory (OSA), which will provide participants VDI access to a secure, government-owned Commercial Cloud Service Provider (CCSP). AI technologies are expected to be containerized and deployed to the Red Hat OpenShift Container Platform (OSCP) version 4.6.

Datasets will be made available to participants in two forms: datasets and filesets. Descriptions of each are provided in the section Judging Criteria - AI iRIL, Datasets, and Filesets.

Key Dates

  • Challenge Registration and Submissions Opens: June 2021
  • Virtual Q&A Session with prospective participants: June 30, 2021
  • White Paper/Quad Chart Deadline: August 13, 2021
  • Review/ Down-selection Boards: August 20 - August 23, 2021
  • Announce invited AINet ANTX Challenge participants: August 24, 2021
  • Virtual Q&A Session for participants: August 31, 2021
  • iRIL Integration Workshop: September 14, 2021
  • Networking Technology Submission: October 22, 2021
  • AINet ANTX (Presentation/Demonstration): November 4, 2021
  • Prize Challenge Board Deliberations: November 5, 2021
  • Announce Prize Challenge Winners: November 10, 2021


Total Cash Prize Pool


Prize Breakdown

  • $75,000 First Prize
  • $25,000 Second Prize



Each participant (individual participant, team of participants, commercial, Government, or academic entity) may submit one entry in response to this challenge. Team entries or commercial entity entries must have an individual identified as the primary point of contact and prize recipient. By submitting an entry, a participant authorizes his or her name and organization to be released to the media if the participant wins the prize.

The entry submission package must include:

  • a white paper that clearly describes the technology to be considered
  • a quad chart in which all content (text, pictures, diagrams, etc.) is pre-approved for public release

In order for an entry to be considered, a white paper and quad chart must be submitted no later than August 13, 2021, in accordance with the submission guidelines.

Each entry will be initially assessed using the criteria set forth under the section Assessment Criteria - AI Prize Challenge Invitations. Following the initial assessment, participants may be invited to advance to the second assessment in the Challenge.

If invited to advance, participants will be required to submit the technology described in the initial white paper and quad chart no later than October 22, 2021. The technology will be assessed per Assessment Criteria - AI Prize Challenge during the presentation and demonstration on November 4, 2021.

In the event the submission package does not include sufficient detail to assess the technical approach, the Government team may request clarification throughout the evaluation.

Instructions for classified submissions may be requested via

White Paper Submission Guidelines:

White papers should provide an overview of the proposed technology, technical approach (e.g. architecture, deployment overview, algorithm descriptions, model descriptions, performance requirements, endpoint footprint, existing results, etc.), the benefits and novelty of the approach within the context of existing academic and commercially available technologies, and the dependencies necessary (e.g. data, platform, network connectivity, etc.) to operate the proposed technology. White papers must be no more than six pages in length. All white papers must be submitted via email to: by 1200 PST August 13, 2021. Where appropriate, use protective markings such as “Do Not Publicly Release – Trade Secret” or “Do Not Publicly Release – Confidential Proprietary Business Information” in the header or footer of the white paper.

Quad Chart Submission Guidelines:

Quad charts shall be submitted in accordance with the provided quad chart template, with all content (text, graphics, pictures, etc.) pre-approved for public release. The single-slide quad chart must be submitted in Microsoft PowerPoint version 16.0 or greater (i.e., Microsoft Office 2016, Microsoft Office 2019, or Microsoft Office 365). The file extension can be either .PPT or .PPTX. Quad charts must be attached to the same email as the white paper, submitted by 1200 PST August 13, 2021. The quad chart shall include a "Distribution A: Approved for Public Release" marking in the footer. The submitted file should have the name format "ORGANIZATIONNAME_QUADCHART(.PPT or .PPTX)". The quad chart should be a concise summary of the full proposal that can be displayed in a single slide. The Challenge team may use portions of the submitted content for event planning, reports, and external communication materials. Please use the Quad Chart template.

Technology Submission Guidelines:

Upon selection, invited participants will be provided access to the data catalog and schemas, documentation, and the Overmatch Software Armory (OSA) environment to build and integrate their proposed technology.

Technology submissions will include:

  • Description of the approach, proposed technology, and relevant Operational use case(s)
  • All source code, build scripts, etc. required for data engineering
  • All source code, build scripts, etc. required for model development, evaluation, etc.
  • Documentation describing how to run the technology within the OSA
  • Ease of use, through scripts or user interface, is highly encouraged.

Q&A sessions and an iRIL integration workshop will be held throughout the Challenge where additional details will be provided that describe the OSA environment and architecture, available datasets, relevant scenarios, and operationally relevant conditions. This announcement will be updated with specifics, but questions regarding the challenge must be submitted via email to no later than 5 days prior to the virtual event.

Questions submitted after the deadline may not be addressed.


The Challenge is open to individual participants, teams of participants, and commercial, Government, and academic entities. Entities must be incorporated in and maintain a primary place of business in the U.S. Individual participants and all members of teams of participants must all be U.S. citizens or U.S. Permanent Residents and be 18 years or older as of August 13, 2021. All participants (entities or individuals) must have a Social Security Number (SSN), Taxpayer Identification Number (TIN), or Employer Identification Number (EIN) in order to receive a prize. Eligibility is subject to verification before any prize is awarded.

Submissions from Federal Government employees or NAVWAR support contractors will be evaluated but are not eligible to receive the cash award associated with the Challenge.

Violation of the rules contained herein or intentional or consistent activity that undermines the spirit of the challenge may result in disqualification. The challenge is void wherever restricted or prohibited by law.

These terms and conditions apply to all participants in the Challenge.

Agreement to Terms:

The participant agrees to comply with and be bound by the Challenge Rules (“the Rules”) as well as the Terms and Conditions contained herein. The participant also agrees that the decisions of the Government, in connection with all matters relating to the Challenge, are binding and final.

Data Rights:

NAVWAR does not require that participants relinquish or otherwise grant license rights to intellectual property developed or delivered under the challenge. NAVWAR does require sufficient data rights/intellectual property rights to use, release, display, and disclose the white paper, quad chart, supporting materials, and the model, but only to the assessment team members, and only for purposes of evaluating the participant submissions. The assessment team does not plan to retain entries after the challenge is completed. It does plan to retain data, aggregate performance statistics, and integration and compatibility factors resulting from the assessment of those entries for further analysis. By accepting these terms and conditions, the participant consents to the use of data submitted to the assessment team for these purposes.

NAVWAR may contact participants, at no additional cost to the Government, to discuss the means and methods used in solving the Challenge. Such contact does not imply any sort of contractual commitment with the participant.


NAVWAR may award, pursuant to Title 10 U.S.C. § 2371b, a follow-on prototype agreement or transaction, or a Limited Procurement for Experimentation pursuant to Title 10 U.S.C. § 2373, to one or more participants who successfully demonstrate an operationally relevant technology during the Challenge. If the selected technologies are not yet mature enough for prototype awards, other agreements, such as a Cooperative Research and Development Agreement (CRADA), may be utilized. This Challenge, however, does not in any way obligate NAVWAR to procure any of the items within the scope of this challenge from the winners. Tax treatment of prizes will be handled in accordance with U.S. Internal Revenue Service guidelines. The winner must provide a U.S. TIN (e.g., SSN or EIN) to receive the cash prize.

In order to gain access to the Overmatch Software Armory (OSA) and pertinent data sets, participants must have access to a DoD Public Key Infrastructure (PKI) / Common Access Card (CAC).

This challenge does not replace or supersede any other written contracts and/or written challenges that the participant has or will have with the Government, which may require delivery of any materials the participant is submitting herein for this challenge effort.


Participants agree to confer, consult, and acquire the consent of the Government prior to the publication or presentation of any Challenge materials, materials associated with the Challenge, or data derived from the Challenge, to assure that no PROPRIETARY INFORMATION or RESTRICTED ACCESS INFORMATION is released, that patent rights are protected, that accuracy is ensured, and that no claims are made on behalf of the Government. Publication and/or presentation may be delayed for a reasonable time to afford needed protection.


The Government will bear the costs associated with granting invited participants secure access to the Overmatch Software Armory via Virtual Desktop Infrastructure (VDI) / SkyDesk accounts. The Government will also provide invited participants an equivalent tranche of AWS storage and compute "credit" to conduct their technology development and integration. However, the Government is not responsible for the costs associated with obtaining a DoD PKI/CAC.

The Government is not responsible for any additional costs incurred by challenge participants, to include the development of white papers, quad charts, presentation materials, the model, travel, technology, demonstrations, OSA integration, iRIL integration, and any other associated costs. All other costs not explicitly exempted above that are incurred throughout the execution of the Challenge are the responsibility of the participants.

Results of the Challenge: Winners will be announced at the conclusion of the AINet ANTX event. NAVWAR will also announce the winners on the website, the NAVWAR LinkedIN page, media outlets, and social media channels.

Release of Claims:

The participant agrees to release and forever discharge any and all manner of claims, equitable adjustments, actions, suits, debts, appeals, and all other obligations of any kind, whether past or present, known or unknown, that have or may arise from, are related to or are in connection with, directly or indirectly, this challenge or the participant’s submission.

Compliance with Laws:

The participant agrees to follow and comply with all applicable federal, state and local laws, regulations and policies.

Governing Law:



For events throughout the execution of the Challenge, the Government will provide qualified experts to evaluate the submissions, access to virtual collaboration and simulation environments, access to approved laboratories, overall event planning, and coordination of information assurance services. Two assessments will be conducted during the Challenge: white paper submissions will be assessed first, prior to invitations to participate; invited participants will then be required to develop and integrate their technologies into the iRIL environment for the second assessment.

Because of the number of anticipated challenge entries, NAVWAR cannot and will not make determinations on whether or not third-party materials in the challenge submissions have protectable intellectual property interests. By participating in this challenge, each participant (whether participating individually, as a team, or as a commercial entity) warrants and assures the Government that any data used for the purpose of submitting an entry for this challenge, were obtained legally and through authorized access to such data. By entering the challenge and submitting the challenge materials, the participant agrees to indemnify and hold the Government harmless against any claim, loss or risk of loss for patent or copyright infringement with respect to such third-party interests.

NAVWAR may update the terms of the challenge from time to time without notice. Participants are strongly encouraged to check the website frequently.

If any provision of this challenge is held to be invalid or unenforceable under applicable federal law, it will not affect the validity or enforceability of the remainder of the terms and conditions of this challenge.


Judging Panel

See: Judging Criteria.

Judging Criteria

Assessment Criteria and Simulation Environment

We seek innovative AI-enabled technologies to optimize Warfighter decision-support at machine speed in relevant, maritime environments. The following sections describe the assessment criteria and the iRIL environment.

Assessment Criteria – AI Prize Challenge Invitations

Conforming white paper and quad chart submissions will be evaluated by a panel of qualified experts using the following criteria, listed in descending order of importance:

  • Operational impact of the technology/engineering innovation in the intended mission scenarios and operational environment
  • Estimated technical performance (accuracy, precision, recall, F1 score, AUC ROC, etc.) of the technology/engineering innovation against a hold-out dataset
  • Integration complexity of the technology/engineering innovation
  • Technical maturity of the technology/engineering innovation

When conducting the assessment, the Government reserves the right to take other significant factors as required into consideration, such as:

  • Limitations to use due to Intellectual Property ownership
  • Ease of fielding the technology on existing legacy Naval platforms (e.g., solutions that require many large software dependencies, require significant compute/memory resources, or run on very specific hardware architectures would be viewed less favorably)
  • Required out-of-band information needed for the AI technology to operate

Based on this initial assessment of the white papers and quad charts, participants will be invited to participate in the challenge. Selected participants will be notified via email communication and will receive invitations to the challenge question and answer (Q&A) Sessions and iRIL Integration Workshops.

Assessment Criteria – AI Prize Challenge

Challenge submissions will be evaluated by a panel of qualified AI experts using the following criteria, in descending order of importance:

  • Operational impact of the technology/engineering innovation in the intended mission scenarios and operational environment
  • Technical performance (see below)
  • Integration complexity of the technology/engineering innovation
  • Technical maturity of the technology/engineering innovation

Technical performance will be judged with metrics appropriate for the submission. Metrics may include:

  • F1 Score
  • Precision
  • Recall
  • Accuracy
  • Other: teams are encouraged to propose other performance metrics relevant to their submission. Include details of scoring methodology and advantages of the metric for Government review.

F1 Score is defined as 2 × (precision × recall) / (precision + recall).

Precision is defined as the number of true positives (TP) divided by the sum of true positives and false positives (FP).

Recall is defined as the number of true positives divided by the sum of true positives and false negatives (FN).

Accuracy is defined as the number of correct predictions divided by the total number of predictions, multiplied by 100 to express it as a percentage.

AUC ROC is defined as the "Area Under the Receiver Operating Characteristic Curve." AUC measures the entire two-dimensional area underneath the ROC curve from (0,0) to (1,1). The ROC curve is a graph showing the performance of a classification model at all classification thresholds; it plots two parameters: True Positive Rate (TPR) = Recall = TP/(TP+FN) and False Positive Rate (FPR) = FP/(FP+TN), where TN is the number of true negatives.
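The metric definitions above map directly to code. A minimal pure-Python sketch, with illustrative confusion-matrix counts chosen only for the example:

```python
def precision(tp: int, fp: int) -> float:
    """Precision = TP / (TP + FP)."""
    return tp / (tp + fp)

def recall(tp: int, fn: int) -> float:
    """Recall (true positive rate) = TP / (TP + FN)."""
    return tp / (tp + fn)

def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 = 2 * (precision * recall) / (precision + recall)."""
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * (p * r) / (p + r)

def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Accuracy (%) = correct predictions / total predictions * 100."""
    return (tp + tn) / (tp + tn + fp + fn) * 100

# Illustrative counts only -- not drawn from any Challenge dataset.
tp, tn, fp, fn = 80, 90, 10, 20
print(f"precision = {precision(tp, fp):.3f}")          # 80/90  -> 0.889
print(f"recall    = {recall(tp, fn):.3f}")             # 80/100 -> 0.800
print(f"F1        = {f1_score(tp, fp, fn):.3f}")
print(f"accuracy  = {accuracy(tp, tn, fp, fn):.1f}%")  # 170/200 -> 85.0%
```

Teams proposing other metrics (per the list above) would supply an analogous scoring definition for Government review.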

Teams are encouraged to include useful data or performance visualizations.

AI iRIL, Datasets, and Filesets


Participants will be provided with VDI access to the OSA environment, which is generically referred to as an iRIL. OSA provides the processes, infrastructure, and tools underpinning these subsystems for application providers. The OSA comprises the following components:

  • Naval Research & Development Establishment (NR&DE) Secure Cloud provides Naval R&D labs and its developers with Infrastructure as a Service, Platform as a Service, security services, and development tools via a CCSP.
  • Overmatch Software Armory (OSA) Tools and Services is a DON Environment to support accelerated application development while making optimal use of infrastructure.
  • Collaborative Staging Environment (CSE) is a DON environment to support accelerated application integration and testing. The CSE provides a cloud-based production-representative test environment for applications to conduct Integration Testing.
  • Agile Core Services (ACS) provides an application infrastructure platform with a diverse set of services to enable easy and rapid delivery of mission applications (an environment with common tools that helps applications deploy faster).
  • Application Arsenal is an enterprise software distribution, installation, and update service that enables delivery of approved software from Ashore to the Tactical Edge.

Technologies that require specific hardware, proprietary components, or on-premises management appliances or consoles will not be considered.

Prize Award Details


NAVWAR has established $100,000 as the total amount set aside for cash prizes under this Challenge. A $75,000 first place cash prize will be awarded to the winning entry, and a $25,000 second place cash prize will also be awarded. In the unlikely event of a tie, NAVWAR will determine an equitable method of distributing the cash prizes. In the event that an entity ineligible to receive the cash award wins first and/or second place, NAVWAR will determine an equitable method of distributing the cash prizes.

If a prize goes to a team of participants, NAVWAR will award the cash prize to the individual/team’s point of contact registered via the challenge website, for further distribution to the team, as the team members see fit.

NAVWAR is executing two simultaneous but independent prize challenges under the AINet ANTX. This prize challenge announcement, and the details within, refers specifically to the AI Prize Challenge. A companion prize challenge, titled the Networks Prize Challenge, has been announced separately on

How to Enter

Entry Instructions

See: Rules.

Submission URL or Email: