
AI ATAC 3 Challenge: Efficiency & Effectiveness Afforded by Security Orchestration & Automated Response (SOAR) Capabilities

Artificial Intelligence SOAR Tools for Security Operations Center (SOC) Automation

U.S. Navy

Total Cash Prizes Offered: $750,000
Type of Challenge: Software and apps, Technology demonstration and hardware, Analytics, visualizations, algorithms
Partner Agencies | Federal: Oak Ridge National Laboratory (ORNL)
Submission Start: 12/10/2020 08:00 AM ET
Submission End: 02/12/2021 05:00 PM ET


Description

General Overview

The Naval Information Warfare Systems Command (NAVWARSYSCOM) and the Program Executive Office for Command, Control, Communications, Computers, and Intelligence (PEO C4I) are conducting a third instance of the Artificial Intelligence Applications to Autonomous Cybersecurity (AI ATAC, pronounced “AI attack”) Challenge (hereinafter referred to as “the Challenge”). The Navy’s Information Assurance and Cybersecurity Program Office (PMW 130) seeks to enhance the Security Operations Center (SOC) with artificial intelligence and machine learning (AI/ML) tools that automate the detection and prevention of advanced persistent threat (APT) and other cybersecurity campaign activity. Current SOC operations require a tremendous amount of time and effort to triage alerts, link related logs, perform incident response, and document investigations. Through this AI ATAC Prize Challenge, PMW 130 is soliciting, for competitive evaluation, Security Orchestration & Automated Response (SOAR) tools that use AI/ML to enhance their effectiveness. Submissions must be accompanied by a submission description white-paper plus an overview video and a demonstration video describing the SOAR tool and associated technologies.

This Challenge seeks to evaluate the utility of SOAR tools for NAVWAR security operations center (SOC) teams. SOAR tools, as used in this Challenge, are technologies that coordinate, manage, and automate an organization’s SOC and federate that organization’s security processes, workflows, and procedures to provide a centralized, coordinated security posture. SOAR tools should significantly enhance the ability of security operators and analysts to perform their tasks (e.g., alert handling, ticket processing, threat detection, incident response, and post-compromise forensics) by providing machine-powered assistance to human analysts to improve the efficiency and consistency of people and processes. For instance, workflows or playbooks define the set of steps to address a potential threat, and SOAR tools may execute these steps automatically or guide a human analyst through them. SOAR tools using AI/ML should greatly enhance these benefits.
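
To make the playbook concept concrete, the following is a minimal, hypothetical sketch (in Python, not drawn from any particular SOAR product) of how a phishing-triage playbook might be represented and executed, with each step either run automatically or handed to an analyst. All function and field names are illustrative assumptions.

# Minimal sketch of a SOAR-style playbook: an ordered set of steps that a tool
# may run automatically or hand to a human analyst. All names are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Step:
    name: str
    automated: bool                    # True: tool executes the step; False: analyst is guided through it
    action: Callable[[Dict], Dict]     # takes the working case data, returns fields to add

def quarantine_attachment(case):
    return {"attachment_quarantined": True}

def block_sender(case):
    return {"sender_blocked": case["sender"]}

def notify_user(case):
    # In practice an analyst would confirm the wording before anything is sent.
    return {"user_notified": case["recipient"]}

PHISHING_PLAYBOOK = [
    Step("Quarantine suspicious attachment", automated=True, action=quarantine_attachment),
    Step("Block sending address", automated=True, action=block_sender),
    Step("Notify affected user", automated=False, action=notify_user),
]

def run_playbook(playbook, case):
    """Execute automated steps and flag manual ones for an analyst."""
    for step in playbook:
        if step.automated:
            case.update(step.action(case))
            print(f"[auto]   {step.name}: done")
        else:
            print(f"[manual] {step.name}: awaiting analyst")
    return case

if __name__ == "__main__":
    alert = {"sender": "badactor@example.com", "recipient": "user@example.org"}
    run_playbook(PHISHING_PLAYBOOK, alert)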

For the purposes of this competition, SOAR tools are defined as tools that perform, coordinate, and automate the following actions:

  • Ingest logs and alerts from a wide variety of security tools in a SOC;
  • Ingest or provide threat intelligence information from both internal and external sources;
  • Combine, coordinate, and enrich logging, alert, and threat data in the tool’s User Interface (UI);
  • Identify common attack patterns and highlight them in the tool’s UI;
  • Automate and ultimately simplify the alert triage and incident response processes via preset and configurable workflows or playbooks;
  • Facilitate simultaneous or asynchronous (e.g., via sharing or handing off) investigations by different, potentially geographically disparate operators, with access to potentially different types of information; and
  • Automate and expedite documentation of triage, incident response, and forensics (e.g., via ticketing).

This Challenge will measure the performance of these tools to determine how well they would provide improvements to U.S. Navy SOCs across the world.
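
Purely as an informal illustration of the “combine, coordinate, and enrich” behavior listed above, the sketch below joins a single alert with related log records and a threat-intelligence lookup under a made-up common schema; it does not reflect any particular vendor’s implementation.

# Illustrative sketch only: enriching one alert with related log records and a
# threat-intelligence lookup keyed on the source IP. Field names are hypothetical.
THREAT_INTEL = {  # e.g., drawn from an internal or external feed
    "203.0.113.7": {"reputation": "known-bad", "campaign": "example-apt"},
}

def enrich_alert(alert, logs, intel=THREAT_INTEL):
    """Attach related logs and any intel hit to the alert record."""
    related = [rec for rec in logs if rec.get("src_ip") == alert["src_ip"]]
    return {
        **alert,
        "related_logs": related,
        "threat_intel": intel.get(alert["src_ip"], {}),
    }

if __name__ == "__main__":
    alert = {"id": "A-100", "src_ip": "203.0.113.7", "rule": "beaconing"}
    logs = [
        {"source": "firewall", "src_ip": "203.0.113.7", "action": "allow"},
        {"source": "zeek", "src_ip": "203.0.113.7", "conn_state": "SF"},
        {"source": "firewall", "src_ip": "198.51.100.2", "action": "deny"},
    ]
    print(enrich_alert(alert, logs))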

Key Dates

  • 10 December 2020: Kick-off of AI ATAC 3 on Challenge.gov
  • 22 January 2021: Deadline for potential Participants to send an email to indicate intent to participate (details below)
  • 12 February 2021: All submissions due
  • TBD: Candidates selected for Phase 2 will be notified by email

Prizes

Total Cash Prize Purse

$750,000

Prize Breakdown

NAVWARSYSCOM has established $750,000 as the total amount set aside for cash prizes under the Challenge. In the case of a single winner, the entire cash prize will be awarded to the winning entry. In the unlikely event of a tie, NAVWARSYSCOM will determine an equitable method of distributing the cash prizes. If a prize goes to a team of Participants, NAVWARSYSCOM will award the cash prize to the individual/team’s point of contact registered on the Challenge.gov website.

Tax treatment of prizes will be handled in accordance with U.S. Internal Revenue Service guidelines. The winner must provide a U.S. Taxpayer Identification Number (e.g., an SSN, TIN, or EIN) to receive the cash prize.

Non-monetary Prizes

NAVWARSYSCOM may award, pursuant to Title 10 U.S.C. § 2371b, a follow-on production contract or transaction to one or more Participants who successfully demonstrate an effective AI/ML SOAR capability under the Challenge. The Challenge, however, does not in any way obligate NAVWARSYSCOM to procure any of the items within the terms or bounds of this Challenge from any Participant, including from the Challenge winner(s).

Rules

Each Participant (individual Participant, team of Participants, or commercial entity) shall submit one entry in response to this Challenge. Team or commercial entity entries must identify one individual as a primary point of contact (POC) and prize recipient. By submitting an entry, a Participant authorizes his or her name to be released to the media if the Participant wins the prize.

While reading this section, it may be helpful to refer to the “How to Enter” section, which itemizes the detailed requirements of a submission package and provides instructions for submitting it. This Challenge will be conducted in two successive phases.

Phase 1

The Phase 1 submission package will provide the following:

  1. Submission description White-paper (document);
  2. Overview and Demonstration videos (each less than 10 minutes long);
  3. The SOAR technology itself (via VM, hardware, and/or any included and necessary software); and,
  4. A setup guide (document).

See Submission Requirements below for the specifics of each submission component.

Phase 1 will be used to identify the most promising candidate technologies that will be selected for and advanced to Phase 2 evaluation. These selections will be based on the technology’s White-paper document plus the Overview and Demonstration videos. SOAR tool submitters must be available during Phase 1 for any questions from the U.S. Navy and Oak Ridge National Laboratory (ORNL) teams.

PMW 130 will notify all Contestants that advance to Phase 2 using the contact information provided in the submission. The Government may, but is not required to, notify Contestants that do not advance.

Phase 2

Phase 2 will involve more in-depth evaluation of candidate technologies by subject matter experts at ORNL and actual SOC operators from both ORNL and the U.S. Navy. It will require a fifth and final submission component, namely (5) a User Tutorial video. Upon notification of selection for Phase 2, further instructions and a deadline for the User Tutorial video will be provided only to the selected Participants.

In addition, Participants are expected to provide technical assistance to ORNL’s team to ensure expedient and proper setup of the submitted SOAR tool in ORNL’s environment during Phase 2. Remote support (video- or tele-conferences) will be scheduled with the submission team’s technical point of contact (POC) specified in the Phase 1 submission package.

Once technologies are appropriately configured, Phase 2 evaluation will commence. Phase 2 will include a hands-on test of the SOAR tool, conducted by multiple SOC operators, while red-team attack campaigns are administered. The environment will include relevant background traffic and real users populating the test network. Tests involving collaboration of multiple operators using the SOAR tool may be conducted, both for operators working in the same network SOC and when handing off investigations to operators in a different network’s SOC.

Expertise of SOC analysts is particularly important and relevant to Phase 2, and access to SOC expertise is limited; consequently, Phase 2 will be evaluated on a smaller subset of Phase 1 Participants. Additional tests of each SOAR tool, to measure further quantitative and qualitative metrics, will be performed by ORNL research staff.

All questions regarding the Challenge should be sent via email to AIATAC.PRIZE.CHALLENGE@NAVY.MIL no later than 22 January 2021, 1700 EST. Questions submitted after this deadline may not be addressed.

Eligibility Requirements

The Challenge is open to individual Participants, teams of Participants, and commercial entities. As part of the submission, each Participant must either own the intellectual property (IP) in the solution or provide documentation that identifies all IP stakeholders in the submission. The documentation should describe the type of IP and the entity that holds title to the IP. In either case, only one entry for each commercial technology is allowed.

Commercial entities must be incorporated in and maintain a primary place of business in the United States (U.S.). Individual Participants and all members of teams of Participants must be U.S. citizens or U.S. Permanent Residents and be 18 years or older as of 15 December 2020. All Participants (commercial entities or individuals) must have a Social Security Number (SSN), Taxpayer Identification Number (TIN), or Employer Identification Number (EIN) in order to receive a prize. Eligibility is subject to verification before any prize is awarded.

Federal Government employees, PMW 130 support contractors and their employees, and Oak Ridge National Laboratory (ORNL) support contractors and their employees are not eligible to participate in this Challenge. Violation of the rules contained herein, or intentional or consistent activity that undermines the spirit of the Challenge, may result in disqualification. The Challenge is void wherever restricted or prohibited by law.

Terms & Conditions

These terms and conditions apply to all Participants in the Challenge.

The Participant agrees to comply with and be bound by the AI ATAC Challenge Background and Rules (“the Rules”) as well as the Terms and Conditions contained herein. The Participant also agrees that the decisions of the Government, in connection with all matters relating to this Challenge, are binding and final.

The Participant agrees to follow and comply with all applicable federal, state and local laws, regulations, and policies.

This Challenge is subject to all applicable federal laws and regulations. ALL CLAIMS ARISING OUT OF OR RELATING TO THESE TERMS WILL BE GOVERNED BY THE FEDERAL LAWS AND REGULATIONS OF THE UNITED STATES OF AMERICA.

Data Rights

NAVWARSYSCOM does not require that Participants relinquish or otherwise grant license rights to intellectual property developed or delivered under the Challenge. NAVWARSYSCOM requires sufficient data rights and intellectual property rights to use, release, display, and disclose the white paper and tool, but only to the evaluation team members, and only for purposes of evaluating the Participant submission. The evaluation team does not plan to retain entries after the Challenge is completed, but does plan to retain data and aggregate performance statistics resulting from the evaluation of those entries. By accepting these Terms and Conditions, the Participant consents to the use of data submitted to the evaluation team for these purposes.

NAVWARSYSCOM may contact Participants, at no additional cost to the Government, to discuss the means and methods used in solving the Challenge, even if Participants did not win the Challenge. Such contact does not imply any sort of contractual commitment with the Participant.

Because of the number of anticipated Challenge entries, NAVWARSYSCOM cannot and will not make determinations on whether or not third-party materials in the Challenge submissions have protectable intellectual property interests. By participating in this Challenge, each Participant (whether participating individually, as a team, or as a commercial entity) warrants and assures the Government that any data used for the purpose of submitting an entry for this Challenge were obtained legally and through authorized access to such data. By entering the Challenge and submitting the Challenge materials, the Participant agrees to indemnify and hold the Government harmless against any claim, loss or risk of loss for patent or copyright infringement with respect to such third-party interests.

This Challenge does not replace or supersede any other written contracts and/or written challenges that the Participant has or will have with the Government, which may require delivery of any materials the Participant is submitting herein for this Challenge effort.

This Challenge constitutes the entire understanding of the parties with respect to the Challenge. NAVWARSYSCOM may update the terms of the Challenge as needed without notice. Participants are strongly encouraged to check the Challenge.gov website frequently.

If any provision of this Challenge is held to be invalid or unenforceable under applicable federal law, it will not affect the validity or enforceability of the remainder of the Terms and Conditions of this Challenge.

Results of Challenge

The Challenge winners will be announced on the Challenge.gov website and via email. NAVWARSYSCOM will announce the winners via appropriate channels. If winners receive notification prior to the public announcement, they shall agree not to disclose their winning status until after the Government releases its announcement.

Release of Claims

The Participant agrees to release and forever discharge any and all manner of claims, equitable adjustments, actions, suits, debts, appeals, and all other obligations of any kind, whether past or present, known or unknown, that have or may arise from, are related to or are in connection with, directly or indirectly, this Challenge or the Participant’s submission.

Judging Criteria

Judging Panel

Both Phase 1 and Phase 2 of this Challenge will be reviewed by members of the ORNL Cybersecurity Research Group and/or U.S. Navy subject matter experts.

Judging Criteria

Submissions will be judged based on how they address the SOAR capability criteria below:

  1. Ability to ingest custom logging / alerts
  2. Playbook / Workflow Use:
    • Usefulness of existing playbooks
    • Ability of junior operators to effectively use workflows/playbooks
    • Efficiency gains by using workflows/playbooks
  3. Playbook / Workflow customization:
    • Easy and flexible creation of custom workflows/playbooks
    • Shareability of workflows/playbooks
  4. Task Automation:
    • Quickly resolving and documenting false positive alerts
    • Ability to automate common tasks such as responding to phishing attacks and failed user logins
    • Effectiveness of automation
    • Efficiency gains from automation
  5. Documentation Automation:
    • Ability to prepopulate alert and logging data into tickets (illustrated in the sketch following this list)
    • Ability to collect all needed data for elevating events to incidents and handing the incident off to higher-tier / more experienced SOC personnel
  6. Ability to rank / score alerts so that analysts can easily prioritize alerts from most to least significant
  7. Collaboration facilitation:
    • Shareability of in-progress and completed investigations
    • Real time collaboration
    • Asynchronous collaboration (hand offs)
    • Ability to provide metrics on how much time / money the SOAR tool is saving the organization
  8. Unique features: Unique features of the SOAR tool may be considered on the basis of facilitating functionality or adding value to the SOC.
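
As a hedged illustration of the Documentation Automation criterion above (prepopulating alert and logging data into tickets), the sketch below drafts a ticket from an alert and its related logs so that the analyst only reviews and adds findings; the ticket fields are invented for the example and are not a required schema.

# Hedged sketch of documentation automation: prepopulating a ticket from an alert
# and its related logs. The ticket fields are hypothetical, not a required schema.
from datetime import datetime, timezone

def prepopulate_ticket(alert, related_logs):
    """Draft a ticket so the analyst only has to review and add findings."""
    return {
        "title": f'{alert["rule"]} on {alert["host"]}',
        "opened": datetime.now(timezone.utc).isoformat(),
        "severity": alert.get("severity", "unknown"),
        "source_alert_id": alert["id"],
        "evidence": related_logs,   # attached automatically for the analyst
        "analyst_notes": "",        # left for the human to complete
        "status": "triage",
    }

if __name__ == "__main__":
    alert = {"id": "A-2", "rule": "beaconing", "host": "ws01", "severity": "high"}
    logs = [{"source": "zeek", "detail": "periodic connections to 203.0.113.7"}]
    print(prepopulate_ticket(alert, logs))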

To select which tools will advance from Phase 1 to Phase 2, ORNL will conduct a comparative analysis of the submitted white-papers to determine how well they meet the above specified criteria. Navy SOC analysts will review the provided demonstration videos and provide feedback to ORNL on which tools they are most interested in seeing tested based on the SOAR capability criteria listed above.

In Phase 2, ORNL will conduct an analysis of each tool’s ability to rank alerts, ingest data (both non-standard formats and widely supported formats), facilitate playbook creation and execution, automate ticket population and common tasks, and facilitate communication between potentially geographically separated SOCs. The winner(s) will be the submission whose cumulative score across all of these areas is highest.
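
Criterion 6 (ranking or scoring alerts so analysts can easily prioritize them) could be realized in many ways; purely as an illustration, the sketch below scores alerts with a simple weighted heuristic and sorts them from most to least significant. The weights and fields are invented for the example and do not represent an expected implementation.

# Hedged illustration of alert ranking: a simple weighted score, highest first.
# Weights, fields, and the severity scale are invented for this example.
SEVERITY_WEIGHT = {"low": 1, "medium": 3, "high": 7, "critical": 10}

def score(alert):
    s = SEVERITY_WEIGHT.get(alert.get("severity", "low"), 1)
    if alert.get("threat_intel_hit"):
        s += 5                                             # corroborated by threat intelligence
    s += 0.5 * min(alert.get("related_log_count", 0), 10)  # more related evidence, higher priority
    return s

def rank_alerts(alerts):
    return sorted(alerts, key=score, reverse=True)

if __name__ == "__main__":
    alerts = [
        {"id": "A-1", "severity": "medium", "related_log_count": 2},
        {"id": "A-2", "severity": "high", "threat_intel_hit": True, "related_log_count": 8},
        {"id": "A-3", "severity": "low"},
    ]
    for a in rank_alerts(alerts):
        print(a["id"], score(a))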

How to Enter

Entry Instructions

All Participants must email AIATAC.PRIZE.CHALLENGE@NAVY.MIL to indicate their intent to participate prior to 22 January 2021 @ 1700 EST.

Only unclassified information may be part of the submission package. The Phase 1 submission package must include these four items:

  1. Submission description White-paper (Document);
  2. Overview and Demonstration Videos (2 videos);
  3. Corresponding SOAR Technology (VM, hardware, and/or any included and necessary software); and,
  4. Setup Guide (Document)

The evaluation team may, but is not required to, contact Participants if a flaw is found in a submission. However, if any submission items are missing or do not meet the following criteria, the Participant will be disqualified. All submissions must be submitted in their entirety prior to 12 February 2021 @ 1700 EST.

Only those entries selected for Phase 2 testing will be required to submit the final item:

  1. User Tutorial (1 video)

To reiterate, the User Tutorial video is not needed with the initial submission package and will only be required if the submission advances to Phase 2. PMW 130 will notify all Contestants that advance to Phase 2; the Government may, but is not required to, notify Contestants that do not advance.

The submission items for both Phase 1 and Phase 2 must meet the following eligibility criteria.

  1. White-paper Document Submission Requirements

    White-papers must use the submission template, located at Challenge.gov. It provides the framework for an overview of the proposed technology as well as the following elements:

    • Technical approach (e.g. architecture, deployment overview, algorithm description, model description, performance requirements, endpoint footprint, existing results) including descriptions of the AI/ML components; and
    • Benefits and novelty of the approach within the context of existing academic and commercially available technologies.

Sections 1-5 of the white-paper must be collectively no more than six (6) pages in length.

The white-paper template concludes with a checklist to ensure the submission meets the criteria as detailed in the Challenge Purpose section. Technologies that do not meet the Scope section criteria may be disqualified or deemed ineligible. Similarly, the Government will not consider submissions from Participants that do not use the submission template.

Contestants should mark all materials that they consider to be proprietary with a marking of their choosing. The Government will respect and comply with such markings, if present. However, any such marking should not prevent the evaluation team from evaluating the Participant submission (please see the Data Rights section). If any videos are submitted which merit a protective marking, the Participant should note which portions of the video(s) are protectable as part of its submission. Do not submit any classified information.

  2. Demonstration Videos Requirements

    For Phase 1, each Participant must provide two separate videos – an overview video and a demonstration video, each less than 10 minutes long. The videos can be screen recordings of actual tool usage, an explanatory description with narration, filmed demonstrations, or any other means of showing the solution. Details of required video contents are provided below.

    a. The Demonstration video must be at most 10 minutes long and should provide an introduction to the SOAR technology’s platform, including, at a minimum, examples of a user:

    • Viewing events in the SOAR UI,
    • Using a playbook/workflow to handle an incident,
    • Collaborating with other SOC operators on an investigation within the tool,
    • Seeing how tickets are automatically populated by the tool, and
    • Orchestrating multiple, related incidents or issues.

    b. The Overview video must be at most 10 minutes long and should provide an overview of the functions and features of the Participant’s technical solution. The Participant may choose any aspects of their tool to showcase, but areas of interest to the U.S. Navy include:

    • The Threat Intelligence Platform – how the tool knows of likely threat locations and types, and who provides the threat data and how;
    • Worthwhile Automation – automatic identification and resolution of simple but voluminous alerts and warnings (accurate methods to reduce repetitive actions by an analyst);
    • Incorporated AI/ML – AI/ML tools, components, or processes that can identify unusual threats or conditions and alert the operators accordingly;
    • Flexible Integrations – interoperability or extensibility with other tools or platforms and/or APIs used;
    • Metrics – examples of metrics generated, especially those that show time saved, value gained, number of incidents handled, et cetera; and
    • Offline/Online – how well the tool works offline (i.e., not in the cloud, or only on a network without cloud connectivity).

    The videos should demonstrate the SOAR tool’s capability and highlight the value and ease-of-use of the SOAR tool. Any other desired functionality is welcome within the at-most-10-minute videos. Extra videos or any content after 10 minutes in a video will not be reviewed.
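
    As a purely illustrative aside on the “Incorporated AI/ML” item above (and far simpler than the AI/ML sophistication expected of submissions), the toy sketch below flags hosts whose failed-login counts are unusual relative to the rest; the data and threshold are invented.

    # Toy illustration only: flag hosts whose failed-login counts are unusually
    # high relative to the rest. Data and threshold are invented for the example.
    from statistics import mean, pstdev

    def unusual_hosts(failed_logins, threshold=2.0):
        """Return hosts more than `threshold` standard deviations above the mean."""
        counts = list(failed_logins.values())
        mu, sigma = mean(counts), pstdev(counts)
        if sigma == 0:
            return []
        return [host for host, c in failed_logins.items() if (c - mu) / sigma > threshold]

    if __name__ == "__main__":
        observed = {"ws01": 2, "ws02": 1, "ws03": 3, "ws04": 2, "srv01": 1, "srv02": 48}
        print(unusual_hosts(observed))   # srv02 stands out and would raise an alert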

    Video format: Videos must be provided in MP4 file format or provided via a link to be viewed online. If the videos are encrypted or protected, please provide a password.

  3. Technology Software and/or VM Submission Requirements

    The software components for SOAR technology must meet the following criteria:

    • Software Format: The technology may be submitted in one of the following formats:
      • Software that can be run on modern Windows or Linux OSes (e.g., the U.S. Navy is moving toward all Windows 10 and uses Red Hat Enterprise Linux 7.x in some environments)
      • Exported virtual machine (VM) images in .ovf or .ova format that are compatible with VMware ESXi 6.7 (i.e., they MUST readily import into VMware; no conversion should be done)
      • Docker container packages (i.e., images exported as tar archives, e.g., via docker save)
    • Hardware Format:
      • Technologies selected for Phase 2 can provide their technologies pre-installed on hardware server systems or provide co-processing systems for their technology.
    • Cloud and On-Premises (On-prem) Requirements: For testing, submitted technologies will be installed on the Cybersecurity Operations Research Range (CORR) evaluation environment at Oak Ridge National Laboratory (ORNL). Requirements for on-prem and cloud functionality during testing are described here:
      • During a portion of the testing, no connection to the internet or cloud should be expected, and all technologies and licenses should be able to function without cloud connectivity.
      • A portion of testing will evaluate collaboration between SOCs on different networks, each using its own instance of the SOAR tool. All networks will be inside CORR (ORNL’s testbed). To test interaction of the SOCs/SOAR tools across networks, there are two options, as follows:
        • Internet connections will be provided to enable connections to the cloud. Cloud provisioning and configuration must be provided by the submitting team if this option is used.
        • Solutions that do not require internet access but do allow coordination across connected networks within ORNL’s testbed are permitted, in lieu of cloud connections. This simulates connecting multiple SOCs that all reside on a separate enclave (e.g., SIPRNet).

      Only one of the options above is needed. The essential requirement is that multiple networks with different instances of the SOAR tool must be able to collaborate, either using a cloud component (real internet) or an ORNL on-prem capability. All necessary components (software, licenses, configuration/setup instructions, and potentially hardware) must be provided.

    • Instances required: Submissions must provide SOAR software for multiple instances to be configured on CORR, specifically allowing up to 7 SOC operators per network on up to 3 small networks (each network can be assumed to include fewer than 5,000 IPs, with total bandwidth under 10 Gb/s). These instances of the submitted SOAR technology must be able to integrate with each other both within and across networks (e.g., leveraging a virtual cloud or a dedicated cross-network connection) to facilitate collaboration of SOC operators within and across networks using the SOAR tool. Any needed corresponding tools (e.g., a SIEM or Zeek) must be included in the software/VM/hardware submission package and be preconfigured as much as possible. Any remaining configuration that may be specific to the test network MUST be provided along with the User Tutorial document if the SOAR tool is advanced to Phase 2.
    • Licenses required: Software licenses for these instances must be valid through December 31, 2021 and must function properly without connectivity to the internet.
    • Integration Requirements: Submissions must ingest a wide variety of alerts and logs, including but not limited to the following (an illustrative ingestion sketch follows this list):
      • Host logs, including Windows system logs and Linux syslogs
      • Host-based defensive software logs, including
        • Host firewall logs
        • Anti-virus, malware detection, and/or memory access/scripting violation logs
        • Endpoint detection and response logs
        • Endpoint policy compliance software logs
      • Network-level defensive software logs, including
        • Network-level firewall logs
        • Intrusion detection and prevention system output
        • Network-level malware detector alerts
        • Network flow sensors
        • PCAP (packet capture)
        • Zeek logs
      • Vulnerability scanning tool outputs
      • Logs produced by network services, including
        • Active Directory
        • LDAP
        • Kerberos
        • DNS
        • Mail client
        • DHCP
      • Threat intelligence platform information, from both internal and external sources
      • Ticketing or other documentation system(s)
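
      As a hedged illustration of what ingesting heterogeneous logs can involve, the sketch below maps three simplified record formats (a Windows-style event, a Zeek connection record, and a syslog line) into one made-up common schema; field names and parsing rules are assumptions for the example, not a required format.

      # Hedged sketch: normalizing a few heterogeneous log records into a single,
      # hypothetical common schema so a SOAR tool can correlate them. Formats are simplified.
      import re
      from datetime import datetime, timezone

      def from_windows_event(evt):
          # evt: dict parsed from a (simplified) Windows event record
          return {"time": evt["TimeCreated"], "host": evt["Computer"],
                  "source": "windows", "event": evt["EventID"], "detail": evt.get("Message", "")}

      def from_zeek_conn(rec):
          # rec: dict from a (simplified) Zeek conn.log entry
          return {"time": datetime.fromtimestamp(rec["ts"], tz=timezone.utc).isoformat(),
                  "host": rec["id.orig_h"], "source": "zeek", "event": "connection",
                  "detail": f'{rec["id.orig_h"]} -> {rec["id.resp_h"]}'}

      SYSLOG_RE = re.compile(r"^(?P<time>\S+) (?P<host>\S+) (?P<proc>[^:]+): (?P<msg>.*)$")

      def from_syslog(line):
          m = SYSLOG_RE.match(line)
          if m is None:
              raise ValueError(f"unrecognized syslog line: {line!r}")
          return {"time": m["time"], "host": m["host"], "source": "syslog",
                  "event": m["proc"], "detail": m["msg"]}

      if __name__ == "__main__":
          records = [
              from_windows_event({"TimeCreated": "2021-01-15T12:00:00Z", "Computer": "ws01",
                                  "EventID": 4625, "Message": "An account failed to log on."}),
              from_zeek_conn({"ts": 1610712000.0, "id.orig_h": "10.0.0.5", "id.resp_h": "203.0.113.7"}),
              from_syslog("2021-01-15T12:01:02Z srv01 sshd[311]: Failed password for root"),
          ]
          for r in records:
              print(r)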
  4. Setup Guide Document Requirements

    The setup guide should be a concise, easy-to-follow set of instructions for installing the submission’s VM/Software/Hardware, configuring integrations with subsidiary SOC tools, and integrating multiple instances of this SOAR tool. A technical Point of Contact (POC) for assisting with proper setup and configuration should be included in the white-paper, along with the POC’s phone number and email.

    • Setup time requirements: All required components should be submitted as configured and integrated as much as possible. Submissions must provide sufficiently mature software, documentation (setup guide), and support to the ORNL test team to ensure that the submitted SOAR instances can be configured for use on the up-to-three small networks with at most 16 labor hours of support. Submissions requiring more than 16 labor hours for setup and configuration are subject to disqualification. To avoid disqualification, it is critical that the technical POC be available during business hours for the duration of the Challenge.
    • Setup Support: The submission team should be prepared to provide configuration support to the ORNL test team remotely (e.g., via phone or video conference) to facilitate proper setup and configuration. Setup and support (virtual) meetings will be scheduled in advance to accommodate the Challenge schedule.
  5. User Tutorial Video

    The User Tutorial Video is not required with the Phase 1 submission; it will only be required of those Participants whose technologies are selected for Phase 2. Further instructions and deadlines will be communicated to those Participants.

    Submissions selected for Phase 2 testing must provide a training or tutorial video that teaches a new user (SOC operator) how to use the (already set up and configured) SOAR tool. The tutorial should take the user under 2 hours to complete and should assume that the tutorial user has a bachelor’s-degree education level, 0-5 years of experience in cybersecurity or a related field, and no experience with the specific SOAR tool. The tutorial video may further assume the user is using an instance of the submitted SOAR tool. The User Tutorial will be provided directly to both ORNL and U.S. Navy operators. In particular, the following is to be included:

    • Instructions for basic use of the user interface, including how to query or otherwise visualize ingested logging and alerting data and threat intelligence information.
    • Instructions for how to follow a pre-set workflow or playbook for triage and/or incident response.
    • Instructions for how to create a new workflow or playbook.
    • Instructions for how to document the current status of an investigation (e.g., using an integrated ticketing system).
    • Instructions for working collaboratively with other SOC operators.
    • Instructions for handing off a current investigation to other SOC operators.

More details will be provided to the Participants selected for Phase 2.

Video format: Videos must be provided in MP4 file format or provided via a link to be viewed online. If the videos are encrypted, please provide a password.

Submission packages meeting the criteria above must be shipped by trackable, non-postal delivery (FedEx, UPS, DHL, etc.) and received no later than 12 February 2021 at 1700 EST, to the following address:

For courier services (e.g., FedEx, UPS) use:

Cybersecurity Research Group
Oak Ridge National Laboratory
Attn: AI ATAC Evaluation Team
1 Bethel Valley Road
Bldg 6012, Room 209
Oak Ridge, TN 37830

Late submissions will be disqualified and will not be evaluated.

Point of Contact

Have feedback or questions about this Challenge? Email the challenge manager at AIATAC.PRIZE.CHALLENGE@NAVY.MIL.