This challenge is no longer accepting new submissions.
Agile Robotics for Industrial Automation Competition (ARIAC)
Build your own robot control software and win up to $10,000!
National Institute of Standards and Technology (NIST)
Type of Challenge: Software and apps, Scientific
Submission Start: 04/01/2021 ET
Submission End: 04/23/2021 11:59 PM ET
In June 2017, the National Institute of Standards and Technology (NIST) held the first Agile Robotics for Industrial Automation Competition (ARIAC). The goal of the competition is to test the agility of industrial robot systems, enabling industrial robots on the shop floor to be more productive, more autonomous, and to require less time from shop floor workers. For the last three years, the competition has offered a cash prize to motivate and expand participation. This is the fifth year of the competition.
In this context, agility is defined broadly to address:
- Failure identification and recovery, where robots can detect failures in a manufacturing process and automatically recover from those failures
- Automated planning, to minimize (or eliminate) the up-front robot programming time when a new task is introduced
- Fixtureless environment, where robots can sense the environment and perform tasks on parts that are not in predefined locations
Participants are required to develop a robot control system (software) to control a robot in a simulated environment. Gazebo (gazebosim.org), an open-source robotics simulation environment, will be used as the testing platform, and the Robot Operating System (ROS), an open-source set of software libraries and tools, will be used to define the interfaces to the simulation system.
- Accepting Qualifier Submissions Through: April 23, 2021
- Qualifier Begin: April 26, 2021
- Qualifier End: April 30, 2021
- Competition Testing Period Begins: May 24, 2021
- Competition Testing Period Ends: 5:00 PM EDT (UTC-04:00), May 28, 2021
- Announcement of Cash Prize Winners: early June 2021
Total Cash Prize Pool: $17,500
- $10,000 First Prize
- $5,000 Second Prize
- $2,500 Third Prize
Up to three winners will be selected. The Prize Purse for the ARIAC Prize Competition is a total of $17,500. The number of winners will be fewer than three if fewer than three Participants qualify for the competition. The Prize Purse may increase, but will not decrease. Any increases in the Prize Purse will be posted on the Event Website. NIST reserves the right to announce additional winners of non-cash prizes.
The Official Rules are posted on the Event Website, http://www.nist.gov/ariac.
Eligibility to participate in the Competition
Participation in the ARIAC Prize Competition is open to ALL; however, not all participants are eligible to win cash prizes as explained in the next section.
Each Competition Participant (individual, team, or legal entity) is required to register on the ARIAC website. There shall be one Official Representative for each Competition Participant. The Official Representative must provide a name, username (which may serve as a team or affiliation name), email address, and affirm that he/she has read and consents to be governed by the Competition Rules. Multiple registrations per Participant are NOT allowed. At NIST’s discretion, any violation of this rule will be grounds for disqualification from the Competition. Multiple individuals and/or legal entities may collaborate as a team to submit a single entry (Participant’s code submission to a qualifier or final), in which case the designated Official Representative will be responsible for meeting all entry and evaluation requirements.
Participation is subject to all U.S. federal, state and local laws and regulations. Participants are responsible for checking applicable laws and regulations in their jurisdiction(s) before participating in the prize challenge to ensure that their participation is legal. The Department of Commerce, National Institute of Standards and Technology shall not, by virtue of conducting this prize challenge, be responsible for compliance by Participants in the prize challenge with Federal Law including licensing, export control, and nonproliferation laws, and related regulations.
Participants must not be suspended, debarred, or otherwise excluded from doing business with the Federal Government. Individuals entering on behalf of or representing a company, institution or other legal entity are responsible for confirming that their entry does not violate any policies of that company, institution or legal entity. Any other individuals or legal entities involved with the design, production, execution, distribution or evaluation of ARIAC are not eligible to participate.
Once registered, Participants will have access to the interfaces needed to participate in the competition through the ARIAC website. NIST will create an account to allow the Participants to upload their submissions to the qualifier. Modifications and improvements to a Participant’s control system during the qualifying period are to be expected, but the Participant should have one final control system ready for the final competition.
Eligibility to win a Cash Prize
To be eligible for a cash prize:
- A Participant (whether an individual, team, or legal entity) must have registered to participate and complied with all of the requirements under section 3719 of title 15, United States Code as contained herein.
- At the time of entry, the Official Representative (individual or team lead, in the case of a group project) must be age 18 or older and a U.S. citizen or permanent resident of the United States or its territories.
- In the case of a private entity, the business shall be incorporated in and maintain a primary place of business in the United States or its territories.
- Participants may not be a Federal entity or Federal employee acting within the scope of their employment. NIST employees are not eligible to participate. Non-NIST Federal employees acting in their personal capacities should consult with their respective agency ethics officials to determine whether their participation in this Competition is permissible.
- A Participant shall not be deemed ineligible because the Participant consulted with Federal employees or used Federal facilities in preparing its submission to the ARIAC Prize Competition if the Federal employees and facilities are made available to all Participants on an equitable basis.
In addition, interested Participants who do not meet the eligibility requirements to win a prize (i.e., individuals who are neither a U.S. citizen nor a permanent resident of the United States, or non-US-based entities) are encouraged to participate in the Competition. They are invited to register on the ARIAC website and download the training material. The performance obtained by these Participants will be displayed on the ARIAC website in the same manner as the performance of Participants who are eligible to win cash prizes. Participants found to be ineligible for cash prizes may still be publicly recognized. If the place or rank normally allotted a prize award is held by an ineligible Participant, the cash prize will be awarded to the next eligible Participant in the ranking. Individuals on the denied persons list or from entities or countries sanctioned by the United States Government are not eligible to participate.
The following terminology is frequently used in this document:
- Order: A list of parts and their goal location on a tray.
- Part: One element of an order.
- Tray: A surface that holds parts.
- Kit: A tray and set of parts that make up an order.
ARIAC requires Participants to complete a series of tests set in an industrial scenario, based around building kits made up of particular parts. The robot system will work within the environment specified in the work environment section.
There are three test scenarios, all of which involve moving parts from a supply location to a tray. The possible supply locations are a conveyor belt and stationary bins. Challenges will be introduced in each scenario. Scenarios may include, but are not limited to:
- Scenario 1: Baseline Kit Building
The first scenario is intended as a baseline set of tasks for the other test methods to be compared against. The task for this scenario is to pick specific parts and place them on a tray. The robot arms will receive an “Order” that details the list of parts and their target locations. Orders are covered in more detail in the Orders section.
- Scenario 2: Dropped Part
The task for Scenario 2 is identical to Scenario 1, however one or more parts will drop from the robots’ gripper. The robots will need to recover after dropping a part and complete the given Order. Recovery could entail picking up the dropped part or fetching a new part.
- Scenario 3: In-Process Kit Change
While the robots are in the middle of assembling a kit, a new high priority order will be received that needs to be completed as fast as possible. The robots will need to decide how best to complete this new order, and then complete the previous order.
The competition will consist of 15 trials: 5 trials of each of the 3 scenarios. Each trial will receive a score based on completion and efficiency metrics outlined in the Scoring section.
The simulation environment is a representation of an industrial kitting work cell and assembly workstations. It contains one robot arm on a linear rail, one robot on a gantry system, a conveyor belt, parts bins, trays, and assembly stations. The conveyor belt is a 1 m wide plane that transports objects across the work environment at a fixed speed of roughly 0.2 m/s. Parts continuously appear on the belt for the duration of the trial and are automatically removed when they reach the end of the belt. Teams can control the conveyor belt during development, but not during the final competition.
There are eight part bins that may be used for building kits. Parts in these bins will not be replaced once used.
There are two robot arms: one mounted on a single base on a linear actuator that runs parallel to the conveyor belt, the other hanging from a two-dimensional gantry that operates both parallel and perpendicular to the conveyor belt. The linear actuator measures 4 m.
Four Automated Guided Vehicles (AGVs) are located along the linear actuator. Kits are built on top of these AGVs. A team will programmatically signal the AGVs when the kits are ready to be taken away. The signaled AGV will depart for the indicated Assembly Station.
The trays used for assembling kits are flat trays measuring 0.5 x 0.7 m.
The two robot arms used in each trial will be Universal Robots UR10.
The robot arms’ positions are controlled through the linear actuator or gantry on which they are mounted.
The end of each arm is equipped with a vacuum gripper. The vacuum gripper is controlled in a binary manner (on/off) and reports whether or not it is successfully gripping an object.
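The binary gripper interface can be sketched as a minimal state model. The class and method names below are illustrative only; in the actual competition the gripper is commanded and monitored through the ROS interfaces provided with the simulation.

```python
class VacuumGripper:
    """Minimal model of a binary vacuum gripper (hypothetical interface)."""

    def __init__(self):
        self._enabled = False   # suction on/off: the only command available
        self._attached = False  # reported state: is an object being gripped?

    def enable(self, part_in_reach: bool) -> None:
        """Turn suction on; attachment succeeds only if a part is in reach."""
        self._enabled = True
        self._attached = part_in_reach

    def disable(self) -> None:
        """Turn suction off; any held part is released (or dropped)."""
        self._enabled = False
        self._attached = False

    @property
    def is_gripping(self) -> bool:
        """The gripper reports whether it currently holds an object."""
        return self._attached
```

Because the only feedback is this single boolean, a control system must poll it to detect the dropped-part scenario: a transition from gripping to not gripping while suction is still enabled signals a drop.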
A Participant can place sensors around the environment. Each sensor has a cost that factors into the final score.
A sample of available sensors may include:
- Break beam: reports when a beam is broken by an object. It does not provide distance information.
- Laser scanner: provides an array of distances to a sensed object.
- Cognex logical camera: provides information about the pose and type of all models within its field of view.
- Proximity: detects the range to an object.
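Because every sensor placed in the environment adds to the system cost, a team's total sensor cost is simply the sum of its chosen sensors' prices. The prices below are hypothetical placeholders; the actual values are published on the Event Website (nist.gov/ariac).

```python
# Hypothetical per-sensor prices; actual values are posted on nist.gov/ariac.
SENSOR_PRICES = {
    "break_beam": 100,
    "laser_scanner": 300,
    "logical_camera": 500,
    "proximity": 100,
}

def total_sensor_cost(sensors: list[str]) -> int:
    """Sum the cost of every sensor a team places in the environment."""
    return sum(SENSOR_PRICES[s] for s in sensors)

# A configuration with one logical camera and two break beams:
print(total_sensor_cost(["logical_camera", "break_beam", "break_beam"]))  # 700
```

Cheaper sensors such as break beams give less information than a logical camera, so sensor selection is a trade-off between cost and the perception quality the control system needs.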
An order is an instruction containing kits for the robot system to complete.
Each order will specify the kit to be assembled, i.e. the list of parts to be put in the kit.
Each specified part has the following structure:
- The type of part.
- The position and orientation of the part on the tray.
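An order can therefore be modeled as a list of part specifications, each pairing a part type with its goal pose on the tray. The field and type names below are illustrative, not the actual ROS message definition used by the competition software.

```python
from dataclasses import dataclass

@dataclass
class PartSpec:
    """One part in an order: its type and goal pose on the tray."""
    part_type: str
    position: tuple        # (x, y, z) on the tray, in meters
    orientation: tuple     # quaternion (x, y, z, w)

@dataclass
class Order:
    """A kit to assemble: the list of parts and their goal poses."""
    order_id: str
    parts: list

# A hypothetical two-part order:
order = Order("order_0", [
    PartSpec("gear_part", (0.1, 0.2, 0.0), (0.0, 0.0, 0.0, 1.0)),
    PartSpec("piston_rod", (0.3, 0.1, 0.0), (0.0, 0.0, 0.0, 1.0)),
])
```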
Above each AGV is a quality control sensor that detects faulty parts. If faulty parts are detected while teams are filling trays, those parts should be removed from the tray and replaced with another part of the same type. Faulty parts are considered unwanted parts: they will not count for any points when the kit is submitted, and they will cost teams the all-parts bonus if left in trays.
Each trial will consist of the following steps:
- The robots programmatically signal that they are able to begin accepting orders.
- The first Order (Order 1) is sent to the robots.
- A fixed amount of time is allowed to complete the order.
- In the case of the Dropped Part testing method, up to three parts will be forcibly dropped from the gripper.
- In the case of the In-Process Kit Change testing method, a new Order (Order 2) will be issued that is of higher priority than the previously issued Order 1. When Order 2 is complete, building of Order 1 is to resume.
- The robots signal programmatically when a kit is complete and ready for quality control.
- The robot system will be notified that the trial is over. The trial is over when time runs out or all Orders have been fulfilled.
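The trial flow above, including the in-process kit change, can be sketched as a simple priority loop. This is purely illustrative: the real interaction happens over ROS topics and services, and a real controller would preempt work mid-build rather than sort orders up front.

```python
def run_trial(orders):
    """Complete orders, always working on the highest-priority pending order.

    Each order is a (priority, order_id) pair; a higher number preempts
    lower-priority work, which resumes once the urgent order is done.
    """
    pending = sorted(orders, key=lambda o: -o[0])
    completed = []
    for priority, order_id in pending:
        # ... pick parts, place them on the tray, replace faulty parts ...
        completed.append(order_id)  # signal kit complete for quality control
    return completed

# Order 2 arrives mid-trial with higher priority than Order 1:
print(run_trial([(1, "order_1"), (3, "order_2")]))  # ['order_2', 'order_1']
```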
TERMS AND CONDITIONS
Intellectual Property Rights
Other than as set forth herein, NIST does not make any claim to ownership of your entry or any of your intellectual property or third party intellectual property that it may contain. By participating in the ARIAC Prize Competition, you are not granting any rights in any patents or pending patent applications related to your entry; provided that by submitting an entry (i.e., participating in the Competition), you are granting NIST certain limited rights as set forth herein.
However, following the competition, Participants in the ARIAC competition will be invited, but not required, to upload their code to a NIST central repository or to otherwise make it available on a publicly accessible repository or website of their choice, where it will be made available on an unrestricted, irrevocable, royalty-free basis to the community as a whole, with the goal of furthering the state of the art in robot agility. The decision of whether or not to upload code will not impact the scoring.
By submitting an entry, you grant to NIST the right to review and score your entry as described in the section “Basis on which finalists and winners will be selected,” to describe your entry in connection with any materials created in connection with the Competition including making public your name, scores, and the date/time that score was obtained at the Event Website, and to have the Judges, Competition administrators, and the designees of any of them, review your entry.
By submitting an entry in the ARIAC Prize Competition, you grant a non-exclusive, irrevocable, paid up right and license to NIST to use your name, likeness, biographical information, image, any other personal data submitted with your Entry and the contents in your entry, in connection with the ARIAC Prize Competition for any purpose, including promotion and advertisement of the competition and future challenges.
By submitting an entry in the ARIAC Prize Competition, you grant a royalty-free, non-exclusive, irrevocable, worldwide license to NIST to display publicly and use for promotional purposes your entry and its contents, including but not limited to the video recording of the robot motion in the simulated environment produced by the submission of your entry (“demonstration license”). This demonstration license includes the right to post or link to your entry and its contents on the Department of Commerce, National Institute of Standards and Technology websites, the challenge website, and other websites that NIST may deem appropriate, and to include use of your entry and its contents in any other media, electronic, video or print, worldwide.
You agree that nothing in these Rules grants you a right or license to use any names or logos of NIST or the Department of Commerce, or any other intellectual property or proprietary rights of NIST or the Department of Commerce or their employees or contractors. You grant to NIST the right to include your name and your company or institution name and logo (if your entry is from a company or institution) as a Participant on the Event Website and in materials from NIST announcing Winners, Finalists, or Participants in the Competition. Other than these uses or as otherwise set forth herein, you are not granting NIST any rights to your trademarks.
Entries containing any matter which, in the sole discretion of NIST, is indecent, defamatory, in obvious bad taste, which demonstrates a lack of respect for public morals or conduct, which promotes discrimination in any form, which shows unlawful acts being performed, which is slanderous or libelous, or which adversely affects the reputation of NIST, will not be accepted. If NIST, in its sole discretion, finds any entry to be unacceptable, then such entry shall be deemed disqualified and will not be evaluated or considered for award. NIST shall have the right to remove any content from the Event Website in its sole discretion at any time and for any reason, including, but not limited to, any online comment or posting related to the Competition.
To advance the goals of NIST’s robot agility research and other relevant technical programs, following the competition, NIST may negotiate a license for the use of intellectual property developed by a registered Participant in the ARIAC Prize Competition.
By making a submission to the ARIAC Prize Competition, you agree that no part of your submission includes any confidential or proprietary information, ideas or products, including but not limited to information, ideas or products within the scope of the Trade Secrets Act, 18 USC §1905. Because NIST will not receive or hold any submitted materials “in confidence,” it is agreed that, with respect to your entry, no confidential or fiduciary relationship or obligation of secrecy is established between NIST and you, your entry team, the company or institution you represent when submitting an entry, or any other person or entity associated with any part of your entry.
By participating in the ARIAC Prize Competition, you represent and warrant that all information you submit is true and complete to the best of your knowledge, that you have the right and authority to submit the data on your own behalf or on behalf of the persons and entities that you specify, and that your data uploaded to the ARIAC Website: (a) is your own original work, or is submitted by permission with full and proper credit given within your entry; (b) does not contain confidential information or trade secrets (yours or anyone else’s); (c) does not knowingly violate or infringe upon the patent rights, industrial design rights, copyrights, trademarks, rights in technical data, rights of privacy, publicity or other intellectual property or other rights of any person or entity; (d) does not contain malicious code, such as viruses, malware, timebombs, cancelbots, worms, Trojan horses or other potentially harmful programs or other material or information; (e) does not and will not violate any applicable law, statute, ordinance, rule or regulation, including, without limitation, United States export laws and regulations, including, but not limited to, the International Traffic in Arms Regulations and the Department of Commerce Export Regulations; and (f) does not trigger any reporting or royalty or other obligation to any third party.
Limitation of Liability
By participating in the ARIAC Prize Competition, you agree to assume any and all risks and to release, indemnify and hold harmless NIST from and against any injuries, losses, damages, claims, actions and any liability of any kind (including attorneys’ fees) resulting from or arising out of your participation in, association with or submission to the ARIAC Prize Competition (including any claims alleging that your entry infringes, misappropriates or violates any third party’s intellectual property rights). In addition, you agree to waive claims against the Federal Government and its related entities, except in the case of willful misconduct, for any injury, death, damage, or loss of property, revenue, or profits, whether direct, indirect, or consequential, arising from your participation in the ARIAC Prize Competition, whether the injury, death, damage, or loss arises through negligence or otherwise.
NIST is not responsible for any miscommunications such as technical failures related to computer, telephone, cable, and unavailable network or server connections, related technical failures, or other failures related to hardware, software or virus, or incomplete or late entries. Any compromise to the fair and proper conduct of the ARIAC Prize Competition may result in the disqualification of an entry or Participant, termination of the ARIAC Prize Competition, or other remedial action, at the sole discretion of NIST. NIST reserves the right in its sole discretion to extend or modify the dates of the ARIAC Prize Competition, to modify the test and training data provided at the Event Website, and to change the terms set forth herein governing any phases taking place after the effective date of any such change. By entering, you agree to the terms set forth herein and to all decisions of NIST and/or all of their respective agents, which are final and binding in all respects.
NIST is not responsible for: (1) Any incorrect or inaccurate information, whether caused by a Participant, printing errors, or by any of the equipment or programming associated with or used in the ARIAC Prize Competition; (2) unauthorized human intervention in any part of the entry process for the ARIAC Prize Competition; (3) technical or human error that may occur in the administration of the ARIAC Prize Competition or the processing of entries; or (4) any injury or damage to persons or property that may be caused, directly or indirectly, in whole or in part, from a Participant’s participation in the ARIAC Prize Competition or receipt or use or misuse of an Award. If for any reason an entry is confirmed to have been deleted erroneously, lost, or otherwise destroyed or corrupted, the Participant’s sole remedy is to submit another entry in the ARIAC Prize Competition.
Termination and Disqualification
NIST reserves the authority to cancel, suspend, and/or modify the ARIAC Prize Competition, or any part of it, if any fraud, technical failures, or any other factor beyond NIST’s reasonable control impairs the integrity or proper functioning of the ARIAC Prize Competition, as determined by NIST in its sole discretion.
NIST reserves the right to disqualify any Participant it believes to be tampering with the entry process or the operation of the ARIAC Prize Competition or to be acting in violation of any applicable rule or condition. Any attempt by any person to undermine the legitimate operation of the ARIAC Prize Competition may be a violation of criminal and civil law, and, should such an attempt be made, NIST reserves the authority to seek damages from any such person to the fullest extent permitted by law.
Verification of Potential Winner(s)
All potential winners are subject to verification by NIST, whose decisions are final and binding in all matters related to the ARIAC Prize Competition.
Potential winner(s) must continue to comply with all terms and conditions of the ARIAC Prize Competition Rules described herein, and winning is contingent upon fulfilling all requirements. In the event that a potential winner, or an announced winner, is found to be ineligible or is disqualified for any reason, NIST may make an award, instead, to another Participant.
Privacy and Disclosure under FOIA
Except as provided herein, information submitted throughout the ARIAC Prize Competition will be used only to communicate with Participants regarding entries and/or the ARIAC Prize Competition. Participant entries and submissions to this competition may be subject to disclosure under the Freedom of Information Act (“FOIA”).
15 U.S.C. 3719, as amended.
The NIST Director will appoint a panel of three qualified judges. The Judges will be robot agility and/or simulation experts from inside and/or outside of NIST. The Judges will determine winners according to the Judging Criteria described herein. The Judges may not have personal or financial interests in, or be an employee, officer, director, or agent of, any entity that is a registered Participant in the Competition and may not have a familial or financial relationship with an individual who is a registered Participant. In the event of such a conflict, a Judge must recuse himself or herself and a new Judge may be appointed.
Basis on which Finalists and Winners will be selected
A qualifier will be performed approximately one month before the finals. The automated metrics described below will be used in the qualifier to determine which team(s) progress to the finals. As described in the Judging Criteria below, these same automated metrics will also be used in the finals, in conjunction with the judging panel.
A portion of the scores will be automatically calculated, shared with the Participants, and publicly posted for each Trial at the conclusion of the Finals (after all Trials are run) as a combination of cost and performance metrics, described below in the Trial Score Calculation section. A Participant’s final automated score is the sum of scores for each of the 15 competition Trials.
The following values are calculated for a Participant’s system setup. As Participants are to use the same system setup for all Trials, the values will remain unchanged between Trials.
The choices made by the competitors (choices of sensors) will have a specific cost associated with them. The final cost for individual sensors will be made available to the Participants at least two weeks before the Finals. For planning purposes, Participants should use the values laid out on the Event Website: nist.gov/ariac.
The sum of the cost choices made by each team will be compared to the baseline cost set by the competition administrators (which will be announced on the Event Website: nist.gov/ariac) and converted into a cost factor for the scoring. The team cost is designated TC, and the baseline cost is designated BC. The cost factor is then calculated as:
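The exact formula is given in the Official Rules on the Event Website. One plausible form, consistent with the description above and used here purely as an assumption, divides the baseline cost by the team cost, so cheaper sensor configurations score higher:

```python
def cost_factor(team_cost: float, baseline_cost: float) -> float:
    """Assumed cost factor CF = BC / TC: spending less than the baseline
    yields a factor above 1, spending more yields a factor below 1.
    The actual formula is defined in the Official Rules."""
    return baseline_cost / team_cost

# A team spending half the baseline would double its factor under this form:
print(cost_factor(team_cost=1000, baseline_cost=2000))  # 2.0
```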
Performance metrics that cover both completion and efficiency are calculated for each Trial separately. Completion captures the quality of the orders fulfilled (that the kit trays contain the correct products in the correct position/orientations and colors, if applicable); efficiency captures the responsiveness in fulfilling orders (that the orders were filled quickly).
Since each trial may contain more than one order, each order receives its own set of scores. Each order consists of a list of products and a prescribed position and orientation for each product within the box. The completion score, CS, is a combined score for each order, starting with 1 point for each product placed in the box. Each part of the correct color (if applicable) earns an additional point, and each product within 3 cm of the correct position and within 0.1 radians of the correct orientation earns another. Finally, since the main focus of the competition is fulfilled orders, any order with all of its products within the confines of the box, of the correct color (if applicable), and in the correct position/orientation receives an additional point per product. An order kj with i products therefore has a maximum score of 4i points.
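The completion-score rules above can be sketched directly in code. Each product is represented here by three boolean checks (field names are illustrative); faulty parts would simply fail all checks and forfeit the all-parts bonus.

```python
def completion_score(parts):
    """Completion score CS for one order, per the rules above.

    `parts` is a list of dicts with boolean fields per placed product:
    present (in the box), correct_color (if applicable), and correct_pose
    (within 3 cm and 0.1 rad). Each check earns 1 point, and every
    product earns 1 bonus point if the whole order is perfect, so an
    order with i products maxes out at 4i points.
    """
    score = sum(p["present"] + p["correct_color"] + p["correct_pose"]
                for p in parts)
    all_perfect = all(p["present"] and p["correct_color"] and p["correct_pose"]
                      for p in parts)
    if all_perfect:
        score += len(parts)  # all-parts bonus
    return score

perfect = [{"present": True, "correct_color": True, "correct_pose": True}] * 2
print(completion_score(perfect))  # 8, i.e. 4i for i = 2
```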
The efficiency of the approaches chosen by the teams is accounted for in the scoring using the time each team takes to complete each trial. The time is counted in seconds from the start of the trial until the first kit is completed and is designated T1. In trials with multiple orders, the time Ti is counted in seconds from the time the ith order is sent to the team. For each trial, the average time across teams (ATi) is compared to the individual team time to calculate the Efficiency Factor:
If a team’s system times out for a trial, the efficiency factor for the trial is set to 0 and the trial time is not used to calculate the average times for that trial.
For multi-order trials where one of the orders is a high priority order, an additional multiplier, h, initially set to 3, is applied with the efficiency factor to give a higher proportion of the score to that order. The final h value will be made available to the Participants at least two weeks before the finals.
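The description above suggests the efficiency factor compares the cross-team average time to the individual team's time. The ratio below is an assumed form; the exact definition is given in the Official Rules.

```python
def efficiency_factor(team_time: float, average_time: float,
                      timed_out: bool = False) -> float:
    """Assumed efficiency factor EF_i = AT_i / T_i: finishing faster than
    the cross-team average yields a factor above 1. A timed-out trial
    scores 0, and its time is excluded from the average for that trial."""
    if timed_out:
        return 0.0
    return average_time / team_time

# A team finishing in 50 s against a 100 s average doubles its factor:
print(efficiency_factor(team_time=50, average_time=100))  # 2.0
```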
Trial Score Calculation
All of the scoring factors described above are combined as shown below to calculate the score for each trial:
For trials where there is no second, higher-priority order, the following modifications are used: CSK2 = CSK1, EF2 = 0. This results in the simplified scoring equation:
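The combined trial-score equation appears in the Official Rules. A plausible form, consistent with the factors defined above and the substitutions for single-order trials (CSK2 = CSK1, EF2 = 0), is sketched here as an assumption:

```python
def trial_score(cf, cs_k1, ef1, cs_k2=None, ef2=0.0, h=3.0):
    """Assumed trial score TS = CF * (CS_K1 * EF_1 + h * CS_K2 * EF_2).

    For single-order trials the rules set CS_K2 = CS_K1 and EF_2 = 0,
    so the high-priority term vanishes and this reduces to the
    simplified form TS = CF * CS_K1 * EF_1.
    """
    if cs_k2 is None:
        cs_k2 = cs_k1  # single-order trial
    return cf * (cs_k1 * ef1 + h * cs_k2 * ef2)

# Single-order trial: only the first order contributes.
print(trial_score(cf=1.0, cs_k1=8, ef1=1.5))  # 12.0
```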
Final Rank Score Calculation
The final rank score is the equally weighted sum of all trial scores. As mentioned in the Competition Scenarios section, trials are expected to focus on the Baseline Kit Building, Dropped Part, and In-Process Kit Change scenarios, among others.
Winners: The Winners of cash prizes will be determined by three Judges appointed by the NIST Director, using the Judging Criteria outlined herein.
For each entry, the Judges will receive the video output for each competition trial run in the simulated environment, along with the description of the agility challenges that were being tested in each trial. The Judges will not receive the automated scores that are generated as part of each ARIAC trial.
Each Judge will provide a single score per entry, out of 20 total points, based on the Judging Criteria listed below. This score should provide an overall impression of the entire entry across all trials. Each Judge will provide their score independently to the Challenge Manager, Craig Schlenoff, without interaction with other Judges. The average of these three scores will be added to the points from the automated score to determine the competition winner. In the event of a tie score (or scores), the Challenge Manager will convene the three judges to determine winners by consensus.
The overall score will be broken down as follows:
- Overall Performance based on the automated scoring metrics described above (80 points): Using the automated scoring metrics described herein, the first-place entry will be awarded 80 points, the second-place entry will be awarded 70 points, the third-place entry will be awarded 60 points, and so on.
- Average of Judge Scoring (20 points): The Judges will provide a total score of 20 points, broken down as follows:
- Novelty/Innovativeness: Approach taken develops new techniques to advance industrial robot agility. (Up to six points)
- Ability for Industry to Implement: Approach taken can reasonably be applied by industry to solve similar challenges. (Up to six points)
- Alignment with the Spirit of the Competition: Approach taken provides a practical and realistic approach that directly addresses the stated goals of the ARIAC Competition. (Up to eight points)
NIST will announce the Finalists via the Event Website as well as those Finalists that have been awarded a cash award (each, an “Award”). The anticipated number and amount of the Awards that will be awarded for this Competition is set forth in these rules; however, NIST is not obligated to make all or any Awards, and reserves the right to award fewer than the anticipated number of Awards in the event an insufficient number of eligible Participants meet any one or more of the Judging Criteria for this Competition, based on NIST’s sole discretion. Using the Judging Criteria described herein, three qualified Judges (appointed by the NIST Director) will determine the winners of the ARIAC Prize Competition.
The winner verification process for being eligible to receive an Award includes providing the full legal name, tax identification number or social security number, and the routing number and bank account number to which the prize money can be directly deposited. Return of any notification as “undeliverable” will result in disqualification. The verification form must be returned within seventy-two (72) hours of receipt. After verification of eligibility, Awards will be distributed in the form of electronic funds transfer addressed to the Official Representatives specified in the winning entries. That Official Representative will have sole responsibility for further distribution of any Award among Participants in a group entry or within a company or institution that has submitted an entry through that representative. The list of entries receiving Awards for the Competition, including the names of all members of a team, will be made public according to the timeline outlined on the Event Website.
Winners are responsible for all taxes and reporting related to any Award received as part of the Competition.
All costs incurred in the preparation of Competition entries are to be borne by Participants.
How to Enter
An interested Participant (individual, team, or legal entity) must initiate the process of participating in the Competition by registering at the ARIAC website. The party is then given access to view the documentation and download the tutorials (which provide instructions on how to run the simulations) from the ARIAC website. In the event NIST determines modifications to the documentation and tutorials are needed, all website registrants will be notified by email.
Once registered, Participants are eligible to participate in the qualifier. The qualifier will occur in the April 2021 timeframe (specific deadlines will be published at the official challenge website and all Participants notified by email). Participants must complete the qualifier to be eligible to compete in the final competition. A minimum score, based on the metrics described below, will be set; Participants must meet or exceed it in the qualifier to be eligible for the final competition.
Participants will run the qualifier from their own location. They will have two weeks from the date the qualifier is released on the website (available to be downloaded) to submit the software control system that addresses the challenges in the qualifier. Registered Participants will be emailed when the qualifier is released. They should run the qualifier at their own location and submit the resulting output files generated by their control system (listed below). The output files should be uploaded to an account (nfiles.nist.gov) that NIST will set up specifically for each Participant.
Specific files that need to be submitted include:
- The environment configuration file specifying the choice of sensors.
- A performance log file that contains information about the task completion.
- A simulation state log file that contains information about the state of Gazebo during the trial.