Pushback to the Future: Predict Pushback Time at US Airports (Open Arena) Hosted By NASA

An image of an aircraft being pushed back by a tug.


Coordinating our nation’s airways is the role of the National Airspace System (NAS). The NAS is arguably the most complex transportation system in the world. Operational changes can save or cost airlines, taxpayers, consumers, and the economy at large thousands to millions of dollars on a regular basis. It is critical that decisions to change procedures are made with as much lead time and certainty as possible. The NAS is investing in new ways to bring vast amounts of data together with state-of-the-art machine learning to improve air travel for everyone.

In order to optimize commercial aircraft flights, air traffic management systems need to be able to predict as many details about a flight as possible. One significant source of uncertainty comes right at the beginning of a flight: the pushback time. A more accurate pushback time can lead to better predictability of takeoff time from the runway.

Predicting pushback time depends upon factors like passenger loading, cargo loading, weather, aircraft type, and operator procedures. While available data can be used to improve these predictions, the combination of public and private sources can make it difficult to get access to all of the information needed to make the best predictions. Federated learning (FL) offers immense promise here as an approach to training central ML models using private data held by separate organizations.
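To make the federated learning idea concrete, here is a minimal sketch of federated averaging (FedAvg), the standard approach to combining locally trained parameters into a central model. The "airline" clients and their parameter values are hypothetical, purely for illustration; the actual federated setup is defined in Phase 2.

```python
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Federated averaging: combine each client's parameters,
    weighted by the size of that client's private dataset."""
    total = sum(client_sizes)
    return [
        sum(w[i] * (n / total) for w, n in zip(client_weights, client_sizes))
        for i in range(len(client_weights[0]))
    ]

# Two hypothetical "airlines" each hold a private 2-parameter model.
airline_a = [np.array([1.0, 2.0])]  # trained on 100 local examples
airline_b = [np.array([3.0, 4.0])]  # trained on 300 local examples
global_params = fed_avg([airline_a, airline_b], client_sizes=[100, 300])
print(global_params[0])  # -> [2.5 3.5]
```

Each organization trains on its own data and shares only parameters, never the underlying records, which is what makes the combination of public and private sources tractable.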


In Phase 1 (the current phase), your task is to train a machine learning model to automatically predict pushback time from public air traffic and weather data. Better algorithms for predicting pushback time can help air traffic management systems more efficiently use the limited capacity of airports, runways, and the National Airspace System.

To be eligible for prizes, finalists from Phase 1 will have the opportunity to participate in Phase 2, where they will work with NASA to train a federated version of their model.

Competition End Date:

April 17, 2023, 11:59 p.m. UTC

Place Prize Amount
1st $15,000
2nd $12,000
3rd $8,000
4th $7,500
5th $7,500

Prize eligibility

All eligible participants are invited to register to participate in the Open Arena. Only prescreened, eligible participants are able to win final scoring prizes.

For this challenge, cash prizes are restricted to Official Representatives (individual participants or team leads, in the case of a group project) who, at the time of entry, are:

  • age 18 or older and a U.S. citizen or permanent resident of the United States or its territories, and
  • affiliated with an accredited U.S. university as either an enrolled student or a faculty member. Proof of enrollment or employment is required to demonstrate university affiliation.

Furthermore, participants selected as finalists in Phase 1 must participate in Phase 2 in order to be eligible for prizes.

Federal employees acting within the scope of their employment and federally funded researchers acting within the scope of their funding are not eligible to win a prize in this challenge.

For complete rules on eligibility and prizes see the Competition Rules.

Contest arenas

You are in the Open Arena. Once you have submitted attestation of eligibility, head on over to the Prescreened Arena to make executable code submissions and qualify for the $50,000 in final scoring prizes.

This challenge features two competition arenas which provide different access levels and capabilities.

The Open Arena is the first step in the competition process. Here all participants can enter the outputs of their solutions-in-development to see how they fare against others on the open leaderboard.

After submitting the attestation of eligibility, you will be able to access the Prescreened Arena. Here participants can continue to tweak their solutions, submit their executable code, and see how they perform on the prescreened leaderboard. A submission to the Prescreened Arena is required to be eligible for prizes.

Open Arena

  • Available to all registered participants
  • Access the public ground truth data
  • Submit CSV files with predictions for the leaderboard set
  • View the open leaderboard with live results from the best-scoring submissions

Prescreened Arena

  • Available to approved university-affiliated participants
  • Access the public ground truth data
  • Submit trained models and code to run in the cloud
  • View the prescreened leaderboard with live results from the best-scoring submissions

For more information on staging, submissions, and scoring, check out the Problem Description.

How to compete

  1. Click the "Compete!" button in the sidebar to enroll in the competition.
  2. Get familiar with the problem through the overview and problem description. You might also want to reference additional resources available on the about page.
  3. Download the data from the data tab.
  4. Create and train your own model. The benchmark blog post is a good place to start.
  5. Use your model to generate predictions that match the submission format.
  6. Click "Submit" in the sidebar, and then "Make new submission". You're in!
  7. If you'd like access to the Prescreened arena to compete for prizes, fill out the Attestation of Eligibility Form. We'll notify you if you're eligible.
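Steps 5 and 6 above amount to writing a CSV with one prediction per leaderboard row. Here is a minimal sketch using a constant-baseline prediction. The column names and row values below are assumptions for illustration; the authoritative submission format is on the data tab.

```python
import csv

# Hypothetical leaderboard rows: (flight id, prediction time, airport).
# The real identifiers and columns are defined by the challenge's
# submission format; these are placeholders.
leaderboard = [
    ("FLT001", "2022-01-01T10:00:00", "KATL"),
    ("FLT002", "2022-01-01T10:15:00", "KDEN"),
]

BASELINE_MINUTES = 15  # naive baseline: always predict pushback in 15 minutes

with open("submission.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["gufi", "timestamp", "airport", "minutes_until_pushback"])
    for flight_id, ts, airport in leaderboard:
        writer.writerow([flight_id, ts, airport, BASELINE_MINUTES])
```

A constant baseline like this is only a sanity check for the submission pipeline; replace `BASELINE_MINUTES` with your model's per-row predictions before submitting for score.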

This challenge is in collaboration with NASA

Header Image: "Pushback..." by Cory W. Watts is licensed under CC BY-SA 2.0