Water Supply Forecast Rodeo: Hindcast Evaluation

Water managers in the Western U.S. rely on accurate water supply forecasts to better operate facilities and mitigate drought. Help the Bureau of Reclamation improve seasonal water supply estimates in this probabilistic forecasting challenge! [Hindcast Evaluation Arena] #climate

$50,000 in prizes
Completed January 2024
99 joined

Overview

Accurate seasonal water supply forecasts are crucial for effective water resources management in the Western United States. This region faces dry conditions and high demand for water, and these forecasts are essential for making informed decisions. They guide everything from water supply management and flood control to hydropower generation and environmental objectives.

Yet hydrological modeling is a complex task that depends on natural processes marked by inherent uncertainty, such as antecedent streamflow, snowpack accumulation, soil moisture dynamics, and rainfall patterns. To maximize the utility of these forecasts, it's essential to provide not just accurate predictions, but also ranges of values that effectively convey that uncertainty.

Task

The goal of this challenge is to develop probabilistic forecast models that predict naturalized cumulative streamflow volume at the 0.10, 0.50, and 0.90 quantiles. The challenge takes place over multiple stages: the Hindcast Stage evaluates models on historical data to simulate real-time forecasting; the Forecast Stage will run in real time during the 2024 season; and a final prize stage will ask solvers to submit additional materials for the overall and bonus prizes, such as model reports and cross-validation predictions, which will be judged by a panel of technical experts.
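The challenge's official scoring metric is not restated here, but quantile forecasts like these are commonly evaluated with the pinball (quantile) loss. The sketch below is ours, for illustration only; the function name and example values are not from the challenge materials.

```python
def pinball_loss(y_true, y_pred, quantile):
    """Pinball (quantile) loss for a single quantile forecast.

    Under-prediction is penalized at rate `quantile` and
    over-prediction at rate `1 - quantile`, so minimizing the loss
    pushes predictions toward the target quantile.
    """
    error = y_true - y_pred
    return max(quantile * error, (quantile - 1) * error)

# For a 0.90-quantile forecast, missing low by 10 units costs ~9,
# while missing high by the same 10 units costs only ~1.
low_miss = pinball_loss(100.0, 90.0, 0.90)    # ~9.0
high_miss = pinball_loss(100.0, 110.0, 0.90)  # ~1.0
```

The asymmetry is the point: a 0.90-quantile forecast should sit above the observed value most of the time, and the loss encodes exactly that preference.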

By improving the accuracy, explainability, and uncertainty characterization of seasonal streamflow forecasts, this challenge will better equip water resources managers to operate facilities for high flows, mitigate impacts of drought, improve hydropower generation, and meet environmental targets.
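One simple way to sanity-check the uncertainty characterization described above (our illustration, not the challenge's evaluation procedure): calibrated 0.10 and 0.90 quantile forecasts bound a central 80% interval, so roughly 80% of observed volumes should fall between them. All names and values below are hypothetical.

```python
def empirical_coverage(observed, lower, upper):
    """Fraction of observations falling inside their forecast intervals.

    With calibrated 0.10 and 0.90 quantile forecasts as the bounds,
    this fraction should be close to 0.80.
    """
    hits = sum(1 for y, lo, hi in zip(observed, lower, upper) if lo <= y <= hi)
    return hits / len(observed)

# Toy example with made-up volumes: 4 of the 5 observations fall
# inside their [0.10, 0.90] interval, giving 0.8 coverage.
obs = [100, 150, 120, 90, 200]
q10 = [80, 130, 100, 95, 150]
q90 = [120, 170, 140, 110, 210]
coverage = empirical_coverage(obs, q10, q90)  # -> 0.8
```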

Challenge overview

Timeline Overview

The challenge is occurring over multiple stages, each with its own prizes:

  1. Hindcast Stage: Models will be evaluated on historical ground truth data to simulate forecasts that would be made in the past.
  2. Forecast Stage: Models will be run in real-time on a regular cadence from January through July 2024 to issue forecasts for the 2024 season.
  3. Final Prize Stage: In a final stage, solvers will submit updated models and additional materials to compete for Overall Prizes and for the Explainability and Communication Bonus Track prizes.

Prize Overview

Category Prize Pool
Overall $325,000
Hindcast Stage $50,000
Forecast Stage $50,000
Explainability and Communication Bonus Track $75,000
Total $500,000

Hindcast Stage Details

You are currently in the Hindcast Stage of the challenge. In the Hindcast Stage, you will develop your models and have their performance evaluated on historical data.

The Hindcast Stage has two arenas for submissions.

  • Development Arena: For developing your water supply forecasting models. You can submit hindcast predictions for the historical test set to the challenge platform to get feedback on your model's performance. Submissions in the Development Arena will not count toward prizes.
  • Evaluation Arena (YOU ARE HERE): For the official evaluation of your models on the historical test set. You will submit model code for remote execution as well as a model report. Your test set performance and model report will be evaluated by a panel of technical experts to determine Hindcast Stage prizes.

Making a successful code submission in the Hindcast Evaluation Arena is required to participate in later stages of the challenge, where additional prizes may be won.

Hindcast Stage Key Dates

Challenge launch October 17, 2023
Hindcast Evaluation Arena opens for test submissions November 29, 2023
Data source request deadline December 5, 2023
Runtime dependency request deadline December 7, 2023
Hindcast Evaluation Arena full submissions available December 11, 2023
Hindcast code execution submission deadline December 21, 2023
Hindcast model report deadline January 26, 2024

Prizes Breakdown

Place Prize Amount
1st $100,000
2nd $75,000
3rd $50,000
4th $30,000
5th $20,000
Regional and Lead Time Bonus Prizes $50,000

Overall Prizes

Overall prizes will be awarded based on an evaluation by a panel of technical experts. The evaluation will consider the overall performance across all stages of the challenge as well as a final model report detailing solutions.

Bonus prizes will also be awarded for the best performance in forecast subcategories, such as specific regions and long lead times. Additional details about the subcategories will be shared later in the challenge.

You must participate in both Hindcast and Forecast Stages by submitting successfully executed code to be eligible for overall prizes.

Hindcast Stage Prizes

October–December 2023

Test your model against historical ground truth data! Train your models and submit code. Your code will be executed to perform inference on a held-out test set. Prizes will be awarded based on a combination of your leaderboard performance and an evaluation of your model report by a panel of judges.

Place Prize Amount
1st $25,000
2nd $15,000
3rd $10,000

Forecast Stage Prizes

December 2023–July 2024

Find out how your model performs with live forecasts for the 2024 season! DrivenData will execute your model at a regular cadence from January through July 2024. After July, forecasts will be evaluated against the true water supply measurements and top leaderboard performers will win Forecast Stage prizes.

You must participate in the Hindcast Stage by submitting successfully executed code to be eligible.

Place Prize Amount
1st $25,000
2nd $20,000
3rd $15,000
4th $10,000
5th $5,000

Forecast Explainability and Communication Prizes

Augment your model to produce forecast explanation outputs alongside its normal predictions. Understanding how predictors drive forecasts and changes in forecasts is important to operational decision makers. Submissions will be judged by a panel of technical experts to select the winners.

You must participate in both Hindcast and Forecast Stages and submit a final model report to be eligible.

How to compete

  1. Click the "Compete!" button in the sidebar to enroll in the competition.
  2. Get familiar with the problem through the overview and problem description. You might also want to reference additional resources available on the About page.
  3. Download the training ground truth data from the Data download page.
  4. Create and train your own model.
  5. Bundle your trained model and prediction code for evaluation in our cloud runtime. See the Code submission format page for more detail.
  6. Click "Submission" in the sidebar, and then "Make new submission".
  7. To be considered for prizes in the Hindcast Stage, don't forget to submit a model report detailing your methodology. See the Report submission format page for more detail. Once complete, click on "Report Submission" in the sidebar, upload your PDF, and then click "Submit". You're in!

The challenge rules are in place to promote fair competition and useful solutions. If you are ever unsure whether your solution meets the rules, ask the challenge organizers in the competition forum or send an email to info@drivendata.org.

Note on prize eligibility: The term Competition Sponsor in the rules includes the Bureau of Reclamation as well as all federal employees acting within the scope of their employment and federally-funded researchers acting within the scope of their funding. These parties are not eligible to win a prize in this challenge.


Sponsors

This challenge is sponsored by the Bureau of Reclamation.

With support from NASA.

And with collaborators from the USDA Natural Resources Conservation Service and the U.S. Army Corps of Engineers.