Pale Blue Dot: Visualization Challenge

Use public Earth observation data to create a visualization that furthers the Sustainable Development Goals of zero hunger, clean water and sanitation, or climate action.

Space Study program
Jan 2024
1,591 joined

Problem description

Your goal in this challenge is to create a visualization using Earth observation data that advances at least one of the following Sustainable Development Goals (SDGs): zero hunger, clean water and sanitation, or climate action.

This competition does not require advanced technical or coding skills. Participants at all skill levels are welcome!

Getting started

Steps to create a basic submission:

  1. Identify a dataset to use, and decide how you'll access it. See the data section for detailed requirements.
    • For an overview of possible datasets to use, check out the data resources blog post. Find a dataset you are interested in, and follow the steps in one of the tutorials linked under "Getting started".
    • To identify more publicly available datasets for specific issues you're interested in, check out NASA's Data Pathfinders page. There are pathfinders for a variety of issues, from water quality to agriculture. Each pathfinder provides an overview of datasets relevant to the issue and information about how to access each one.
  2. Identify a decision or action that this dataset could inform, and that has an impact on at least one of the key competition SDGs (zero hunger, clean water and sanitation, climate action).
  3. Create your visualization! There are no specific technical requirements. For example, any of the below would be a valid way to create a submission:
    • Use Python to access data through an API and generate an interactive visualization
    • Download the data, load it into an Excel sheet, and create a visualization in Excel
    • Many, many more!
  4. Write up a short summary of your visual, per the submission format.
  5. Zip up your summary with an image of your visual! If you'd like to be considered for the Best Overall prize, make sure to include a detailed report too.
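The steps above can be as simple as a few lines of Python. Below is a minimal, hedged sketch (assuming matplotlib is installed; the dataset values are invented placeholders, not real observations) that turns a small CSV into the static visual.png required by the submission format:

```python
import csv
import io

import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Hypothetical yearly values standing in for a real Earth observation dataset
raw = "year,value\n2019,1.2\n2020,1.5\n2021,1.4\n2022,1.8\n2023,2.0\n"
rows = list(csv.DictReader(io.StringIO(raw)))
years = [int(r["year"]) for r in rows]
values = [float(r["value"]) for r in rows]

# Plot a simple line chart and save it as the submission image
plt.figure(figsize=(6, 4))
plt.plot(years, values, marker="o")
plt.title("Example indicator over time")
plt.xlabel("Year")
plt.ylabel("Indicator value")
plt.tight_layout()
plt.savefig("visual.png")
```

Swapping the inline CSV for data downloaded from one of the sources in the data section gives you a basic end-to-end path from dataset to submission image.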

For more inspiration, check out some examples of how Earth observation data can further the SDGs. You can pick up key skills through NASA's Open Science 101 online training program or a live tutorial.


Data

Your visualization must use at least one publicly available Earth observation dataset collected by a U.S. government agency. "Earth observation data" means observations about the Earth collected by sources such as satellites, airborne instruments, and in-situ sensors. The data resources blog post suggests some datasets that satisfy this requirement. However, it is not a comprehensive list and we encourage you to explore other datasets too. If you are unsure whether a specific dataset meets this requirement, just ask! Head over to the competition forum to post questions and to find teammates.

You may access each dataset however you like, as long as the access method is freely available and does not require a paid subscription. For example, using third-party tools like Microsoft's Planetary Computer to access data programmatically is allowed. Downloading data manually from U.S. government repositories is also allowed.

Supplementary data

You may use any additional datasets, regardless of whether they are Earth observation data, as long as they are publicly and freely available. However, you must ensure that you have the correct rights and permissions to use and share each additional dataset. To do this, you can usually check the license under which the data is shared.

The Guide to Open Science from NASA's Transform to Open Science (TOPS) mission has a useful guide to licenses, including both choosing licenses for your own work and understanding other licenses.

We encourage you to share useful supplementary datasets on the Community Code page or the competition forum.

Additional tools

You may use any additional tools as long as they are publicly available and free to use. For example, if you want to use OpenAI's ChatGPT, you must use the ChatGPT web application, which is free, and not the OpenAI API, which has a cost per token. Make sure to clearly document the steps you used so your visualization can be reproduced by others.

At this time, Google Earth Engine does not clearly meet these rules for free use. Participants are encouraged to access data through alternate sources, and not through Google Earth Engine.


Evaluation criteria

A representative panel of experts on Earth observation data and the Sustainable Development Goals will review submissions. Submissions will be judged based on the rubric below.

Impact (20%)

  • Does the visualization inform a decision or action that furthers one or more of the key competition SDGs?

Integrity (20%)

  • Has the team thoughtfully engaged with members of the community primarily affected or with the broader context of the chosen issue (historical, social, political, etc.)?
  • Does the submission demonstrate ethical and equitable science?

Rigor (20%)

  • Is the visualization built on sound quantitative analysis?
  • Does it demonstrate a deep understanding of the data used and tools available?

Usability (20%)

  • Could others in the research or scientific community understand and build on the work?
  • Is the process of creating the visual open and transparent?

Interpretability (20%)

  • How easy is it to accurately interpret the visualization?

The detailed report prompts are provided to help you fully address each metric in your submission. For more details about how to interpret these criteria, see the section on open science.

Submissions are required to be in English, but will not be judged based on English fluency. Judgment will be based on the content and ideas communicated. For example, participants may choose to write in a different language, and then use a tool like Google Translate to submit in English.

Submission format

Your submission should be a ZIP archive containing the following components:

  1. An image of your data visualization (visual.png)

  2. A short summary of less than 1,800 characters (summary.pdf)

  3. (Only required for Best Overall prize) A detailed report, which can be submitted either as a 1-4 page written report (report.pdf) or a video report that is a maximum of 7 minutes long (report.mp4). A report is required for a submission to be considered for the Best Overall prize, but is not required to be considered for an Honorable Mention for Compelling Visuals.

Your submission can have significant non-visual components as well, as long as there is a visualization involved.

You are only allowed to make one submission. To make changes, you can delete and re-upload your submission as many times as you like. Only the last entry submitted will be considered.

The ZIP archive of your submission cannot be larger than 100MB.

Participants can submit individually, or form teams of up to four people. Participants are encouraged to find teammates through the competition forum.

1. Image

Submit a static image showing your data visualization. This file must be named visual.png, visual.jpg, visual.jpeg, or visual.pdf.

If your visualization is interactive, you should also submit a short video demonstration or a URL where the visualization can be viewed. When selecting the "Best Overall" prize winners, interactive visuals may be more likely to satisfy the "Impact" metric of the evaluation criteria than static images.

  • To submit a video demo, include an MP4 file called visual_demo.mp4 that is a maximum of 3 minutes long. If you are submitting the detailed report as a video, you may choose to combine the video demo with your report rather than providing two separate MP4 files.
  • To submit a URL, include the link in your detailed report.

If you are submitting a video demonstration, a static image file is still required as well.

2. Summary

Submit a brief, 1-paragraph written summary of your submission (1,800 characters maximum). This file must be named summary.pdf. Your summary should include:

  • A brief description of what your visual shows (1-2 sentences)
  • A list of all the datasets you used
  • Which Sustainable Development Goal(s) you hope to advance (zero hunger, clean water and sanitation, climate action)
  • A list of all the tools you used to build your visualization (e.g., Python, Planetary Computer, etc.)
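If you draft your summary as plain text before exporting it to PDF, a one-line check can confirm it fits the 1,800-character limit. The helper below is a hypothetical convenience, not part of the submission format; the function name and draft text are illustrative:

```python
def fits_summary_limit(text: str, limit: int = 1800) -> bool:
    """Return True if the draft summary fits within the character limit."""
    return len(text) <= limit

# Example draft text (invented for illustration)
draft = (
    "This map shows chlorophyll-a concentrations near a major reservoir, "
    "built from Landsat imagery, to help water managers spot algal blooms."
)
print(fits_summary_limit(draft))  # → True for this short draft
```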

3. Detailed report

The detailed report should describe your methodology and explain how your submission meets each of the evaluation criteria. Use the prompts provided below to help structure your report to address all of the criteria.

A detailed report is required to be considered for the Best Overall prize. However, it is not required to be considered for the Honorable Mentions for Compelling Visuals prizes.

Detailed report prompts
  1. How does your visual inform a decision or action that furthers one or more of the key competition SDGs (zero hunger, clean water and sanitation, climate action)?
  2. How did you create your submission? Include the tools you used (e.g., Python, Excel, specific Python packages), how you processed the data, and (if applicable) how you managed your codebase. If you have a public repository with code, you can share a link here.
  3. What motivated you to choose this topic?
  4. How did you learn about the broader context of your chosen issue (e.g., historical, social, political)? This could include drawing on the lived experiences of team members, reading articles and literature, conducting interviews with community members, etc. Did what you learned change your approach?
  5. What are the ethics and/or equity issues you considered? What are some possible strategies or approaches for addressing them?
  6. Would your team like to share the URL of an interactive visualization?

The report can be submitted as either a written document or a video. When judging, there will be no preference between written reports and video reports.

Written report option

To submit a written report, include a file called report.pdf. Written reports should be:

  • 4 pages maximum including figures and tables
  • On paper size 8.5 x 11 inch or A4, with minimum margins of 1 inch
  • Minimum font size of 11
  • Minimum single-line spacing
  • In PDF file format

To create a written report, download the template from the data download page and simply fill in the answer to each prompt.

Video report option

To submit a video report, include a file called report.mp4. Video reports should be MP4 files that are 7 minutes long maximum. Videos are not expected to have professional-level quality and polish, but should succinctly answer the prompts above. Substance matters more than slickness. For example, you may submit a simple video of yourself speaking, or a screen recording with narration. Participants are not required to appear in their videos.

If your visualization is interactive, you are also allowed to submit a video demo. If applicable, you may combine the video demo with your video report rather than submitting two separate files. In that case, the combined video can be up to 10 minutes long maximum.

Creating a ZIP archive

Below are instructions to create a ZIP archive from your submissions files manually, at the command line, or with Python.


Using a file finder

  1. Put all of the files you want to include in your submission in one folder.
  2. Open the file finder app on your computer.
  3. Right-click on the folder and select "Compress". On some operating systems, you may have to select "Send to", and then "Compressed folder".

At the command line

  1. Put all of the files you want to include in your submission in one folder.
  2. In the command line, run zip -r path/to/new/ path/to/submission/folder

    For example, say I have all of my submission files saved in a folder called submission_files and I want to save out a zipfile called, I would run:

    $ zip -r submission_files
        adding: submission_files/ (stored 0%)
        adding: submission_files/visual.png (stored 0%)
        adding: submission_files/summary.pdf (stored 0%)
        adding: submission_files/report.pdf (stored 0%)

With Python

from zipfile import ZipFile

# Define where the zipped file should be saved
save_to = ""

# Define the list of files to include
include_files = ["visual.png", "summary.pdf", "report.pdf"]

# Define the zip file object and add files
with ZipFile(save_to, "w") as zip_object:
    for file in include_files:
        zip_object.write(file)
Once you've got your ZIP file, you're ready to submit! Head over to the submissions page to upload your file.

Guide to open science

Open science is defined as the principle and practice of making research products and processes available to all, while respecting diverse cultures, maintaining security and privacy, and fostering collaborations, reproducibility and equity.

Below, we have provided concrete ways to follow the practices of open science when creating a data visualization submission. These principles explain in more detail how to satisfy the "Integrity" and "Usability" metrics of the evaluation criteria. To learn even more, register for NASA's Open Science 101 online training program!

Practical recommendations

Be transparent. Document and share the steps that you took to create your submission. Include details like where your data came from and how you computed any reported statistics so that someone else could recreate your visualization using your original data source.
Collaborate. Engage with others in the competition forum, use the forum to create teams across disciplines and countries, and share your work on the community code board. You could even win the "Community Code Bonus Prize"!
Consider in context. Learn about the context of your chosen issue (historical, social, political, etc). This will be easiest if one of your team members is part of the community primarily affected by your chosen issue. However, you could also accomplish this through reading, or through conducting interviews with community members. Think about how the context impacts your work, and in turn how your treatment of the issue could perpetuate inequalities or subvert them.
Identify and mitigate biases. Consider biases in how the datasets you are using were collected and in how the visual could be interpreted. Who is represented, and who may be excluded? Come up with ideas for mitigating these risks. For example, suggest limitations for how and when your tool should be used. For a more comprehensive guide, check out Deon, DrivenData's ethics checklist for data science projects.
Make it accessible. Per the submission instructions, use data and submission file formats that anyone could access with free software. Think about barriers for accessing and interpreting your work, especially for members of the community primarily affected by the issue and for key decision makers. For example, use simple language that makes it easy for a broad audience to interpret the visual correctly. If more complex accessibility issues like internet access are relevant, you may outline how these could be addressed in the future.
Write reproducible code. If your submission involves a codebase, write your code in a way that is easy for others to follow or contribute to. This includes using best-practice tools for open science like GitHub. Check out Cookiecutter Data Science, DrivenData's standardized Python project structure, for more coding best practices.

Additional resources

The guidelines above are general rules of thumb. The resources below get into much more detail, and can help you implement additional open science principles that may be relevant to your submission.

  • Open Science 101 (OS101): A free, comprehensive, online or in-person training program to introduce scientists, researchers, and citizen scientists to the principles and practices of open science. OS101 covers key concepts, tools, and resources for how to create and share data, code, and results. To register for OS101, participants first need to create an ORCID iD.

  • Deon: An ethics checklist for data science projects created by DrivenData. This is especially relevant for addressing the "Integrity" evaluation metric.

  • Cookiecutter Data Science: A reasonably standardized project structure for doing and sharing data science work in Python, created by DrivenData. This is most relevant to the "Usability" evaluation metric.

  • The Turing Way: An open-source handbook for reproducible, ethical, and collaborative data science. This covers a very broad range of topics, only some of which will be relevant to this competition. For example, there is a handy guide to getting started with GitHub and advice for code styling and linting.

  • Guide to Open Science from NASA's Transform to Open Science (TOPS) Mission: A comprehensive guide to getting started with practicing open science. The guide covers a broad range of scientific practices, only some of which are relevant for this competition.


Example ideas

Open-ended challenges like this one provide an opportunity to be creative, but that flexibility to explore many different ideas can also make it daunting to get started.

Below we provide a few examples of the types of submissions that might be a good fit for this competition. As shown, competition submissions can range from simple to more technically complex. This list is just the tip of the iceberg, and we encourage you to get creative! You could even find the rest of that iceberg with the GRACE-FO dataset.

  • Monitor greenhouse gas emissions: Use AIRS data to create a line graph of emissions from a specific country over time, and help policymakers monitor progress on SDG 13 (climate action).
  • Increase crop yields for subsistence farmers: Combine climate data (HRRR) with soil moisture data (SMAP) to create plots of monthly growing conditions for an agricultural region, and help farmers decide what and when to plant.
  • Visualize harmful algal blooms in drinking water sources: Use Landsat satellite imagery to map algal blooms near a major city, and help public health managers ensure access to clean drinking water.

There are many more examples of real-world use cases in the data resources blog post.

Live tutorials

Subject matter experts will host live tutorials during the competition to help participants get started with a variety of tools and datasets. See the Announcements Page for more details and a recording of each tutorial.

Good luck!

Good luck and enjoy this problem! If you have any questions, you can always visit the competition forum!