
AI4FoodSecurity (Germany)

Challenge closed


Data

Remote sensing is entering a new era of time-series analysis. Daily revisit times of satellites allow for near real-time monitoring of many areas across the globe. However, there has been little exploration of deep learning techniques to leverage this new temporal dimension at scale. This is particularly interesting in crop monitoring, where time-series remote sensing data have frequently been used to exploit phenological differences among crops over the growing cycle. Additionally, existing approaches have struggled to combine the strengths of different sensors to make use of all available information. To stimulate innovation in spatio-temporal machine learning, we have partnered to propose a unique challenge centered around modeling vegetation phenology from very high-cadence, harmonized time series of Planet Fusion and Sentinel imagery. This is made possible by Planet's new and unprecedented L3H pre-processing level, which allows direct interoperability with Sentinel-2.

The proposed challenge focuses on crop type classification based on a time-series input of Sentinel-1, Sentinel-2 and Planet Fusion Monitoring data. The Planet Fusion Monitoring product consists of clean (i.e. free from clouds and shadows), daily gap-filled, high-resolution (3 m), temporally consistent, radiometrically robust, harmonized and sensor-agnostic surface reflectance time series. It synergizes inputs from both public and private sensor sources and is directly interoperable with HLS (Harmonized Landsat Sentinel-2) surface reflectance products. The Planet Fusion data is provided in two alternative versions: daily, and with a 5-day cadence (a lightweight version composited by applying a median filter to the daily product). Participants may use either. More detailed information about this new data source and the accompanying QA layers that could be exploited when developing ML models can be found here.
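As an illustration of the 5-day cadence described above, the sketch below median-composites a daily reflectance stack into 5-day bins with NumPy. The function name, array shapes and bin handling are assumptions for this example, not the Planet Fusion product specification.

```python
import numpy as np

def composite_5day(daily_stack):
    """Median-composite a daily reflectance stack into 5-day bins.

    daily_stack: array of shape (T, H, W) holding T daily observations
    of one band. Returns an array of shape (ceil(T / 5), H, W); a
    trailing partial bin is composited from the remaining days.
    """
    t, h, w = daily_stack.shape
    n_bins = int(np.ceil(t / 5))
    out = np.empty((n_bins, h, w), dtype=daily_stack.dtype)
    for i in range(n_bins):
        # Per-pixel median over up to five consecutive days.
        out[i] = np.median(daily_stack[i * 5:(i + 1) * 5], axis=0)
    return out

# Example: 30 daily observations over a 4x4 window -> six 5-day composites.
daily = np.random.rand(30, 4, 4)
composited = composite_5day(daily)
print(composited.shape)  # (6, 4, 4)
```

The median is robust to occasional residual outliers in the daily series, which is one reason it is a common choice for compositing.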

The challenge covers two areas of interest, in Germany and South Africa, with high-quality cadastral data on field boundaries and crop types as ground truth input. The datasets present two main challenges to the community: exploiting the temporal dimension for improved crop classification, and ensuring that models can handle a domain shift to a different year.

The challenge consists of two tracks:

  1. Within-season crop identification, over the South Africa AOI.
  2. Reusability of models for crop identification from one growing season to the next, over the Germany AOI.

In the first track, participants are provided with a set of time series representative of all sensor modalities, covering five different winter crop types throughout an entire growing season (April-November), together with matching ground truth in the form of crop IDs and field boundaries for training and validating ML models. The evaluation will test the ability of the models to correctly identify the crops from test data consisting of time series of images covering the full (April-November) growing season, acquired in the same year and geography.
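A common way to turn imagery plus field boundaries into model input is to aggregate pixels per field, for example into a mean NDVI time series. The sketch below assumes the boundaries have been rasterized into an integer field-ID mask; the function name and shapes are illustrative assumptions, not part of the challenge data format.

```python
import numpy as np

def field_ndvi_series(red, nir, field_mask, field_id):
    """Mean NDVI time series for one field.

    red, nir: reflectance stacks of shape (T, H, W).
    field_mask: (H, W) integer raster of field IDs (a hypothetical
    rasterization of the provided field boundaries).
    Returns a length-T vector of per-date field-mean NDVI.
    """
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    pixels = field_mask == field_id
    return ndvi[:, pixels].mean(axis=1)

# Toy example: 8 time steps, one 2-pixel "field" inside a 3x3 window.
T, H, W = 8, 3, 3
red = np.full((T, H, W), 0.1)
nir = np.full((T, H, W), 0.5)
mask = np.zeros((H, W), dtype=int)
mask[0, :2] = 7
series = field_ndvi_series(red, nir, mask, 7)
print(series.shape)  # (8,)
```

Such per-field vegetation-index series are exactly where the phenological differences between crop types over the growing season become visible.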

In the second track participants are provided with a set of time series representative of all sensor modalities and covering nine different crop types for two different years (full 12-month time series) but in the same geography. Field boundaries are provided for both years. Crop ID labels are provided for one of the two years only. The goal is to train and test ML models that can handle a time shift to a different season. The evaluation will test the ability of the models to correctly identify the crops from the test data from the year for which crop ID labels were not provided.
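The train-on-one-season, test-on-another setup of the second track can be sketched in a few lines. The example below uses synthetic per-field feature vectors and a simple nearest-centroid classifier purely to show the workflow; the feature layout and the simulated inter-annual shift are assumptions, not the challenge's evaluation method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-field features: a 12-step vegetation-index series per
# field. Year A carries crop ID labels; year B stands in for the unlabelled
# test season, simulated here with a small shift to mimic inter-annual
# variation (the "domain shift" the track targets).
n_fields, n_steps, n_classes = 90, 12, 9
X_a = rng.random((n_fields, n_steps))
y_a = np.arange(n_fields) % n_classes        # balanced toy labels
X_b = X_a + rng.normal(0.0, 0.02, X_a.shape)

# Nearest-centroid classifier fitted on the labelled season only.
centroids = np.stack([X_a[y_a == c].mean(axis=0) for c in range(n_classes)])

def predict(X):
    # Distance of every field to every class centroid; pick the closest.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

preds_b = predict(X_b)  # predictions for the held-out season
print(preds_b.shape)  # (90,)
```

The key point is that nothing from year B enters the fitting step, mirroring how the track withholds crop ID labels for the test year.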

Participants are not required to enter both tracks. However, the evaluation mechanism behind both tracks is the same, as are the rules and prize catalogue.

The full overview of the data and metrics can be found in the Jupyter notebook.


AI4EO is carried out under a programme of, and funded by the European Space Agency (ESA).

Disclaimer: The views expressed on this site shall not be construed to reflect the official opinion of ESA.
