GreenEarthNet
GreenEarthNet is a machine-learning-powered toolkit for predicting ecosystem responses to climate change by leveraging Earth observation data and climate models. It supports effective environmental monitoring and forecasting for improved climate action.
Description
GreenEarthNet is a predictive tool designed to model ecosystem behavior under different climate scenarios by integrating advanced machine learning with vast datasets from satellite Earth observations, climate models, and field data. Developed on the foundation of research by Benson, Reichstein, and Robin, GreenEarthNet enables users to assess vegetation health, carbon fluxes, soil moisture, and other key ecosystem indicators at various scales. The tool is built with an emphasis on interpretable machine learning models, allowing users not only to obtain forecasts but also to understand the driving factors behind ecosystem changes. GreenEarthNet provides an interface that caters to both technical users (via APIs and downloadable code) and, in a planned extension, non-technical users (through an intuitive, map-based interface).
Potential User Groups
- Environmental scientists and ecologists studying climate impacts on ecosystems.
- Climate policy makers needing regional forecasts for policy decisions.
- Conservation NGOs interested in understanding ecosystem health and biodiversity.
- Agricultural stakeholders monitoring soil and vegetation conditions.
- Educators and researchers in climate science and ecology fields.
- International climate organizations working on anticipatory climate action.
Usage Guide
Users can access GreenEarthNet through two main avenues: (1) a planned web-based graphical interface and (2) a Python-based API (www.earthnet.tech; https://pypi.org/project/earthnet-minicuber/). For example, to predict the future vegetation state in a specific region, users can input latitude and longitude coordinates through the graphical interface and select a weather forecast. The tool then generates predictions of the vegetation state over time, accompanied by downloadable visualizations. Using the Python API, users can achieve the same by providing the coordinates and scenario parameters in code, allowing integration with other data analysis workflows, as sketched below.
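As a rough illustration of the Python route, the sketch below assembles a forecast request for a small region and time window. The dictionary keys and the commented-out calls are hypothetical placeholders for illustration only and do not reflect the published earthnet-minicuber or greenearthnet API.

```python
# Hypothetical sketch: the spec keys and the commented calls below are
# illustrative placeholders, not the actual earthnet-minicuber / greenearthnet API.

# Region, time window, and scenario a user might want to forecast.
request = {
    "lat": 48.27,              # centre latitude of the area of interest (degrees)
    "lon": 11.57,              # centre longitude (degrees)
    "edge_length_m": 2560,     # side length of the spatial window
    "start_date": "2021-05-01",
    "end_date": "2021-08-31",
    "weather_scenario": "seasonal_forecast",  # placeholder scenario label
}

# With the real packages installed, the workflow would roughly be:
#   1. assemble a data cube for the region (earthnet-minicuber),
#   2. run a trained GreenEarthNet model on it (greenearthnet repository),
#   3. save or plot the predicted vegetation state.
# cube = build_minicube(request)        # hypothetical helper
# prediction = model.predict(cube)      # hypothetical model call

print(f"Requesting vegetation forecast for ({request['lat']}, {request['lon']}) "
      f"from {request['start_date']} to {request['end_date']}")
```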
Availability: Information and access to data and code are available via https://www.earthnet.tech/ and https://github.com/vitusbenson/greenearthnet
Use case: The tool has been used for predicting vegetation dynamics in response to weather across Europe, including potential legacy effects from extreme droughts.
References: Vitus Benson, Claire Robin, Christian Requena-Mesa, Lazaro Alonso, Nuno Carvalhais, José Cortés, Zhihan Gao, Nora Linscheid, Mélanie Weynants, and Markus Reichstein. Multi-modal learning for geospatial vegetation forecasting. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2024.
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 101003469.
Tools
We contributed to the ongoing development of the tigramite (https://github.com/jakobrunge/tigramite/) Python package for causal inference.
This package provides a wide range of constraint-based causal discovery and causal effect estimation methods.
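As an illustration of the constraint-based workflow, the sketch below runs tigramite's PCMCI method with a partial-correlation independence test on a small synthetic dataset. The synthetic data and parameter values are invented for illustration, and the exact import path of ParCorr may differ between tigramite releases.

```python
import numpy as np
from tigramite import data_processing as pp
from tigramite.pcmci import PCMCI
# In recent tigramite releases ParCorr lives in a submodule; older versions
# expose it as `from tigramite.independence_tests import ParCorr`.
from tigramite.independence_tests.parcorr import ParCorr

# Toy data: three coupled autocorrelated time series (T time steps, 3 variables).
rng = np.random.default_rng(42)
T = 500
data = np.zeros((T, 3))
for t in range(1, T):
    data[t, 0] = 0.6 * data[t - 1, 0] + rng.normal()
    data[t, 1] = 0.5 * data[t - 1, 1] + 0.4 * data[t - 1, 0] + rng.normal()
    data[t, 2] = 0.4 * data[t - 1, 2] + 0.3 * data[t - 1, 1] + rng.normal()

dataframe = pp.DataFrame(data, var_names=["X", "Y", "Z"])

# Constraint-based causal discovery with PCMCI and partial-correlation tests.
pcmci = PCMCI(dataframe=dataframe, cond_ind_test=ParCorr())
results = pcmci.run_pcmci(tau_max=2, pc_alpha=0.05)

# Report lagged links whose p-values fall below the chosen significance level.
pcmci.print_significant_links(p_matrix=results["p_matrix"],
                              val_matrix=results["val_matrix"],
                              alpha_level=0.01)
```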
Stochastic Weather Generation is a computationally light tool to simulate temperatures for worst-case heat extremes, from city to country scale.
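The sketch below illustrates the general idea with a toy first-order autoregressive temperature generator; it is not the project's actual weather generator, and all parameter values are invented for illustration.

```python
import numpy as np

def simulate_daily_temperature(n_days=365, n_runs=1000, mean=12.0,
                               seasonal_amp=10.0, ar_coef=0.8, noise_std=2.5,
                               seed=0):
    """Toy AR(1) daily-temperature generator with a seasonal cycle.

    All parameter values are illustrative and not calibrated to any location.
    """
    rng = np.random.default_rng(seed)
    days = np.arange(n_days)
    seasonal = mean + seasonal_amp * np.sin(2 * np.pi * (days - 80) / 365.25)
    temps = np.zeros((n_runs, n_days))
    anomaly = np.zeros(n_runs)
    for d in days:
        # Persistent anomaly on top of the seasonal cycle.
        anomaly = ar_coef * anomaly + rng.normal(0.0, noise_std, size=n_runs)
        temps[:, d] = seasonal[d] + anomaly
    return temps

# Many cheap realisations let us look at the tail, e.g. the hottest day per run.
temps = simulate_daily_temperature()
annual_max = temps.max(axis=1)
print(f"Median annual maximum: {np.median(annual_max):.1f} °C, "
      f"99th percentile: {np.percentile(annual_max, 99):.1f} °C")
```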
Importance sampling is a way to preferentially select, from a range of model simulations, those that are likely to lead to extremes of the metric of interest at an early stage of the simulation, thus increasing computing efficiency for the cases of interest.
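The toy two-stage sketch below illustrates the principle of continuing only the most promising members; the two simulation functions, the early indicator, and the selection rule are invented for illustration and do not correspond to any specific model.

```python
import numpy as np

rng = np.random.default_rng(1)

def run_first_stage(n_members):
    """Cheap early-stage simulation: returns an early indicator per member."""
    return rng.normal(size=n_members)

def run_full_simulation(early_indicator):
    """Expensive continuation: final metric correlated with the early indicator."""
    return 0.8 * early_indicator + 0.6 * rng.normal(size=early_indicator.shape)

n_members, keep_fraction = 1000, 0.1

# Stage 1: run all members cheaply and rank them by the early indicator.
early = run_first_stage(n_members)
n_keep = int(keep_fraction * n_members)
selected = np.argsort(early)[-n_keep:]   # members most likely to become extreme

# Stage 2: only the selected members are continued at full computational cost.
final = run_full_simulation(early[selected])

print(f"Continued {n_keep}/{n_members} members; "
      f"max final metric among them: {final.max():.2f}")
```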
Storyline is a methodology to determine when in the future extreme heat events above a chosen threshold become likely in cities, and to present the meteorological conditions that lead up to them.
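A toy sketch of the threshold-timing part of the approach: it uses an invented ensemble of annual maximum city temperatures and an invented likelihood criterion, and it does not cover the reconstruction of the accompanying meteorological conditions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy ensemble of annual maximum city temperatures (°C) for 2020-2099 under a
# warming trend; all numbers are invented for illustration only.
years = np.arange(2020, 2100)
n_members = 100
trend = 36.0 + 0.05 * (years - 2020)      # slowly warming annual maximum
annual_max = trend + rng.normal(0.0, 1.5, size=(n_members, years.size))

threshold = 40.0   # chosen heat threshold (°C)
likely = 0.33      # call it "likely" once a third of members exceed it

# Fraction of ensemble members exceeding the threshold in each year.
exceedance_frac = (annual_max > threshold).mean(axis=0)
meets = exceedance_frac >= likely
first_year = years[np.argmax(meets)] if meets.any() else None
print(f"Threshold {threshold} °C becomes likely around: {first_year}")
```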
Ensemble boosting uses climate models to efficiently generate very intense and rare weather and climate extremes that can be analyzed for planning and stress testing of critical infrastructure.
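A toy sketch of the boosting idea, using an invented stochastic model rather than a climate model: a parent run is restarted shortly before its most extreme state with tiny initial perturbations (and fresh noise), generating many alternative realisations of the same event, some of them more intense.

```python
import numpy as np

rng = np.random.default_rng(3)

def step(state, noise):
    """One day of a toy nonlinear temperature-anomaly model (invented)."""
    return 0.9 * state + 0.1 * np.sin(3.0 * state) + noise

def run(days, state0, seed):
    """Integrate the toy model for a number of days from a given state."""
    r = np.random.default_rng(seed)
    traj = np.empty(days)
    state = state0
    for d in range(days):
        state = step(state, r.normal(0.0, 0.5))
        traj[d] = state
    return traj

# Parent run: locate the peak anomaly and the state a few days before it.
parent = run(days=365, state0=0.0, seed=10)
peak_day = int(parent.argmax())
restart_day = max(peak_day - 5, 0)
restart_state = parent[restart_day]

# Boosted ensemble: restart just before the peak with tiny perturbations,
# producing many alternative realisations of the event.
boosted_peaks = []
for i in range(50):
    perturbed = restart_state + rng.normal(0.0, 1e-3)
    traj = run(days=peak_day - restart_day + 10, state0=perturbed, seed=100 + i)
    boosted_peaks.append(traj.max())

print(f"Parent peak anomaly: {parent.max():.2f}; "
      f"boosted ensemble max: {max(boosted_peaks):.2f}")
```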