

Every city authority in the world faces tough decisions about how to deal with present and future losses caused by the changing likelihood and intensity of extreme weather. Identifying the right entry points for working with stakeholders is therefore essential. 


Between 4 and 11 May 2024, we held four workshops for stakeholders. The workshops were organised around four themes: Cities, Legal, Energy and Agriculture.


Speakers: Erich Fischer (ETH Zurich), Robert Vautard (IPSL-CNRS) and Pascal Yiou (CEA).

#1 – Ensemble Boosting

Speaker: Erich Fischer (ETH Zurich)

Title: Unprecedented extremes – Anticipating the inconceivable

Abstract: The key question in Erich Fischer’s presentation is whether we can anticipate record-breaking events in advance (a true “worst-case” scenario). Because observational data are limited, we need large ensembles of climate models to simulate such extremes. By adding tiny perturbations to existing simulations of moderate heatwaves, we can obtain alternative realisations that include more extreme events. 

Translating these unseen events to the local scale remains a challenge. The model resolution is probably adequate for heatwaves, and sometimes for winter precipitation, but not fine enough for summer convective precipitation. Bayesian adjustment and quantile mapping methods can be used to downscale to the city level. 
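A minimal sketch of empirical quantile mapping, assuming synthetic “station” and “model” temperature samples (a real workflow would use observed station data and model output for a common reference period):

```python
import numpy as np

def quantile_map(model_hist, obs, model_new):
    """Empirical quantile mapping: map each new model value to the observed
    value sitting at the same quantile of the historical model distribution."""
    q = np.searchsorted(np.sort(model_hist), model_new) / len(model_hist)
    return np.quantile(obs, np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(1)
obs        = rng.normal(20.0, 3.0, 1000)   # "station" temperatures (°C)
model_hist = rng.normal(18.0, 2.0, 1000)   # cold-biased, under-dispersive model
model_new  = np.array([18.0, 22.0, 26.0])  # new model values to correct

print(quantile_map(model_hist, obs, model_new))
```

Because the mapping is built from ranks, it corrects both the mean bias and the too-narrow spread of the model at once; values beyond the historical model range are clipped to the outermost observed quantile in this simple version.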



How can other users use these tools? 

Currently, the tool is not easy to use, which is why XAIDA is very interested in collaboration. We are currently looking to quantify mortality during heat waves, with boosted worst-case scenarios used as inputs to epidemiological models; we are also modeling multi-year droughts. We are asking stakeholders for their ideas so that experiments can be better targeted. But this is currently a co-development project: it is not possible to produce huge generic data sets on worst-case scenarios.

How long does this take to run? 

Surprisingly quick! We start with 1000 model-years of simulations, and once an event has been identified in this set, resampling can be carried out within 10 days to 2 weeks. The resolution is, however, rather coarse: we are currently considering downscaling the events using a convection-permitting model. The computational gain from pre-selecting the most severe events is even larger for higher-resolution simulations.

How can we evaluate these models to establish trust in their outputs? 

Process understanding is essential: we would only have confidence in a simulation if we understand the driving processes correctly. Typically, these unseen events do not arise from unseen processes, but rather from unusual combinations of well-known processes. In addition, numerous model evaluations have been carried out over most of the distribution. Finally, combining with other lines of evidence is also very important: if different approaches give similar results, this suggests that these could be the kind of events we should be anticipating and planning for.

Useful links: 

#2 – Storyline: Paris 50°C

Speaker: Robert Vautard (IPSL-CNRS)

Title: Hard-case storyline: Paris under 50°C

Abstract: This work grew out of a 2021 workshop between scientists and the City of Paris to revise the Climate Adaptation Plan. Temperatures approaching 50°C had just occurred at a similar latitude to Paris (in British Columbia, Canada), prompting the question: could 50°C happen in Paris? This motivated a crisis exercise involving fire services, police, hospitals, power companies, etc., and reflects the need to adapt to what we anticipate, not only to what we have already experienced. Translated into the physical sciences, the key questions are: is it really possible? When? Under what conditions?

Using a selection of CMIP6 models, comparing temperature distributions over the period 1995-2014 (without bias correction) and taking the urban heat island (UHI) into account, we find that the first such events appear at mid-century, at warming levels above 2°C, even under the SSP2-4.5 scenario. The event does not appear in isolation, but as part of a sequence of heatwaves, with likely consequences for health and water demand; minimum temperatures exceed 30°C. This is a very general method that is easy to implement elsewhere.
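The general idea can be sketched in a few lines. The numbers below are entirely illustrative (not CMIP6 output for Paris, and the UHI increment is an assumption): the question is how much warming, on top of an assumed UHI effect, would push the reference-period record past the 50°C threshold.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical stand-in for heatwave-day maximum temperatures over a
# 1995-2014-style reference period (illustrative values only).
ref_tmax = rng.normal(34.0, 2.5, 2000)

UHI = 2.5          # assumed urban heat island increment (°C)
THRESHOLD = 50.0   # the "Paris at 50°C" storyline threshold

# Warming needed, on top of the UHI, before the reference-period
# record would cross the 50°C threshold.
needed = THRESHOLD - UHI - ref_tmax.max()
print(f"additional warming needed to reach {THRESHOLD:.0f}°C: {needed:.1f}°C")
```

In the real analysis the shift comes from the scenario-dependent warming of each CMIP6 model rather than a single offset, which is what ties the first exceedances to specific decades and warming levels.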



Can you share your experience of how the City of Paris is using this information to prepare?

This information has been taken into account in the revised resilience plan (which was the impetus for the work) – there is an increased awareness that such high temperatures could occur in the future. 

What kind of requests relating to unseen events do you get?

We receive a number of questions from a range of stakeholders, mainly concerning critical thresholds for activities and industries. It depends on the activity, for example critical temperatures for power lines.

Useful links: 

#3 – Importance sampling and stochastic weather generation

Speaker: Pascal Yiou (CEA)

Title: Simulating unprecedented climate events


Abstract: The third presentation dealt with unprecedented climatic extremes, and was prompted by the European heatwave of 2003, which had a return period of over 10^4 years at the time. The probability of such an event, however, increases by a factor of 10 every decade. Within XAIDA, we have developed several strategies: ensemble boosting (presentation 1) and a statistical Stochastic Weather Generator (SWG).

The general characteristics of SWGs are: simple AI, flexibility, low computational cost, physical validation, and random sampling of atmospheric states weighted towards the hottest temperatures. Based on CMIP6 simulations, the 2003 record could be exceeded by 2°C before 2050. This result is consistent with other modelling approaches.
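A hedged sketch of the importance-sampling idea behind such a generator: resampling a catalogue of past days with weights tilted towards the warmest entries produces hotter-than-observed monthly sequences. The catalogue, tilt parameter and sizes are illustrative assumptions, not the actual SWG configuration (which resamples full circulation analogues rather than a single variable).

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy catalogue of daily temperatures (°C).
catalogue = rng.normal(22.0, 5.0, 5000)

def swg_sample(n_days=30, n_sims=200, alpha=0.0):
    """Resample the catalogue day by day; alpha > 0 tilts the sampling
    weights towards the warmest entries (importance sampling), while
    alpha = 0 recovers plain resampling."""
    w = np.exp(alpha * (catalogue - catalogue.mean()))
    w /= w.sum()
    sims = rng.choice(catalogue, size=(n_sims, n_days), p=w)
    return sims.mean(axis=1)          # one 30-day mean per simulation

plain  = swg_sample(alpha=0.0)
tilted = swg_sample(alpha=0.3)
print(f"plain  resampling, hottest 30-day mean: {plain.max():.1f}°C")
print(f"tilted resampling, hottest 30-day mean: {tilted.max():.1f}°C")
```

The tilt is what makes the approach cheap: rare hot sequences are reached with a few hundred simulations instead of the millions a brute-force resampling would need.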



In insurance, the frequency of extreme events over the next 10, 20 or 40 years is important for accurate risk assessment: is it possible to assess the frequency of such extremes?

We have not touched on the question of probabilities, because they are difficult to estimate. Most formulas are based on large-deviation principles (in physics) or asymptotic arguments (in statistics), which hold for infinite samples but are not very accurate for finite ones, with huge error bars on very small numbers.

These uncertainty estimates are even larger for probability ratios, making such estimates largely uninformative. Methods do exist for estimating the return periods of these events, but they would only apply to the world of the models, so we would only be confident if several methods gave similar estimates.
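To see why such estimates are fragile, consider the standard extreme-value recipe: fit a GEV distribution to a short record of annual maxima and extrapolate to long return periods. The sketch below uses invented data and parameters, purely for illustration:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Synthetic 70-year series of annual maximum temperatures (°C);
# the "true" parameters are invented for illustration.
annual_max = stats.genextreme.rvs(c=0.1, loc=38.0, scale=1.5,
                                  size=70, random_state=rng)

# Fit a GEV to the block maxima, then extrapolate to long return periods.
shape, loc, scale = stats.genextreme.fit(annual_max)

def return_level(T):
    """Level exceeded on average once every T years under the fitted GEV."""
    return stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)

for T in (10, 100, 1000):
    print(f"{T:>5}-year return level: {return_level(T):.1f}°C")
```

The 1000-year level is read far outside the 70 years of data, so small changes in the fitted shape parameter move it substantially: this is exactly the finite-sample, huge-error-bar problem described above.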

Nevertheless, in the insurance field, we may be more interested in moderate (30-year) events than in the most extreme ones: in that case, we might be able to provide more useful information. That is why it is so important to combine observations with models, and to take into account the fact that we are in a warming climate.

Here we presented a case study on high temperatures, but the same has been done for cold temperatures: which variables would be most relevant for stakeholders? For example, indoor building temperatures can be calculated, as can high precipitation (although this requires downscaling approaches), etc.

Minimum and maximum temperatures on a given day and over several days are relevant. And of course, precipitation on different timescales.

Useful links: