

Every city authority in the world faces tough decisions on how to deal with present and future losses from the changing likelihood and intensity of extreme weather. Identifying the right entry points for working with stakeholders is therefore crucial.


Between 4 and 11 May 2024, we held four workshops for stakeholders. The workshops were organised around four themes: Cities, Legal, Energy and Agriculture.


Speakers: Marlene Kretschmer (University of Leipzig, University of Reading), Fredi Otto (Imperial College London) and Davide Faranda (IPSL-CNRS).

#1 – Method of causal inference for attribution

Speaker: Marlene Kretschmer (University of Leipzig, University of Reading)

Title: Causal inference for attribution


The first presentation focused on causal inference for attribution, centred on the storyline approach. The aim is to account for the multiple causal factors underlying extreme events in a physically coherent way: dynamical conditions and regional warming (both of which may depend on global warming), as well as vulnerability and exposure when assessing impacts.

Causation differs from correlation, but we cannot identify the causal chain simply by looking at observations: the data do not speak for themselves. We can use climate models to perform interventions, but climate models do not always represent reality accurately. Causal inference is a rich mathematical framework for understanding these relationships in data: it allows us to formally include expert knowledge in statistical analysis, and thereby infer causal effects.
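The confounding problem described above can be sketched with a toy example (this is purely illustrative, not the speaker's actual analysis): a hypothetical "circulation" index drives both a regional driver and the outcome, so regressing the outcome on the driver alone overstates its effect, while adjusting for the confounder, identified from expert knowledge, recovers the causal effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical toy system: a "circulation" confounder C drives both a
# regional driver X and the extreme-outcome variable Y.
C = rng.normal(size=n)                      # confounder (e.g. a circulation index)
X = 0.8 * C + rng.normal(size=n)            # driver of interest
Y = 1.5 * X + 2.0 * C + rng.normal(size=n)  # outcome; true causal effect of X is 1.5

# Naive regression of Y on X alone is biased upward by the confounder.
naive = np.polyfit(X, Y, 1)[0]

# Adjusting for C (the "backdoor" variable chosen using expert knowledge)
# recovers the causal effect.
A = np.column_stack([X, C, np.ones(n)])
adjusted = np.linalg.lstsq(A, Y, rcond=None)[0][0]

print(f"naive slope:    {naive:.2f}")     # inflated (~2.5)
print(f"adjusted slope: {adjusted:.2f}")  # close to the true effect, 1.5
```

The point is the one made in the talk: the data alone (the naive regression) give the wrong answer; the correct adjustment set comes from expert knowledge of the system, formalised by the causal-inference framework.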

An example is van Garderen et al. (2021) on the Russian heatwave, using nudged storylines. The storyline approach is complementary to the probabilistic approach. XAIDA attempts to combine these approaches to link impacts to known large-scale drivers such as the QBO, ENSO, the MJO and the polar vortex; this enables the role of natural and human factors in extreme events to be quantified.



Which of these different approaches might produce the most confident attribution/causation statements for legal use?

This is the key question for the lawyers. Confidence depends on the question being asked; some methods will be more useful than others in certain cases. But do not focus too much on trying to pick 'the best method': we can be more confident if we use several complementary methods and they produce similar results.

#2 – Operational services: World Weather Attribution 

Speaker: Fredi Otto (Grantham Institute for Climate Change and the Environment, Imperial College London)

Title: Rapidly attributing extreme events & their consequences where it matters



The first operational attribution service presented was World Weather Attribution, which rapidly attributes extreme events and their consequences.

Language matters: we do not ask “Was this climate change?”, but “How has climate change altered the probability and intensity of this type of extreme event?”. We need to realise that results of an attribution study are only valid for the specific event defined: we will get a slightly different answer for different event definitions.
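The question "How has climate change altered the probability of this type of event?" is typically answered with a probability ratio between a factual and a counterfactual climate. A minimal sketch, with entirely made-up Gumbel-distributed annual maxima and a hypothetical 38 °C event threshold (none of these numbers come from any real study):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical samples of annual-maximum temperature (°C) drawn from a
# counterfactual (pre-industrial) and a factual (present-day) climate.
counterfactual = rng.gumbel(loc=34.0, scale=1.5, size=20_000)
factual        = rng.gumbel(loc=35.2, scale=1.5, size=20_000)

threshold = 38.0  # the observed event magnitude (assumed, for illustration)

p0 = np.mean(counterfactual > threshold)  # exceedance probability without climate change
p1 = np.mean(factual > threshold)         # exceedance probability with climate change

print(f"p0 = {p0:.4f}, p1 = {p1:.4f}")
print(f"probability ratio PR = {p1 / p0:.1f}")
```

Note how the answer depends on the event definition: changing `threshold`, or the region and duration implicit in the sampled variable, changes the probability ratio, which is exactly why results are only valid for the specific event as defined.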

This is illustrated by the Madagascar heatwave study, which used three event definitions: 1) October mean heat over the region, 2) 7-day night-time temperatures in Antananarivo, and 3) October mean heat per grid cell. The conclusions are qualitatively similar, but the exact numbers differ. It is also important to understand that sometimes we do not have the data to carry out a study – and to remember that sometimes we do not find that climate change was a significant contributor (for example, the Black Sea storm study, where the statistical analysis encompassed the possibility of no change in precipitation, although our physical process understanding leads us to expect a small change).



What types of impacts have well-defined macroeconomic damage functions in which we can have confidence? Can we quantify the damage from individual events vs extreme phenomena in general?

For heat extremes, we are really confident in the sign of the change, but the exact quantification can be variable. If your damage function is sensitive to the exact value, then it will not be very reliable. In this case, we might use a lower bound just to increase confidence. For precipitation, the numbers do not change that much, so we can estimate a range of outcomes, albeit with higher uncertainties. It should be noted that for crops, for example, these functions are well-defined and therefore a good candidate for impact attribution.

To what extent are event definitions objectively defensible? Is it something that could be attacked in a legal setting?

It is not easy – and 'objective' is always a tricky term. World Weather Attribution always chooses the definition of the event according to the impacts. So, if there is a well-defined area where, for example, extreme rainfall and subsequent damages occurred, this definition is quite defensible. But often there are antecedent conditions. The important thing is to be transparent about how events are defined; we can also examine several event definitions to test sensitivity. This is the most vulnerable part of the whole process. The storyline approach is different: it involves no such subjectivity, since it examines the exact event as it actually happened. The two approaches are highly complementary.

Can we go back to comparing probabilistic vs storyline approaches, especially with respect to the Russian heatwave, where the storyline approach indicated 2°C of warming and the probabilistic approach indicated 1°C? It would be useful to break down how the two approaches differ.

The storyline approach considers the exact event, with those circulation patterns. In the probabilistic approach we define the event based on the impact (in this case, mean July temperatures) and do not condition on the circulation pattern, so the probabilistic approach is talking about July temperatures generally. Also, the storyline approach concluded 'up to 4°C' at the peak of the heatwave; the probabilistic approach was looking at a larger-scale event, and we expect to find less extreme extremes when we take a broader spatial/temporal event definition (which is necessary in order to ensure that we have enough data).


#3 – Operational services: ClimaMeter

Speaker: Davide Faranda (IPSL-CNRS)

Title: ClimaMeter: Putting Weather Extreme Events in a Climate Perspective



The second operational attribution method presented was ClimaMeter. ClimaMeter is closer to the storyline approach, but also uses the idea that we can look at changes in a class of similar events. At the end of the last decade, confidence in attribution studies was low for many events: only extreme cold and heat could really be attributed with any confidence. Nowadays, attribution proceeds in three steps: define the event, define the factual and counterfactual worlds, and evaluate the role of natural vs forced variability. Attribution conditioned on circulation patterns can provide additional spatial insight into local changes, which are not uniform over the domain.

The probabilistic approach works from impacts to climatic hazards; ClimaMeter goes from climatic hazards to weather conditions. ClimaMeter mainly looks at rain, but also covers some heat/cold and wind events. It has covered 40 events in 6 months, most of which have been influenced by climate change in some way. The goal is to communicate with the press. Note that at the moment ClimaMeter does not use climate models at all, only reanalysis, because the aim is to produce a very rapid study.

Events are defined according to surface pressure patterns, and the studies report changes in temperature, precipitation and wind between 1979–2000 ('not much affected by climate change') and 2000–now ('strongly affected by climate change'). The dominant phases of ENSO, PDO and AMO in the two periods are also considered. We can also gauge how unusual the event was from the number of high-quality analogues found. Future work is to include climate model projections for full attribution, and to extend the approach to impact attribution.
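The analogue idea behind this kind of method can be sketched very roughly (this is a simplified illustration with random synthetic data, not ClimaMeter's actual protocol): find the days in each period whose surface pressure map is closest to the event day's map, then composite a variable of interest over each analogue set and compare.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily sea-level-pressure anomaly maps, flattened to vectors,
# for an earlier and a recent period, plus the map for the event day.
# (Shapes and values are invented for illustration.)
past    = rng.normal(size=(8000, 50))   # earlier-period pool
present = rng.normal(size=(8000, 50))   # recent-period pool
event   = rng.normal(size=50)           # the event-day pressure map

# Synthetic co-located daily mean temperatures for each pool day.
temp_past    = rng.normal(loc=20.0, scale=3.0, size=8000)
temp_present = rng.normal(loc=21.0, scale=3.0, size=8000)

def best_analogues(pool, target, k=30):
    """Indices of the k pool days closest (Euclidean distance) to the target map."""
    dist = np.linalg.norm(pool - target, axis=1)
    return np.argsort(dist)[:k]

idx_past    = best_analogues(past, event)
idx_present = best_analogues(present, event)

# Compare the variable of interest composited over the two analogue sets.
delta = temp_present[idx_present].mean() - temp_past[idx_past].mean()
print(f"analogue-composite temperature change: {delta:+.1f} °C")
```

The number and quality of analogues found also indicates how unusual the event is: an event with few close analogues in either period is, by this measure, exceptional.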



Is it possible for the finance/insurance sector to use this tool/dataset?

The MSWX data is freely available. ClimaMeter can provide the dates of the analogues identified for the events studied, and can carry out studies on request (with fees for private-sector collaboration). The protocol paper is currently in preprint; the events will probably be published in EGUsphere. We note that ClimaMeter (as well as WWA and causal inference) will also be introduced in the XAIDA summer school.

How could the method be used to look at impacts? This is really important in a legal context.

You can have a look at the Venice study on the ClimaMeter website, which evaluated the effectiveness of the flood barrier using flood damage models. The city is still not well prepared for some events: not all cyclones behave in the same way, and the impacts depend on the characteristics of the event (e.g. different types of cyclones, or dry vs moist heatwaves).

How have you tested the sensitivity of the results to different observational sites? We’ve found that MSWX can behave quite oddly.

Studies are repeated using ERA5 to check sensitivity to the dataset. But we have not looked at the earlier time series (results are sensitive to the time period considered); other groups are also examining this in a slightly different, more systematic way.
