Predicting Extreme Events

By: R Ramesh
With drastic alterations in climate anomalies, there is a need for adequate mitigation and preparedness computations that can help reduce the impact of environmental disasters. Scientific predictions and planned methodologies can help, although it is yet not possible to completely safeguard against the unpredictable and uncontrollable forces of nature.
Planning and Mitigation

Any disastrous event, a plane crash, a train accident, a powerful earthquake or cyclone, a stock market crash, a young person suffering a heart attack, the collapse of a sea wall or of bridges over rivers, even mass extinctions such as that of the dinosaurs around 65 million years ago, occurs with an extremely low but finite probability. Such events are traditionally known to follow, as does radioactive decay, the statistical distribution known as the Poisson distribution. They are now known in the scientific literature as ‘Extreme Events’, or ‘Xevents’ for short. Any observant person might have noticed that extreme events seem to have been occurring more frequently in the recent past. For example: hurricane Katrina over the USA a few years ago; unprecedented rain over Colaba in Mumbai, 94.4 cm in just 24 hours (26 July 2005), a very significant fraction of the annual total, which even prompted the British Broadcasting Corporation (BBC) to make a documentary on the event; and the cloudburst in Ladakh (5 August 2010) around midnight, when a catastrophic mudslide wiped out hundreds of people in the lower Leh and Choglamsar areas. Then again, floods in Bihar recur every other year with devastating regularity. Scientists are trying to understand and predict such Xevents well in advance. This essay attempts to give the generalist reader interested in science an essence of these efforts.
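The Poisson picture sketched above can be made concrete in a few lines of code. The flood rate used here is an invented number, purely for illustration:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Probability of observing exactly k events when the mean rate is lam."""
    return (lam ** k) * exp(-lam) / factorial(k)

# Hypothetical rate (not from the article): a river basin that sees on
# average 0.5 major floods per year. The chance of two or more floods in
# a single year is low, but finite.
lam = 0.5
p_zero_or_one = poisson_pmf(0, lam) + poisson_pmf(1, lam)
p_two_or_more = 1.0 - p_zero_or_one
print(round(p_two_or_more, 4))  # about 0.09, i.e. roughly once a decade
```

Even a "rare" event, then, is only rare per year; over enough years its occurrence becomes almost inevitable.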

The breach of the river Kosi resulted in flooded roads.

 

Methodologies to Compute Extremities

An important theorem in statistics, the Central Limit Theorem, states that regardless of the underlying distribution that the occurrence of an event follows, large samples of such events would follow the Gaussian (or normal) distribution. It has a mean value µ (the Greek letter pronounced ‘mu’) and a standard deviation σ (sigma). For example, the mean monsoon rainfall over India is 90 cm (µ = 90) and its standard deviation is 10 per cent (σ = 10 per cent of 90 = 9). The probability that the value falls between x and x+Δx (where Δx is very small) can be calculated using the Gaussian distribution. For example, the probability that the rainfall in a particular year is between 90 and 91 cm (here, Δx = 1 cm) is given by (1/σ)(1/√(2π)) exp[−0.5(0.5/9)²] × Δx, which gives a value of about 0.044. The probability of having a monsoon rainfall between 81 and 99 cm (one standard deviation on either side of the mean) is ~68 per cent. We call it a drought year if the rainfall is less than 81 cm and a flood year if it is more than 99 cm. Here, an Xevent would be like the rainfall at Colaba in 2005, i.e., 94 cm in one day. The monsoon season (June to September) has roughly 90 rainy days in all, so the ‘normal mean’ rainfall on a single day is expected to be about 1 cm. The probability of this extreme event, if calculated according to the Gaussian distribution, therefore works out to be close to zero. Scientists thus discarded the Gaussian distribution as not good enough to treat Xevents, and newer methods were developed, now known as Extreme Value Theory. One such distribution developed to handle Xevents is the Gumbel distribution, named after E J Gumbel, who studied the floods of the Nile. There are some others too: the Weibull and Fréchet distributions.
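The Gaussian calculation above can be verified with a short script; the mean and standard deviation are the values quoted in the text:

```python
from math import exp, pi, sqrt

MU, SIGMA = 90.0, 9.0  # Indian monsoon mean (cm) and standard deviation, from the text

def gaussian_density(x, mu=MU, sigma=SIGMA):
    """Gaussian probability density at x."""
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

# P(90 cm <= rainfall <= 91 cm) ~ density at the midpoint times Delta x = 1 cm
p_90_91 = gaussian_density(90.5) * 1.0
print(round(p_90_91, 3))  # about 0.044

# A 94 cm day against a 'normal' daily mean of ~1 cm sits dozens of standard
# deviations from the mean, so its Gaussian probability is effectively zero.
```

This is exactly why the Gaussian fails for Xevents: its tails fall off so fast that a Colaba-2005-sized day is assigned essentially zero probability, even though such days do happen.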

H J Fowler and colleagues in 2009 computed when the changes in extreme precipitation due to climate change would become detectable. They used climate models from the BBC Climate Change Experiment (CCE) over the UK. They fitted non-stationary Generalised Extreme Value (GEV) models to the climate model outputs and compared those to stationary GEV models fitted to the parallel control model experiments. They defined the time of detectable change as the time at which they would reject, at the α = 0.05 significance level, the hypothesis that the 20-year return levels of the two runs were equal. They found that the time of detectable change depended on the season: changes to winter extreme precipitation may be detectable by the year 2010, while changes to summer extreme precipitation were not detectable even by 2080. Unfortunately, the climate model simulated extreme precipitation quite differently from the observations, perhaps because it produced a negative estimate of the GEV shape parameter, unlike the observations, which produce a slightly positive (0.0–0.2) estimate.
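As a rough, self-contained illustration of the return-level idea used in such studies (a simplified sketch, not the Fowler et al. analysis itself), one can fit the Gumbel case of the GEV, i.e. zero shape parameter, by the method of moments. The annual-maximum rainfall data below are invented for illustration:

```python
from math import log, pi, sqrt

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

# Hypothetical annual-maximum daily rainfall (cm) for ten years
annual_max = [8.2, 9.1, 7.5, 12.3, 10.0, 8.8, 11.4, 9.6, 13.1, 10.7]

n = len(annual_max)
mean = sum(annual_max) / n
var = sum((x - mean) ** 2 for x in annual_max) / (n - 1)

# Method-of-moments Gumbel parameters: mean = mu + gamma*beta, var = (pi*beta)^2/6
beta = sqrt(6 * var) / pi          # scale, from the variance
mu = mean - EULER_GAMMA * beta     # location, from the mean

def return_level(T):
    """Rainfall exceeded on average once every T years under the fitted Gumbel."""
    return mu - beta * log(-log(1 - 1 / T))

print(round(return_level(20), 1))  # about 13.4 cm for these invented data
```

The 20-year return level is the quantity Fowler and colleagues tracked: the value exceeded with probability 1/20 in any given year. Full GEV analyses also estimate the shape parameter, which controls how heavy the tail is; the Gumbel case fixes it at zero.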

If the climate of the Earth changes in the future, say it warms due to anthropogenic activities, the resulting climatic and hydrological extremes could have a devastating effect on the economies of populous and poor Asian countries. The Intergovernmental Panel on Climate Change (IPCC) scientists are keenly interested in assessing how climate change has already affected, and will continue to affect, the characteristics of extreme weather events. In the Northern Hemisphere, studies have found increasing trends both in the mean rainfall and in its extremes; climate models suggest that these trends may continue under the global warming scenario. However, questions about changes in extreme weather at more local scales, and about the causes of the observed changes, are yet to be satisfactorily answered. Though there is a significant human influence on water resources, the attribution of extreme precipitation trends to human influence is not certain. It appears likely, however, that extremes in rainfall will become detectable at smaller spatial scales, perhaps within the next 50 years. Some think that changes in moderately extreme rainfall are more robustly detectable than changes in the mean rainfall, primarily because, as precipitation increases (a warmer atmosphere can hold more water vapour; according to the Clausius-Clapeyron equation, every degree Celsius increase in temperature raises the saturation humidity of the atmosphere by about 7 per cent), a greater proportion of rainfall is expected to come in Xevents.
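The Clausius-Clapeyron scaling quoted above compounds with warming. A quick sketch, taking the 7 per cent per degree figure at face value:

```python
def moisture_increase(warming_deg_c, rate=0.07):
    """Percentage increase in saturation humidity after the given warming,
    assuming the ~7 per cent per degree Celsius Clausius-Clapeyron scaling."""
    return ((1 + rate) ** warming_deg_c - 1) * 100

for dt in (1, 2, 3):
    print(dt, round(moisture_increase(dt), 1))  # 7.0, 14.5, 22.5 per cent
```

So a 3 degree Celsius warming implies roughly a fifth more water vapour available to fall in intense bursts, which is why the extremes are expected to intensify faster than the mean.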

As Einstein is reputed to have said, it is difficult to make predictions, especially about the future. We cannot, of course, obtain observations of future climate conditions at present, so we rely on what is known about the Earth's likely response to altered atmospheric conditions, based on climate models. Such model output is deterministic, that is, given the same initial conditions, the model output would remain the same. However, because of the chaotic nature of the models and their sensitivity to initial conditions, statistical models can be used to analyse the output in much the same way as they are used to analyse observed weather data.
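A toy system, not a climate model, shows how a rule can be fully deterministic yet exquisitely sensitive to its initial conditions. Here the chaotic logistic map is run twice from starting points that differ by one part in a billion:

```python
def logistic_trajectory(x0, steps, r=4.0):
    """Iterate the logistic map x -> r*x*(1-x), a standard chaotic toy system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2, 100)          # deterministic: same x0, same run
b = logistic_trajectory(0.2 + 1e-9, 100)   # a one-in-a-billion nudge

# The tiny initial offset grows until the two runs bear no resemblance.
print(round(max(abs(x - y) for x, y in zip(a, b)), 2))
```

Rerunning with the same starting value reproduces the trajectory exactly, which is the sense in which model output is deterministic; the divergence between the two nudged runs is the sense in which it is chaotic, and why its output is analysed statistically.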

Using model outputs for decision making has its own problems. Because models have a relatively coarse spatial resolution (typically 2.5° latitude by 2.5° longitude), they cannot resolve significant sub-grid-scale features such as mountains and in-cloud processes. Different models may simulate quite different responses, mainly because of the way some processes and feedbacks are modelled. How different model parameterisations affect the nature of extreme precipitation in the model output needs to be carefully investigated. Owing to their coarse resolution, models cannot be expected to represent extreme rainfall events as they are actually observed.

Extreme Value Theory is an emerging branch of science and has other applications besides climatology. For example, it may be useful in disaster management, in mountain hazards such as landslides and avalanches, in freak ocean waves, and in Xevents in the human brain such as epileptic seizures.

 
