"Risk Dashboards should serve the stakeholder" | Advanced Risk Dashboards

Sunday, March 13, 2011

A year of catastrophes and Extreme Value Theory

This year has been particularly grim with respect to natural disasters and the damage they have inflicted on the livelihoods of affected communities. Considering we are only a couple of months into the year, the number of catastrophes that have struck countries around the world is quite astonishing. 

To list a few of these disasters alone: Queensland, normally a drought-prone state of Australia, was inundated with so much rain that mass flash flooding ensued; Christchurch was flattened by an earthquake (warning: these images are disturbing) on the 22nd of February; and only two days ago Japan suffered the fifth-largest earthquake since 1900, along with a devastating tsunami whose damage and loss are still being estimated.

From a disaster-management risk perspective, this article investigates methods for predicting catastrophes and their potential loss. 

We have also followed up with a second article on how institutions can cover the fiscal losses from these environmental disasters using structured financial products. That posting can be found here.

Event Prediction and Extreme Value Theory
Anything that occurs frequently is easy to predict because plenty of data points can be captured to produce a tight curve fit.  However, extreme events are infrequent, and the time span between disasters can be long enough that gathering a suitable set of data points for a coherent analysis is an arduous task. Additionally, the relevance of specific data points weakens as the underlying system shifts over time.  

Extreme Value Theory (EVT) helps us solve this problem.  EVT was originally pioneered by Leonard Tippett, a researcher in the cotton industry who found that the strength of a cotton thread was predicated on its weakest fibres, and I dare say the same principle applies to some organisations. All that aside, working with Ronald Fisher, Tippett was able to describe the distributions of extremes, and the work of these two statisticians became the Fisher-Tippett distribution.

Most importantly, EVT helps us answer questions such as how tall to build a wall to keep the sea out, or how strong to design a building so it remains standing through a tremor. It is important to note at this point that building design and strength involve far more complex dynamics than simply selecting durable materials during construction. Nonetheless, if you need to know how high to build a breakwater for floods, EVT can answer that question. Jack King's book on operational risk puts it clearly.  
Using the historical maximum level of the sea is an obvious choice when building a breakwater but should the breakwater be only a few inches above the historical maximum or more?  And if more, then how much more?  | Jack King
In effect, the problem is defined by the maximum height of the sea over period t for so many periods, and the objective of the exercise is to estimate the number of occasions on which a particular height will be exceeded by the ocean.

Jack King explains the answer clearly using a binomial distribution of Bernoulli trials to derive a mean number of exceedances:

mean number of exceedances = N × M / (n + 1)

where n is the number of periods in the sample, M is the position of the ordered statistic (M = 1 being the largest observation) and N is the design number of periods.  

In Jack King's example, the problem might examine maximum yearly sea heights for the last 50 years, yet you only want to build the wall to withstand exceedances over a 20-year horizon. This is a common choice because after 20 years the breakwater might need to be rebuilt anyway due to the effects of erosion. 

So how high should the wall be?
Well, in this case you would rearrange the formula to solve for M, setting the mean number of exceedances to one: M = (n + 1) / N = 51 / 20 = 2.55. In other words, you should select position 2.55 on the ordered maximum scale, somewhere between the second and third highest values observed over the 50-year period, when determining the size of the breakwater.
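King's rearranged rule can be sketched in a few lines of Python. The function name and sample figures below are illustrative rather than taken from his book; the only substance is the relation mean exceedances = N × M / (n + 1) solved for M:

```python
# Sketch of the exceedance rule: pick the order-statistic position M
# whose level we expect to be exceeded once over the design horizon.
# (Function name and numbers are illustrative, not from King's text.)

def design_position(n_periods: int, design_horizon: int) -> float:
    """Order-statistic position M (1 = largest observation) such that
    the mean number of exceedances N * M / (n + 1) equals one."""
    return (n_periods + 1) / design_horizon

# 50 years of sea-level maxima, 20-year design horizon:
M = design_position(50, 20)
print(M)  # 2.55
```

A fractional M simply says to interpolate between neighbouring order statistics in the sorted sample of yearly maxima.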

One of the most important advantages of EVT is that the approach encourages analysts to model the Mean Excess Zone; that is, mean excess plots of values in the extreme tail generally tend to follow asymptotic behaviour, and a shape parameter can be extracted from them that allows us to better understand the expected experience from a specific extreme event.

A mean excess plot - Straight Line

In our chart example above, the shape parameter for the event data curve is below 1 for each high threshold, and the mean excess in this case can be viewed as a data set that converges to a straight line.  Another study of a shape parameter might show cases where the mean excess function doesn't converge at all, and the analyst will then know they are dealing with an entirely different type of event potential. 
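The empirical mean excess function behind such a plot is simple to compute: for each threshold u, average the amount by which observations exceed u. A minimal sketch with illustrative data:

```python
# Empirical mean excess function e(u) = average of (x - u) over all
# observations x > u. The data list below is purely illustrative.

def mean_excess(data, threshold):
    """Average amount by which observations exceed the threshold,
    or None when nothing in the sample exceeds it."""
    excesses = [x - threshold for x in data if x > threshold]
    return sum(excesses) / len(excesses) if excesses else None

data = [1.2, 2.5, 3.1, 4.8, 5.0, 6.7, 9.3, 12.4]
for u in (2.0, 4.0, 6.0):
    print(u, mean_excess(data, u))
```

Plotting e(u) against u over a grid of high thresholds gives the mean excess plot; a roughly straight line, as in the chart above, is the pattern the analyst is looking for.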

Mean excess plots are used by analysts in different fields of risk to measure different quantities. Insurance underwriters translate the mean excess into the expected claim size, while a reliability statistician might use mean excess plots to estimate the expected residual life of a system, or in short, how long something will last.

Another really useful product of modelling the extreme event is insight into whether the system is changing over time.

[1] Are events becoming more extreme when back-fitted to experience? 

[2] Are we overdue for the single big event that changes everything?
Large earthquakes have rumbled along a southern section of the San Andreas fault more frequently than previously believed, suggesting that Southern California could be overdue for a strong temblor on the notorious fault line, a new study has found. | Jia-Rui Chong, 2009
More can be found on Extreme Value Theory here.

The Queensland Floods
With respect to the recent floods in Brisbane, this is not the first time the city has experienced such an event, and minor flooding is a relatively common occurrence because the Major Threat Zone has been set well below the artificial 100-year scale threshold. 

The 100 year scale is at the top end of the Mean Excess Zone we discussed earlier in this article. It is also a place that should be measured and protected carefully or to put it another way, if you want to reduce damage outcomes while living through an extreme episode, you will need a strategy for the extreme.
The 100-year flood is more accurately referred to as the 1% annual exceedance probability flood, since it is a flood that has a 1% chance of being equaled or exceeded in any single year.
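The 1% annual exceedance probability compounds quickly over a long horizon, which is worth making concrete. A small sketch (the function name is ours, not the Bureau's):

```python
# Probability of at least one exceedance of the "100-year" level
# (1% annual exceedance probability) over an n-year horizon,
# assuming independent years.

def prob_at_least_one(annual_p: float, years: int) -> float:
    """1 minus the chance of no exceedance in any of the years."""
    return 1 - (1 - annual_p) ** years

# Over a 30-year mortgage the 100-year flood is far from unlikely:
print(round(prob_at_least_one(0.01, 30), 2))  # 0.26
```

Roughly a one-in-four chance over thirty years, which is why "100-year flood" is such a misleading label for residents.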
The Bureau of Meteorology has depicted the flood tables in Queensland by putting the Major Threat Zone below the Mean Excess Zone, and Brisbane experiences an average of five historical flood events per hundred years. With this in mind, we could deduce from an Extreme Value Theory perspective that Brisbane is high on the inevitability scale for another disaster. In the model, event clustering might play to the advantage of Queenslanders and against our argument; however, unless specific restructuring occurs to the ecological design of the climate control systems in Brisbane, the potential for another disaster within 15 to 20 years is ever present.
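One rough way to put a number on that 15-to-20-year claim is to treat the historical average of five events per hundred years as a Poisson arrival rate. This is our simplifying assumption for illustration, not the Bureau's model, and it ignores the event clustering mentioned above:

```python
import math

# Illustrative Poisson sketch: five flood events per hundred years
# treated as a constant arrival rate (an assumption; real floods
# cluster and the rate is not stationary).

def prob_at_least_one_event(rate_per_year: float, years: int) -> float:
    """Poisson probability of one or more events over the horizon."""
    return 1 - math.exp(-rate_per_year * years)

print(round(prob_at_least_one_event(5 / 100, 20), 3))  # 0.632
```

Under those assumptions, a roughly two-in-three chance of at least one flood event over any 20-year window is consistent with the "ever present" potential described above.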

This is counterintuitive: one would think that setting the Major Threat Zone lower than the Mean Excess Zone is a sensible thing to do, and that it should surely encourage people to be on alert.  The issue, however, is that it also drops the bar from where we really need to protect ourselves.  In the world of Extreme Value Theory, a tail event is what we should be modeling for, not avoiding normal flood episodes. 

Brisbane City Flood Data + 100 Year Scale [Bureau of Meteorology]

Interestingly, the images of the Brisbane flooding in 1974 bear a strong resemblance to those from the washout this year. Statistically speaking, one would expect this phenomenon; however, you can nearly guarantee the loss value will grow just as the city has, and as Brisbane becomes more important, so do the stakes.

An old Police car blocked by the 1974 Brisbane Flood

Given this backdrop of data, the Brisbane City Council unit for disaster planning should consider a specific action or strategy to minimize loss potential from this now "known event". As a unit, they should either prepare for future disasters or at the very least accept the potential future costs as part of living in a volatile climate.
