Rieks Joosten has recently released an excellent paper and concept for assessing networked risks, which has spawned a debate on the G31000 forum about how to measure such risks.

**Modeling Networked Risks**

Rieks Joosten's paper [LINK] is conceptually interesting and well thought through, certainly a credible area of risk management to be researching solutions for, and the "concepts for managing risks in complex systems" need more attention from the practitioner community. In my opinion, managing risk in complex environments is one critical domain of any 'risk charter' where the craft of enterprise risk will eventually come of age, or at least one hopes so.

My one key criticism of the paper is the following:

The publication overall is wonderful; we just need to move the mathematics into the realm of a stochastic system and away from deterministic numerical operations. All is well in the world of risk until the paper attempts to perform a risk assessment in section 5.3. The diagram on page 22 is not mathematically coherent and needs to be replaced with a much more rigorous stochastic measurement technique.

G31000 - Networked Risk Management | Martin Davies

In defense of my remarks, Rieks Joosten wrote:

The majority of people seem not to be proficient in such techniques. In our experience, they often assess risks in an artisan fashion - they do estimate the height of a risk, yet cannot provide 'scientific' (or mathematical) underpinnings.

G31000 - Networked Risk Management | Rieks Joosten

This is one of the reasons why risk management is far from straightforward to measure. When people say to me 'Keep It Simple' in the context of risk management, they are also generally announcing to the world that they are at best an artisan and at worst a layperson.

At times it is advisable for this vast "majority of people", or the masses, to hand over their requirements for quantifying risk to a trained practitioner. If one moves away from the risk management fraternity specifically, that is how it works elsewhere: in engineering, science, auto mechanics, aviation, farming and so on.

If you are sick you can attempt your own assessment or go and see a trained physician for a structured diagnosis. If you need to draw up plans for a building, you can do this yourself with a notepad and a handful of crayons, or you can pass your detailed requirements to an architect. This brings us to the wonderful remark regarding the artisan vs the mathematician: a century-old debate, for what it's worth, and one that is likely to be with us for a long time [LINK].

**Deterministic vs Stochastic**

The problem with measuring risk is that the assessor is 'dimensioning' something that is probabilistic in nature, and that requires stochastic techniques. You can't add up risk or use simple arithmetic operators (+ - x /) because risk may or may not actualize in the future; risk doesn't physically exist, and when it does, it is the effect of something that was at best predetermined by a stakeholder in the past.

Axiom I : Risk is the effect of uncertainty.

Acknowledging this primary axiom inevitably leads us into a subsequent dilemma: risk systems are inherently stochastic in nature, and the "Networked Risk Management" paper will consequently need to borrow from mathematics beyond deterministic operations.

Axiom II : Risk can't be coherently represented in deterministic ways.

Additionally, if there is a network of factors operating to create an uncertain outcome, it follows that any single factor, or combination of factors, can create feedback loops that affect, correlate with, or have co-variant inter-dependencies with other factors and outcomes. This inter-dependency creates a net risk outcome that is greater or lesser than what was individually assessed and 'added up'.

Axiom III : Networked Risk Factors may have potential correlated inter-dependencies that have to be modeled.
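Axiom III can be illustrated with a short simulation. The sketch below is a minimal, hypothetical example; the two lognormal loss factors and the dependence parameter rho are my own illustrative assumptions and do not come from Joosten's paper. Positive correlation leaves the expected aggregate loss roughly unchanged but fattens its tail, which is exactly the effect that 'adding up' individually assessed risks misses.

```python
import math
import random

def simulate_aggregate_loss(rho, n=50_000, seed=42):
    """Total loss from two correlated lognormal risk factors.

    rho is the correlation of the underlying Gaussian drivers;
    the marginal distribution of each factor is the same for any rho.
    """
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        x = rng.gauss(0, 1)
        z = rng.gauss(0, 1)
        y = rho * x + math.sqrt(1 - rho * rho) * z  # correlated driver
        totals.append(math.exp(x) + math.exp(y))    # lognormal losses
    return totals

def percentile(data, p):
    s = sorted(data)
    return s[int(p * (len(s) - 1))]

independent = simulate_aggregate_loss(rho=0.0)
correlated = simulate_aggregate_loss(rho=0.8)

# The mean barely moves, but the tail of the aggregate widens.
print("p95 independent:", percentile(independent, 0.95))
print("p95 correlated: ", percentile(correlated, 0.95))
```

Because the same seed drives both runs, the comparison is paired: only the dependence structure changes between the two aggregates.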

It is a bit of a pickle but these three axioms leave us with a handful of options when presented with the requirement to model a network of risk factors.

[1] We can use numerical operations which give single deterministic results that are incoherent and consequently misleading. These simple calculations lack commercial rigour.

or

[2] Do nothing; simply label a group of risks or a type of uncertainty as high, medium or low.

or

[3] Apply models that create a trajectory or range of results, derived from a stochastic system that is sensitive to correlated inter-dependencies and presents its results with confidence levels.
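Option [3] can be sketched in a few lines of Monte Carlo. The frequency and severity assumptions below are hypothetical placeholders (a Poisson event count with a mean of 3, lognormal severities), chosen only to show the shape of the output: not a single number, but a distribution from which a median and a 95th percentile can be read off.

```python
import math
import random

def annual_loss(freq_mean, sev_mu, sev_sigma, rng):
    """Simulate one year: a Poisson event count, then a lognormal severity per event."""
    # Knuth's algorithm for a Poisson draw (the stdlib has no poissonvariate).
    limit, k, p = math.exp(-freq_mean), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    events = k - 1
    return sum(rng.lognormvariate(sev_mu, sev_sigma) for _ in range(events))

rng = random.Random(1)
years = sorted(annual_loss(3.0, 0.0, 1.0, rng) for _ in range(20_000))

median = years[len(years) // 2]
p95 = years[int(0.95 * (len(years) - 1))]
print(f"median annual loss ~ {median:.2f}, 95th percentile ~ {p95:.2f}")
```

The gap between the median and the 95th percentile is precisely the information a single deterministic figure throws away.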

All this aside, none of it is to say that the logic, importance or work behind Rieks Joosten's "Networked Risk Management" system isn't outstanding and moving in the right direction, because I believe it is.

**ISO 31000 Notes**

There are several references to networked risks in ISO 31000, but two specific areas that come to mind in the context of risk assessment would be section 5.3.5 and section 5.4.3 of the standard.

In **ISO 31000 5.3.5** it states: "whether combinations of multiple risks should be taken into account and, if so, how and which combinations should be considered".

In **ISO 31000 5.4.3** it states: "An event can have multiple consequences and can affect multiple objectives".

The study of all of this moves us into the realm of networked factors interacting in multidimensional ways, and that is what the paper "Networked Risk Management" endeavors to explore.

When quantifying risk (uncertainty) in Complex Multidimensional Interdependent Environments, some of the models worth exploring may include:

[] Bayesian networks

[] Markov Chains (with memory), simulated via Markov Chain Monte Carlo

[] Binomial trees or fault tree analysis

[] Partial Least Squares Path Modeling

[] Random Forests

[] Latent Trait Modeling

[] Distribution Based Clustering

[] Agent Based Modeling
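As a taste of the first item on that list, here is a toy Bayesian network evaluated by exact enumeration. The scenario and every probability in it are invented for illustration (a production halt driven by two root risk factors); the point is that such a network yields both a forward risk figure, P(halt), and a diagnostic posterior, P(supplier failure | halt).

```python
from itertools import product

# Hypothetical priors for two independent root risk factors.
P_SUPPLIER = {True: 0.10, False: 0.90}  # supplier failure
P_POWER = {True: 0.05, False: 0.95}     # power failure

# Hypothetical conditional probability table: P(halt | supplier, power).
P_HALT = {
    (True, True): 0.99, (True, False): 0.70,
    (False, True): 0.60, (False, False): 0.01,
}

def p_halt():
    """Forward inference: marginal probability of a production halt."""
    return sum(P_SUPPLIER[s] * P_POWER[w] * P_HALT[(s, w)]
               for s, w in product([True, False], repeat=2))

def p_supplier_given_halt():
    """Diagnostic inference: posterior on the supplier, given a halt occurred."""
    joint = sum(P_SUPPLIER[True] * P_POWER[w] * P_HALT[(True, w)]
                for w in [True, False])
    return joint / p_halt()

print("P(halt) =", round(p_halt(), 4))
print("P(supplier failure | halt) =", round(p_supplier_given_halt(), 4))
```

Real networks are larger and use dedicated libraries, but even this small enumeration shows the inter-dependency bookkeeping that Axiom III demands.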

(Frequency) x (Magnitude) = risk is flawed in complex environments, indeed in any risk measurement, and should be avoided.
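A short simulation makes the flaw concrete. Both hypothetical risks below have an identical "Frequency x Magnitude" expected loss of 1.0 per period, yet their tails, which are what a risk manager actually has to fund, differ by a factor of fifty:

```python
import random

def simulate(p_event, loss, n, rng):
    """n periods of a simple Bernoulli loss: `loss` with probability p_event, else 0."""
    return sorted(loss if rng.random() < p_event else 0.0 for _ in range(n))

rng = random.Random(7)
n = 100_000
risk_a = simulate(0.50, 2.0, n, rng)    # frequent, small losses
risk_b = simulate(0.01, 100.0, n, rng)  # rare, severe losses

mean_a = sum(risk_a) / n  # ~1.0: "Frequency x Magnitude" for risk A
mean_b = sum(risk_b) / n  # ~1.0: identical for risk B
p995_a = risk_a[int(0.995 * (n - 1))]  # 99.5th percentile loss
p995_b = risk_b[int(0.995 * (n - 1))]
print(mean_a, mean_b, p995_a, p995_b)
```

The multiplication collapses two very different uncertainty profiles into the same point estimate; only the distribution exposes the difference.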
