In our last blog post on Monte Carlo and Loss Data, we described the importance of the Loss Data exercise. A few people have emailed me personally asking for more information on this aspect of risk management, so I have decided to write a post on it.
I will be posting two articles specifically on the loss data side of the risk function. In this post, we look at what comprises a Loss Database and the event management process for administering Loss Data itself. In a second post, I will describe the types of statistical models we can use to analyse the data we capture in our Loss Data repository.
The Loss Database
The Loss Database is one of the most fundamental parts of a risk management service, and it should feature in every sound Credit or Operational risk system. We can only measure the actual outcome of a hazard accurately when we have a well-maintained Loss Database that is continually capturing real loss experience.
Market Risk and Asset Liability Management tend to capture potential exposure in a different way, and consequently they don't use a Loss Database as we know it. In those disciplines, each trade is marked-to-market and thus continually at risk from the volatility of market prices. Additionally, the correlation between the price of any trade and the other deals in the portfolio can create offsetting exposure at the margin, and this correlation needs to be modelled in its own right. These alternate requirements from the market risk camp lead the data modeller towards a positional exposure system rather than a Loss Database facility per se.
All this aside, operational risk is our main focus today, and you can't really quantify exposure without a Loss Database. As I stated in the previous blog [Link], Control Self-Assessments and Scenario Analysis are very powerful tools for estimating downside (or upside, as the case may be), but they don't capture actual outcomes. Real-life experience is normally tracked in the Loss Database.
In a Loss Data facility, the types of information that would be gathered from a single unwanted event are listed below. There is more data to capture than one might first expect, and each field tells us a different story about our historical loss experience.
The Loss Data Field Set
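As a rough sketch of what one of those records might carry, here is a minimal loss event structure in Python. The field names are my own illustrative assumptions, not the exact field set shown in the diagram.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class LossEvent:
    """One row in the Loss Database. Field names are illustrative only."""
    event_id: str
    occurrence_date: date   # when the hazard actually happened
    discovery_date: date    # when it was detected -- often later
    business_unit: str
    product: str
    event_category: str     # e.g. a Basel II level-1 event category
    control_failure: str    # the control that broke down
    gross_loss: float
    recoveries: float = 0.0  # insurance or other recoveries

    @property
    def net_loss(self) -> float:
        return self.gross_loss - self.recoveries

    @property
    def reporting_lag_days(self) -> int:
        # the gap between occurrence and discovery is itself worth reporting on
        return (self.discovery_date - self.occurrence_date).days

evt = LossEvent("OP-2012-001", date(2012, 3, 1), date(2012, 3, 15),
                "Retail Banking", "Mortgages",
                "Execution, Delivery & Process Management",
                "Four-eyes check skipped", 125_000.0, 25_000.0)
```

Note that net loss and reporting lag are derived rather than stored, which keeps the captured fields free of contradictions.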
Across the banking domain, most institutions are mandated to record and report their losses. In other industry sectors, however, the lack of a coherent loss reporting service is perhaps one of the areas of risk management most in need of substantial improvement.
In the past, purchasing an off-the-shelf system for Loss Data management was quite popular, but of late there has been an increasing desire among risk departments to build their own Loss Data management service. I have even seen some outstanding Excel spreadsheets used to track Loss Data, and at the very least, prototyping in Excel can be a very good place to begin when you are conceptualising your new Loss Data service.
Excel, an intranet or even a mobile phone application that tracks losses will inevitably generate a lot of data. If you want to model risk in a consistent manner, then this data needs to be stored in a relational database such as the one I have shown below.
The Loss Relational Database
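A minimal sketch of such a relational layout, using SQLite for illustration; the table and column names here are assumptions, not the actual schema shown in the diagram.

```python
import sqlite3

# Normalised reference tables plus a fact table of loss events.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE business_unit (
    unit_id   INTEGER PRIMARY KEY,
    unit_name TEXT NOT NULL UNIQUE
);
CREATE TABLE event_category (
    category_id   INTEGER PRIMARY KEY,
    category_name TEXT NOT NULL UNIQUE
);
CREATE TABLE loss_event (
    event_id    INTEGER PRIMARY KEY,
    unit_id     INTEGER NOT NULL REFERENCES business_unit(unit_id),
    category_id INTEGER NOT NULL REFERENCES event_category(category_id),
    event_date  TEXT NOT NULL,
    gross_loss  REAL NOT NULL,
    recoveries  REAL NOT NULL DEFAULT 0
);
""")

conn.execute("INSERT INTO business_unit VALUES (1, 'Retail Banking')")
conn.execute("INSERT INTO event_category VALUES (1, 'External Fraud')")
conn.execute("INSERT INTO loss_event VALUES (1, 1, 1, '2012-03-01', 125000, 25000)")

# Net loss rolls up from gross loss minus recoveries.
net, = conn.execute(
    "SELECT SUM(gross_loss - recoveries) FROM loss_event").fetchone()
```

Keeping business units and event categories in their own tables is what later lets reports slice losses consistently across those dimensions.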
The SQL Server database I am showing here supports quite rich Pivot Table reporting, and losses can be represented in the context of business units, products, control failures and event categories. The range of reports we can create is actually very diverse, but I find that sitting down with management and deciding how they wish to see their actual loss experience drives out the best buy-in across the enterprise.
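The kind of cross-tab behind those Pivot Table views can be sketched in a few lines of plain Python; the business units, categories and amounts below are made-up illustrative data, not figures from any real portfolio.

```python
from collections import defaultdict

# Illustrative (unit, event category, net loss) records.
losses = [
    ("Retail Banking", "External Fraud",  12_000.0),
    ("Retail Banking", "Process Failure", 30_000.0),
    ("Treasury",       "External Fraud",   5_000.0),
    ("Treasury",       "Process Failure", 18_000.0),
    ("Retail Banking", "External Fraud",   8_000.0),
]

# Pivot: rows are business units, columns are event categories,
# cells are summed net losses.
pivot = defaultdict(lambda: defaultdict(float))
for unit, category, amount in losses:
    pivot[unit][category] += amount

for unit, row in sorted(pivot.items()):
    print(unit, dict(sorted(row.items())))
```

The same aggregation is what the Pivot Table performs interactively; the point of the relational store is that any pair of dimensions can be crossed this way without re-keying the data.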
Event Management Procedure
Now, a good Loss Data service is not all about data, reporting and modelling. It should also include a policy-like workflow which clearly outlines the roles and activities required of staff to process and resolve a new risk event. For completeness, I have included a simple sample workflow for the resolution of losses. This could be extended to cover other aspects of the operational risk framework, but I have omitted those additional areas from the policy workflow so that we stay contextually relevant.
The Loss Incident Activity Framework
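One simple way to enforce such a workflow in code is a small state machine; the status names and allowed transitions below are assumptions for illustration, not the actual policy steps in the framework diagram.

```python
# Allowed status transitions for a loss incident (illustrative).
TRANSITIONS = {
    "reported":    {"assessed", "rejected"},
    "assessed":    {"remediation"},
    "remediation": {"closed"},
    "closed":      set(),
    "rejected":    set(),
}

class LossIncident:
    def __init__(self, ref: str):
        self.ref = ref
        self.status = "reported"
        self.history = ["reported"]

    def advance(self, new_status: str) -> None:
        """Move the incident forward, refusing any jump the policy forbids."""
        if new_status not in TRANSITIONS[self.status]:
            raise ValueError(
                f"{self.ref}: cannot go {self.status} -> {new_status}")
        self.status = new_status
        self.history.append(new_status)

incident = LossIncident("OP-2012-001")
incident.advance("assessed")
incident.advance("remediation")
incident.advance("closed")
```

The `history` list doubles as an audit trail, which is exactly the sort of evidence a policy workflow needs to produce for review.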
So there we have it, in brief: a Loss Data reporting service for risk management. Here is an additional link which is definitely worth a read:
An internal loss data program for a global organisation [ Link ]
For those practitioners attending the Paris ISO 31000 conference on the 21st & 22nd of May, I look forward to meeting you.