
Wednesday, December 21, 2011

How to use external data in operational risk

Under the Basel II Advanced Measurement Approach for quantifying operational risk losses, banks need to factor external loss data into their internal capital models. The argument is old and has been debated for years, but that hasn't prevented the topic of external data from resurfacing again and again among risk managers in financial institutions.

In this article, we look at key practices for incorporating external loss data into a bank's OpVaR calculations.

Back in the days (around 2003/2004) when Europe, Australia and Japan were building their operational risk capital calculators, most large banks in these jurisdictions selected the Advanced Measurement Approach for operational risk. It was a bold move back then, and even more so when one considered the Basel II paragraph 674 requirement that banks use external loss data in their internal capital calculations. At the time, the 674 demand had neither charter nor source; that is, banks were struggling to find a source of external data and were also at a loss over what to do with it.

Moving forward to 2011, this debate is ongoing. This is not because the Aussies and the Japanese didn't solve the problem; they actually did. External loss data is becoming a fashionable topic for debate this time round because many banks in Asia are considering the move up the Basel II maturity curve or, to be precise, risk teams are looking to take the lead into the Advanced Measurement Approach domain. In addition to what is going on in Asia and the Arabian Gulf, some US banks are now formally considering Basel II in an effort to catch up.

Four Common Methods

Now, the regulators and the Bank for International Settlements never really gave directions on how external loss data was to be used, only that it had to feature in the bank's capital calculations. Years of deliberation have produced four common methods as potential candidates for manipulating and injecting external loss data into a bank's internal capital calculations.

There are many ways to skin a cat, as the saying goes, but in this article we are going to describe just four specific methods.

Figure: Four Methods Explained

[1] Infer Event Data

One of my favourite external data manipulation methods came to me from an associate who heads up the risk team at an Indian bank, and it is perhaps the most creative and possibly the most yielding approach I have taken note of yet. External loss data is used only to fill gaps in the bank's internal loss data matrix, which is very clean because it applies external data directly where it adds value, so to speak (pun intended).

Figure: Infer External Data

In effect, you calculate the correlation relationship between all cells in a stratified external data matrix and then compare those correlation coefficients with the correlation coefficients for cells in the 8x7 internal loss matrix (eight Basel business lines by seven event types). Of course, on the internal data side you only calculate correlation coefficients for cells that have sound data populations.

Nonetheless, in theory, if you know the correlation coefficients between cells in the external 8x7 matrix and you know the differentials between the correlations of cells in the external and internal matrices, then you can bootstrap a loss probability distribution function for cells that have no loss experience in the bank's internal matrix by using inferred internal loss data. It sounds convoluted, but the slide above should simplify this text; a picture, as they say, tells a thousand words, and that couldn't be more true here.
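The gap-filling idea can be sketched in a few lines of Python. Everything here is illustrative: the two cells, the linear blending rule and all figures are invented, and a production model would work across the full 8x7 matrix rather than a single cell pair.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical quarterly loss totals for two cells of the 8x7 (business
# line x event type) matrix. All figures are invented for illustration.
ext_cell_a = rng.lognormal(mean=12.0, sigma=1.0, size=40)     # external, cell A
ext_cell_b = 0.6 * ext_cell_a + rng.lognormal(11.0, 0.8, 40)  # external, cell B (correlated)
int_cell_a = rng.lognormal(mean=10.5, sigma=1.0, size=40)     # internal, cell A (well populated)
# Internal cell B has no loss experience -- this is the gap we want to fill.

# Step 1: correlation between cells A and B in the external matrix.
rho_ext = np.corrcoef(ext_cell_a, ext_cell_b)[0, 1]

# Step 2: scale factor between internal and external experience in cell A.
scale = int_cell_a.mean() / ext_cell_a.mean()

# Step 3: bootstrap an inferred internal loss sample for cell B by resampling
# the external cell B, rescaling it, and blending toward the internal cell A
# profile in proportion to the external correlation (one simple choice).
n_boot = 10_000
draws_b = rng.choice(ext_cell_b, size=n_boot, replace=True) * scale
draws_a = rng.choice(int_cell_a, size=n_boot, replace=True)
inferred_b = rho_ext * draws_b + (1 - rho_ext) * draws_a

print(f"external corr(A,B) = {rho_ext:.2f}")
print(f"inferred internal cell B mean = {inferred_b.mean():,.0f}")
```

The blending rule used here is only one possible inference step; the point is that the external correlation structure, not the external losses themselves, drives what lands in the empty internal cell.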

[2] Apply Scenario Analysis

Our next option is really straightforward, and its simplicity is part of its downfall: it pushes much of the modelling effort onto staff across the bank's business units, and very few banks have engaged this approach in a coherent or successful way.

So how does it work?

What the bank does is capture extreme event information and present it in scaled and non-scaled formats to staff for inclusion in the bank's scenario analysis program. In short, the op risk team lists external events, their causes, outcomes and growth functions, and then presents this information to managers in a framed way so that the managers can assess and adjust these outcomes to suit potential scenarios for an internal case.
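A minimal sketch of the scaled and non-scaled presentation, assuming a simple linear revenue-based growth function; the event list, revenue figures and scaling rule are all invented for illustration:

```python
# Hypothetical external events presented to business managers.
# Each entry: (description, peer loss in USD m, peer annual revenue in USD bn)
external_events = [
    ("Rogue trading loss", 450.0, 22.0),
    ("Data-centre outage", 35.0, 8.5),
    ("Mis-selling settlement", 120.0, 14.0),
]

OUR_REVENUE_BN = 5.0  # our bank's annual revenue (assumed figure)

def scale_event(loss_m: float, peer_revenue_bn: float,
                our_revenue_bn: float = OUR_REVENUE_BN) -> float:
    """Linearly scale a peer loss by relative firm size.

    This is one simple choice of growth function; sub-linear power
    laws are also common in practice.
    """
    return loss_m * (our_revenue_bn / peer_revenue_bn)

# Present both the raw (non-scaled) and scaled figures side by side.
for name, loss, revenue in external_events:
    print(f"{name}: raw {loss:,.0f}m, scaled {scale_event(loss, revenue):,.1f}m")
```

Managers would then adjust the scaled outcomes up or down to reflect how plausible the scenario is for their own business line.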

[3] Benchmark Hill Shape

The hill shape approach comes to us from the world of extreme value theory and fits well with banks that have focused heavily on the LDA (Loss Distribution Approach) for their capital modelling. Believe it or not, the Hill index is a consistent estimator for fat-tailed distributions [Mason-1982] and is itself asymptotically normally distributed over a set of ordered statistics. The Hill index relies on the spacings between extreme observations to extrapolate the behaviour of the tail beyond the broader part of the distribution, and so becomes a good measure of tail loss.
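The Hill estimator itself is only a few lines. A sketch, checked against a simulated Pareto sample whose tail index is known in advance (the sample and the choice of k are illustrative):

```python
import numpy as np

def hill_estimator(losses: np.ndarray, k: int) -> float:
    """Hill estimate of the tail from the k largest order statistics.

    H_k = (1/k) * sum_{i=1..k} [ln X_(i) - ln X_(k+1)],
    where X_(1) >= X_(2) >= ... are the losses in descending order.
    The tail index alpha is approximately 1 / H_k.
    """
    x = np.sort(losses)[::-1]   # descending order statistics
    return float(np.mean(np.log(x[:k]) - np.log(x[k])))

# Illustrative check on a Pareto sample with alpha = 2, so H should be ~0.5.
rng = np.random.default_rng(0)
sample = rng.pareto(2.0, size=50_000) + 1.0   # classical Pareto(2) on [1, inf)
h = hill_estimator(sample, k=500)
print(f"Hill estimate: {h:.3f}  (theoretical 1/alpha = 0.5)")
```

In the benchmarking exercise, the same function would be run on the internal and the external tail observations and the two estimates compared.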

This approach allows the risk analyst to compare internal loss experience at extreme positions with the same extreme positions in external data, and most banks will carry out this exercise. However, the approach doesn't resolve many problems. Firstly, it doesn't correct missing data points in the internal loss data matrix, and secondly, it rests on the assumption that both data sets (internal and external) are derived from similar distributions. That may not be the case.

Given all of this, it is an interesting exercise to carry out at an aggregated level, and comparing the Hill estimate (assuming it is stable) between different organisations to assess how well the bank fits an industry norm is probably a worthwhile activity to entertain.

[4] Stratification and Smoothing: a pooling process

Perhaps the most common method selected by banks is the one recommended by Nicolas Baud, Antoine Frachot and Thierry Roncalli. This is partly because they carried out this exercise so long ago that it has become something of an accepted practice for using external loss data in operational risk capital calculations.

Pooling external and internal data naively does tend to lead to unacceptably high capital charges, as one would expect, but mostly this is because external data is generally skewed towards larger losses.

Baud, Frachot and Roncalli developed a statistical methodology to ensure that merging internal and external data leads to unbiased estimates of the loss distributions, and their paper shows how this methodology can be applied in a bank.
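The core of their idea is that external losses are only reported above a threshold, so the external contribution to the likelihood must be conditioned on exceeding that threshold; otherwise the fitted severity is biased upwards. A sketch under assumed lognormal severities, with an invented threshold and invented parameters (`MU`, `SIGMA`, `H`):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import lognorm, norm

rng = np.random.default_rng(1)
MU, SIGMA, H = 10.0, 2.0, 1_000_000.0   # true parameters and reporting threshold

internal = rng.lognormal(MU, SIGMA, 300)      # complete internal loss record
ext_raw = rng.lognormal(MU, SIGMA, 5_000)
external = ext_raw[ext_raw > H]               # only large losses get reported

def neg_log_lik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    # Complete-data likelihood for internal losses ...
    ll = lognorm.logpdf(internal, s=sigma, scale=np.exp(mu)).sum()
    # ... plus a truncated likelihood for external losses: each density is
    # divided by P(X > H), i.e. we subtract log P(X > H) in log space.
    log_sf = norm.logsf((np.log(H) - mu) / sigma)
    ll += (lognorm.logpdf(external, s=sigma, scale=np.exp(mu)) - log_sf).sum()
    return -ll

fit = minimize(neg_log_lik, x0=[12.0, 1.5], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x
print(f"fitted mu={mu_hat:.2f}, sigma={sigma_hat:.2f} (true {MU}, {SIGMA})")
```

The fitted parameters land near the true ones even though the external sample contains only the largest losses, which is exactly the bias the naive pooling approach fails to remove.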


Author

Martin Davies is a risk framework architect with strong domain knowledge across a diverse set of risk fraternities, a background in banking front-to-back and the ability to articulate business requirements into functional information technology concepts. He is focused on structured products for emerging markets and works with several tier one banks, regulators and brokerages across South East Asia.