25.07.2010

Scaling the severity of events belonging to different periods

The challenge
The Basel Committee requires banks to base their operational risk measure on a minimum observation period of 5 years[1]. However, when we use operational risk loss data, there may be some doubt about the stationarity of the time series of losses. Can we really say that the loss data are stationary, that the average loss is stable, or that the volatility (variance) of losses will remain constant? If that is not the case, then the choice of the severity distribution, the values of its parameters and, finally, the value-at-risk results will depend on whether the sample happens to cover a period of increased or decreased volatility or mean. In such a situation one may strongly argue against the usefulness and predictive value of the model, and the supervisory authority may even reject the operational risk measure. On the other hand, we have to take into consideration that extreme losses happen rarely: in a bank's database the number of losses belonging to the potential tail of the severity distribution is fairly limited, perhaps even to 1-2 observations. In such a case we may observe apparent non-stationarity caused only by the incidental timing of a few extreme events.

In other words, it is extremely important to take into account changes in the potential severity of losses over time, but without eliminating or smoothing extreme observations.

The incorrect solutions
The first idea is to scale observations with the help of the CPI, as proposed in numerous software packages. This solution is completely silly. In countries that are not suffering severe inflation, there is no reason to see a link between changes in the consumer price index and the rate of change of severity damages. One cannot seriously argue that the average damages related to frauds, employment practices, damages to physical assets, business disruption and system failures all change in the same way and at the same rate. Damages to physical assets may be affected by inflation, but not necessarily by the consumer price index, as a good part of these assets are bought not by consumers but by enterprises. Losses caused by unlawful employment practices may also change due to changes in the law, or at least in the interpretation of the law by the courts. In any case, a CPI deflator will probably have a negligible influence on the scaling in the majority of developed countries.

The second idea is to make a different adjustment for each event type or business line.
A very simple approach is to calculate the mean for each time period and to rescale each observation in the following way, so that all periods share the same mean:

$$x^{*}_{i,T-t} = x_{i,T-t}\,\frac{\bar{x}_{T}}{\bar{x}_{T-t}} \qquad (1)$$

where:

- $x^{*}_{i,T-t}$ represents the rescaled loss number “i” for period T-t, adjusted for differences in the mean

- $x_{i,T-t}$ represents the observed loss number “i” for period T-t

- $\bar{x}_{T-t}$ represents the average loss for period T-t

- $\bar{x}_{T}$ represents the average loss for the last available period T



After that, it is also possible to rescale again in order to eliminate changes in the standard deviation.
Proceed the following way:

$$x^{**}_{i,T-t} = \bar{x}_{T} + \left(x^{*}_{i,T-t} - \bar{x}_{T}\right)\frac{\sigma_{T}}{\sigma^{*}_{T-t}} \qquad (2)$$

where:

- $x^{*}_{i,T-t}$ represents the rescaled loss number “i” for period T-t, adjusted for differences in the mean

- $\bar{x}_{T}$ represents the average loss for the last available period T

- $\sigma^{*}_{T-t}$ represents the standard deviation of the rescaled loss data for period T-t

- $\sigma_{T}$ represents the standard deviation of the last-period loss data

- $x^{**}_{i,T-t}$ represents the rescaled loss number “i” for period T-t, adjusted for differences in the variance
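To make the mechanics concrete, here is a minimal Python sketch of the rescaling in equations (1) and (2); the function names and the example figures are illustrative, not part of the original method.

```python
import numpy as np

def rescale_mean(losses, mean_last):
    """Equation (1): scale losses so their mean equals that of the last period."""
    return losses * mean_last / losses.mean()

def rescale_variance(losses_star, mean_last, std_last):
    """Equation (2): stretch the mean-rescaled losses around the common mean
    so their standard deviation equals that of the last period."""
    return mean_last + (losses_star - mean_last) * std_last / losses_star.std(ddof=1)

# Illustrative data: an older period T-t and the last period T
losses_old = np.array([120.0, 450.0, 80.0, 2000.0, 310.0])
losses_last = np.array([150.0, 520.0, 95.0, 400.0, 260.0, 700.0])

x_star = rescale_mean(losses_old, losses_last.mean())
x_double_star = rescale_variance(x_star, losses_last.mean(),
                                 losses_last.std(ddof=1))
# Note: when dispersions differ strongly, equation (2) can even turn small
# losses into negative values, one symptom of the problems discussed below.
```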


This approach can have a dramatic influence on the results, depending on how the means and variances of the different periods compare with the mean and variance of the last period. If the mean (or variance) of the last period is the highest, the scaling will probably cause an overstatement of the VaR. Conversely, if the mean (or variance) of the last period is the lowest, one may expect an unjustified reduction of the VaR. The reason is that, depending on the situation, the influence of extreme events is either amplified or dampened.

A second issue is that this approach does not work well with truncated data: if the truncation point changes over time, it may introduce a serious bias into the calculation.


The partially correct solution
The idea of making a different adjustment for each event type or business line can be improved if, instead of scaling means, one uses more robust estimators of the location or dispersion of the data.

A simple way is, for instance, to replace the mean by the median:

$$x^{*}_{i,T-t} = x_{i,T-t}\,\frac{m_{T}}{m_{T-t}} \qquad (3)$$

where:

- $x^{*}_{i,T-t}$ represents the rescaled loss number “i” for period T-t

- $x_{i,T-t}$ represents the observed loss number “i” for period T-t

- $m_{T-t}$ represents the median loss for period T-t

- $m_{T}$ represents the median loss for the last available period T
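A corresponding sketch for the median rescaling of equation (3); the last-period median used here is a hypothetical value.

```python
import numpy as np

def rescale_median(losses, median_last):
    """Equation (3): scale losses so their median equals that of the last period."""
    return losses * median_last / np.median(losses)

losses_old = np.array([120.0, 450.0, 80.0, 2000.0, 310.0])
x_star = rescale_median(losses_old, median_last=290.0)  # hypothetical last-period median
```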



This solution yields data which at least share a common median across all periods, and the influence of extreme events is neither minimized nor maximized by the transformation. However, one cannot guarantee that the variance is stable, and truncation may still produce misleading results.


The proposed approach
The correct approach is not to scale the operational loss data but to scale the parameters describing the operational environment. Suppose that we consider losses related to the process of banking settlements. What obviously affects the amount of an individual loss is the amount of the transaction (improperly) settled. By collecting data on the environment (in our case, the amount settled, transaction by transaction) we can build a database of potential losses related to settlements. As most transactions are settled without difficulty, it is possible to obtain thousands of observations per period. In this way we are not constrained by the absence of extreme events in some periods and their presence in others. On the contrary, we may argue that the number of very large settlements should be more or less constant across the different periods. Calculating the average value and the standard deviation of settlements for each period, as done in equations (1) and (2), allows us to rescale these data. In order to obtain loss data, one must make an additional assumption about the stability of the loss relative to the amount of the transaction, and then calculate the probability of a loss and the amount of the loss relative to the amount of the transaction.[2]

In order to obtain a parameter relating the size of the loss to the size of the transaction, simply calculate:

$$k_{T-t} = \frac{\bar{x}_{T-t}}{\bar{v}_{T-t}} \qquad (4)$$

where:

- $\bar{x}_{T-t}$ represents the average loss for period T-t

- $\bar{v}_{T-t}$ represents the average size of the transactions which end up with a loss in period T-t




The probability of a loss for an individual transaction is:

$$p_{T-t} = \frac{n_{T-t}}{N_{T-t}} \qquad (5)$$

where:

- $n_{T-t}$ represents the number of losses for period T-t

- $N_{T-t}$ represents the number of transactions for period T-t
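A sketch of estimating the two quantities of equations (4) and (5) from one period of settlement data; all variable names and figures are illustrative assumptions.

```python
import numpy as np

# Illustrative period T-t: volumes of all settled transactions, plus the
# volumes and amounts of the transactions that ended in a loss.
volumes_all = np.array([10_000.0, 55_000.0, 2_500.0, 120_000.0, 8_000.0, 30_000.0])
volumes_lossy = np.array([55_000.0, 8_000.0])
loss_amounts = np.array([1_100.0, 240.0])

# Equation (4): average loss relative to the average lossy transaction size
k = loss_amounts.mean() / volumes_lossy.mean()

# Equation (5): probability that a settled transaction produces a loss
p = len(loss_amounts) / len(volumes_all)
```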


In order to rescale the transactions, we also have to calculate:

$$v^{*}_{i,T-t} = v_{i,T-t}\,\frac{\bar{v}_{T}}{\bar{v}_{T-t}} \qquad (6)$$

$$v^{**}_{i,T-t} = \bar{v}_{T} + \left(v^{*}_{i,T-t} - \bar{v}_{T}\right)\frac{s_{T}}{s^{*}_{T-t}} \qquad (7)$$

where:

- $v^{*}_{i,T-t}$ represents the rescaled volume of transaction number “i” for period T-t, adjusted for differences in the mean

- $v_{i,T-t}$ represents the observed volume of transaction number “i” for period T-t

- $\bar{v}_{T-t}$ represents the average transaction volume for period T-t

- $\bar{v}_{T}$ represents the average transaction volume for the last available period T

- $s^{*}_{T-t}$ represents the standard deviation of the rescaled transactions for period T-t

- $s_{T}$ represents the standard deviation of the transactions for period T

- $v^{**}_{i,T-t}$ represents the rescaled volume of transaction number “i” for period T-t, adjusted for differences in the variance
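Since equations (6) and (7) have exactly the same form as equations (1) and (2), the same rescaling applies directly to transaction volumes; a self-contained sketch with illustrative figures:

```python
import numpy as np

volumes_old = np.array([12_000.0, 40_000.0, 3_000.0, 90_000.0, 25_000.0])
volumes_last = np.array([15_000.0, 52_000.0, 2_800.0, 70_000.0, 33_000.0])

# Equation (6): match the last-period mean volume
v_star = volumes_old * volumes_last.mean() / volumes_old.mean()

# Equation (7): match the last-period standard deviation around the common mean
v_double_star = (volumes_last.mean()
                 + (v_star - volumes_last.mean())
                 * volumes_last.std(ddof=1) / v_star.std(ddof=1))
```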


Now we can transform our rescaled transactions into potential losses:

$$\ell_{i,T-t} = k \cdot v^{**}_{i,T-t} \qquad (8)$$

where $k$ is the loss-to-transaction ratio of equation (4), assumed stable over time.
In order to complete a Monte Carlo simulation we also need to assess the frequency. If we assess that the frequency may be modelled by a Poisson distribution with parameter λ, we may estimate it, for instance, from the pooled loss probability:

$$\lambda = \frac{\sum_{t} n_{T-t}}{\sum_{t} N_{T-t}} \cdot N_{T} \qquad (9)$$

which permits us to model the frequency using the entire historical period.
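Putting the pieces together, here is a sketch of the Monte Carlo step: severities are drawn from the pool of rescaled transaction volumes converted to losses via equation (8), frequency comes from a Poisson distribution as in equation (9), and the 99.9% quantile of the aggregate loss is read off as the VaR. All input figures below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_total_losses(potential_volumes, k, lam, n_sims=100_000):
    """Aggregate-loss distribution: Poisson frequency, severities drawn from
    rescaled transaction volumes converted to losses via k (equation 8)."""
    totals = np.zeros(n_sims)
    counts = rng.poisson(lam, size=n_sims)
    for j, n_events in enumerate(counts):
        if n_events > 0:
            drawn = rng.choice(potential_volumes, size=n_events, replace=True)
            totals[j] = (k * drawn).sum()
    return totals

# Hypothetical inputs: a pool of rescaled volumes, the loss-to-transaction
# ratio k from equation (4), and the Poisson intensity from equation (9).
potential_volumes = rng.lognormal(mean=10.0, sigma=1.2, size=5_000)
k = 0.02
lam = 12.0

totals = simulate_total_losses(potential_volumes, k, lam)
var_999 = np.quantile(totals, 0.999)   # operational risk VaR at the 99.9% level
print(f"Simulated 99.9% VaR: {var_999:,.0f}")
```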

A refinement of the model is to calculate not the current potential losses given that an operational risk event has occurred (as in equation (8)), but the next-period potential losses given that an operational risk event has occurred, taking into account changes in frequency or severity. This modification requires modelling the changes over time of the mean transaction size and of the loss frequency, for instance through a time-series approach.


[1] Directive 2006/48/EC of the European Parliament and of the Council of 14 June 2006 relating to the taking up and pursuit of the business of credit institutions (recast).

[2] Can we modify Loss Distribution Approach (LDA) in order to permit to any bank to model operational risk?
