**1. Introduction**

Financial institutions need to manage financial losses carefully. For example, the claims made against short-term insurance policies need to be analysed to enable an insurance company to determine the reserves needed to meet its obligations and to assess the adequacy of its pricing strategies. Similarly, banks are required by regulation to set aside risk capital to absorb unexpected losses that may occur. Of course, financial institutions are more interested in the total amount of claims, or the aggregate loss, occurring over one year in the future, than in the individual claims or losses. For this reason, their focus will be on what may happen in the year ahead rather than on what has happened in the past. Popular modelling methods involve the construction of annual aggregate claim or loss distributions using the so-called loss distribution approach (LDA) or random sums method.

Such a distribution is assumed to be an adequate reflection of the past, but needs to be forward-looking in the sense that anticipated future losses are taken into account. The constructed distribution may then be used to answer questions like 'What aggregate loss level will be exceeded only once in *c* years?', 'What is the expected annual aggregate loss level?' or 'If we want to guard ourselves against a one-in-a-thousand-year aggregate loss, how much capital should we hold next year?' The aggregate loss distribution and its quantiles provide answers to these questions, and it is therefore paramount that this distribution is modelled and estimated as accurately as possible. Often it is the extreme quantiles of this distribution that are of interest.

Assume that the number of loss events occurring in one year is a random variable *N* that follows a Poisson distribution with parameter *λ*, i.e. *N* ~ *Poi*(*λ*). Note that one could use other frequency distributions, such as the negative binomial, but we found that the Poisson is by far the most popular in practice since it fits the data well. Furthermore, assume that the random variables *X*<sub>1</sub>, … , *X<sub>N</sub>* denote the loss severities of these loss events and that they are independently and identically distributed according to a severity distribution *T*, i.e. *X*<sub>1</sub>, … , *X<sub>N</sub>* ~ iid *T*. Then the annual aggregate loss is *A* = ∑<sub>*n*=1</sub><sup>*N*</sup> *X<sub>n</sub>*, and the distribution of *A* is the aggregate loss distribution, which is a compound Poisson distribution that depends on *λ* and *T* and is denoted by *CoP*(*λ*, *T*). Of course, in practice we do not know *T* and *λ* and have to estimate them. First we have to decide on a model for *T*, which can be a class of distributions *F*(*x*, *θ*). Then *θ* and *λ* have to be estimated statistically from the data.

The compound Poisson distribution *CoP*(*λ*, *T*) and its VaR are difficult to calculate analytically, so that in practice Monte Carlo (MC) simulation is often used. This is done by generating *N* according to the assumed frequency distribution, then generating *X*<sub>1</sub>, … , *X<sub>N</sub>* independently and identically distributed according to the true severity distribution *T*, and calculating *A* = ∑<sub>*n*=1</sub><sup>*N*</sup> *X<sub>n</sub>*. This process is repeated *I* times independently to obtain *A<sub>i</sub>*, *i* = 1, 2, … , *I*, and the 99.9% VaR is then approximated by *A*<sub>([0.999·*I*]+1)</sub>, where *A*<sub>(*i*)</sub> denotes the *i*-th order statistic and [*k*] the largest integer contained in *k*. Note that three input items are required to perform this, namely the number of repetitions *I* as well as the frequency and loss severity distributions. The number of repetitions determines the accuracy of the approximation: the larger it is, the higher the accuracy.

In order to illustrate the Monte Carlo approximation method, we assume that the Burr is the true underlying severity distribution and we use six parameter sets corresponding to extreme value indices (EVI) of 0.33, 0.83, 1.00, 1.33, 1.85 and 2.35, as indicated in **Table 1** below. See Appendix A for a discussion of the characteristics of this distribution and its properties. We take the number of repetitions as *I* = 1 000 000 and repeat the calculation of VaR 1000 times. The 90% band containing the VaR values is shown in **Figure 1** below. Here the lower (upper) bound has been determined as the 5% (95%) percentile of the 1000 VaR values, divided by their median, minus 1. In mathematical terms the 90% band is defined as

[*VaR*<sub>(51)</sub>/*Median*(*VaR*<sub>1</sub>, … , *VaR*<sub>1000</sub>) − 1, *VaR*<sub>(951)</sub>/*Median*(*VaR*<sub>1</sub>, … , *VaR*<sub>1000</sub>) − 1],

where *VaR*<sub>(*k*)</sub> denotes the *k*-th order statistic. From **Figure 1** it is clear that the spread, as measured by the 90% band, declines with increasing *λ*, but increases with increasing EVI.

**Table 1.**
*Parameter sets of Burr distribution.*

| *η* | *α* | *τ* | EVI |
|------|------|------|------|
| 1.00 | 5.00 | 0.60 | 0.33 |
| 1.00 | 2.00 | 0.60 | 0.83 |
| 1.00 | 1.00 | 1.00 | 1.00 |
| 1.00 | 1.50 | 0.50 | 1.33 |
| 1.00 | 0.30 | 1.80 | 1.85 |
| 1.00 | 0.17 | 2.50 | 2.35 |

In principle, infinitely many repetitions are required to get the exact true VaR. The large number of simulation repetitions involved in the MC approach above motivates the use of other numerical methods such as Panjer recursion, methods based on fast Fourier transforms [5] and the single loss approximation (SLA) method (see e.g. [6]). For a detailed comparison of numerical approximation

*Construction of Forward-Looking Distributions Using Limited Historical Data and Scenario…*
*DOI: http://dx.doi.org/10.5772/intechopen.93722*
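The MC procedure and the 90% band described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the authors' code: the function names (`burr_quantile`, `mc_var`, `band_90`) are ours, the Burr quantile assumes the standard Burr Type XII parameterisation *F*(*x*) = 1 − (1 + (*x*/*η*)<sup>*τ*</sup>)<sup>−*α*</sup> (for which EVI = 1/(*ατ*), consistent with Table 1), and the first-order single loss approximation VaR<sub>*a*</sub> ≈ *T*<sup>−1</sup>(1 − (1 − *a*)/*λ*) shown for comparison is the usual heavy-tail approximation.

```python
# Sketch: MC approximation of the 99.9% VaR of a compound Poisson aggregate
# loss with Burr Type XII severities, F(x) = 1 - (1 + (x/eta)**tau)**(-alpha).
import numpy as np

def burr_quantile(u, eta=1.0, alpha=1.0, tau=1.0):
    """Inverse CDF of the assumed Burr XII parameterisation (EVI = 1/(alpha*tau))."""
    return eta * ((1.0 - u) ** (-1.0 / alpha) - 1.0) ** (1.0 / tau)

def mc_var(lam, I=100_000, level=0.999, seed=None, **burr):
    """Simulate I annual aggregates A = X_1 + ... + X_N, N ~ Poi(lam),
    and return the order-statistic estimate A_([level*I]+1)."""
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, size=I)                      # N for each simulated year
    sev = burr_quantile(rng.random(counts.sum()), **burr)  # all severities at once
    year = np.repeat(np.arange(I), counts)                 # which year each loss belongs to
    agg = np.bincount(year, weights=sev, minlength=I)      # A_1, ..., A_I
    agg.sort()
    return agg[int(round(level * I))]  # 1-based rank [level*I]+1 -> 0-based [level*I]

def band_90(var_estimates):
    """90% band of 1000 repeated VaR estimates:
    [VaR_(51)/median - 1, VaR_(951)/median - 1]."""
    v = np.sort(np.asarray(var_estimates, dtype=float))
    med = np.median(v)
    return v[50] / med - 1.0, v[950] / med - 1.0

# Table 1, first parameter set: eta=1, alpha=5, tau=0.6 (EVI = 0.33); lambda = 10.
var_mc = mc_var(lam=10.0, I=200_000, seed=1, eta=1.0, alpha=5.0, tau=0.6)

# First-order single loss approximation for comparison.
var_sla = burr_quantile(1.0 - (1.0 - 0.999) / 10.0, eta=1.0, alpha=5.0, tau=0.6)
print(f"MC VaR ~ {var_mc:.2f}, SLA ~ {var_sla:.2f}")
```

In a full study one would call `mc_var` 1000 times per parameter set and pass the resulting estimates to `band_90`; with *I* = 1 000 000 this becomes computationally heavy, which is precisely the motivation given above for Panjer recursion, FFT-based methods and the SLA.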

Under Basel II's advanced measurement approach, banks may use their own internal models to calculate their operational risk capital, and the LDA is known to be a popular method for this. A bank must be able to demonstrate that its approach captures potentially severe 'tail' events, and it must hold capital to protect itself against a one-in-a-thousand-year aggregate loss. To determine this capital amount, the 99.9% Value-at-Risk (VaR) of the aggregate distribution is calculated [1]. In order to estimate a one-in-a-thousand-year loss, one would hope that at least a thousand years of historical data are available. However, in reality only between five and ten years of internal data are available, and scenario assessments by experts are often used to augment the historical data and to provide a forward-looking view.

The much-anticipated implementation of Basel III will require banks to calculate operational risk capital under a new standardised approach, which is simple, risk-sensitive and comparable between different banks [2]. Although the more sophisticated internal models described above will no longer be allowed in determining minimum regulatory capital, these models will remain relevant for the determination of economic capital and for decision making within banks and other financial institutions. It has also been suggested that LDA models would form an integral part of the supervisory review of a bank's internal operational risk management process [3]. For this reason, we believe the LDA remains relevant and will continue to be studied and improved upon.

In this chapter we provide an exposition of statistical methods that may be used to estimate VaR using historical data in combination with quantile assessments by experts. The proposed approach has been discussed and studied elsewhere (see [4]), but specifically in the context of operational risk and economic capital estimation. In this chapter we concentrate on the estimation of the VaR of the aggregate loss or claims distribution and strive to make the approach accessible to a wider audience. Also, based on implementations carried out for major banks, we include some practical guidelines for the use of the method in practice. In the next section we discuss two approaches, Monte Carlo simulation and the single loss approximation, that may be used to approximate VaR assuming known distributions and parameters. Then, in the third section (Historical data and scenario modelling), we discuss the available sources of data and formulate the scenario approach, including how scenarios may be created and assessed by experts. This is followed, in section four (Estimating VaR), by the estimation of VaR using three modelling approaches. In the fifth section (Implementation recommendations) some guidelines on the implementation of the preferred approach are given. Some concluding remarks are made in the last section.
