*Probability, Combinatorics and Control*
*From Asymptotic Normality to Heavy-Tailedness via Limit Theorems for Random Sums…*
*DOI: http://dx.doi.org/10.5772/intechopen.89659*

In classical problems of mathematical statistics, the size of the available sample, that is, the number of available observations, is traditionally assumed to be deterministic. In the asymptotic settings, it plays the role of an infinitely increasing *known* parameter. At the same time, in practice the data to be analyzed are very often collected or registered during a certain period of time, and the flow of informative events, each of which brings a new observation, forms a random point process. Therefore, the number of available observations is unknown until the end of the registration process and must also be treated as a (random) observation. For example, this is so in insurance statistics, where different numbers of insurance events (insurance claims and/or insurance contracts) occur during different accounting periods, and in high-frequency financial statistics, where the number of events in a limit order book during a time unit essentially depends on the intensity of order flows. Moreover, contemporary statistical procedures of insurance and …

…results of B. Mandelbrot and others, who proposed, instead of the standard central limit theorem, to use reasoning based on limit theorems for sums of random summands with infinite variances (see, e.g., [4]), resulting in non-normal stable laws as heavy-tailed models of the distributions of experimental data. However, first, in most cases the key assumption of this approach, the infiniteness of the variances of the elementary summands, can hardly be believed to hold in practice; and second, although more heavy-tailed than the normal law, the real distributions often turn out to be more light-tailed than the stable laws.

In this work, in order to give a more realistic explanation of the observed nonnormality of the distributions of real data, an alternative approach is developed, based on limit theorems for statistics constructed from samples with random sizes. Within this approach, it becomes possible to obtain arbitrarily heavy tails of the data distributions without assuming the non-existence of the moments of the observed characteristics.

This work was inspired by the publication of the paper [5] in which, based on the results of [6], a particular case of random sums was considered. One more reason for writing this work was the recent publication [7], the authors of which reproduced some results of [8, 9] without citing these earlier papers.

Here we give a more general description of the transformation of the limit distribution of a sum of independent random variables, or of another statistic (i.e., of a measurable function of a sample), under the replacement of the non-random number of summands or the sample size by a random variable. General limit theorems are proved (Section 3). Section 4 contains some comments on the heavy-tailedness of scale mixtures of normal distributions. As examples of the application of the general theorems, conditions are presented for the convergence of the distributions of random sums of independent random vectors *with finite covariance matrices* to multivariate elliptically contoured stable and Linnik distributions (Section 5). Also, conditions are presented for the convergence of the distributions of asymptotically normal (in the traditional sense) statistics to multivariate Student distributions (Section 6).

In Section 7, the joint asymptotic behavior of sample quantiles is considered. In applied research related to risk analysis, the characteristic VaR (Value-at-Risk) is very popular. Formally, VaR is a certain quantile of the observed risky value. Therefore, the joint asymptotic behavior of sample quantiles in samples with random sizes is considered in detail in Section 7 as one more example of the application of the general theorem proved in Section 3. In this section, we show how the proposed technique can be applied to the continuous-time case, assuming that the sample size increases in time following a Cox process. One more interpretation of this setting is related to the important case where the sample size has a mixed Poisson distribution.

**2. Notation and definitions: auxiliary results**

Let *r* ∈ ℕ. We will consider random elements taking values in the *r*-dimensional Euclidean space ℝ*<sup>r</sup>*.

Assume that all the random variables and random vectors are defined on one and the same probability space (Ω, A, P). By the measurability of a random field we will mean its measurability as a function of two variates, an elementary outcome and a parameter, with respect to the Cartesian product of the *σ*-algebra A and the Borel *σ*-algebra B(ℝ*<sup>r</sup>*) of subsets of ℝ*<sup>r</sup>*.



The distribution of a random vector *ξ* with respect to the measure P will be denoted L(*ξ*). Weak convergence, coincidence of distributions, and convergence in probability with respect to a specified probability measure will be denoted by the symbols ⇒, =<sup>*d*</sup>, and →<sup>*P*</sup>, respectively.

Let Σ be a positive definite matrix. The normal distribution in ℝ*<sup>r</sup>* with zero vector of expectations and covariance matrix Σ will be denoted Φ<sub>Σ</sub>. This distribution is defined by its density

$$\phi(\mathbf{x}) = \frac{\exp\left\{-\frac{1}{2}\mathbf{x}^{\top}\Sigma^{-1}\mathbf{x}\right\}}{\left(2\pi\right)^{r/2}|\Sigma|^{1/2}}, \qquad \mathbf{x} \in \mathbb{R}^r.\tag{1}$$
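As a quick numerical sanity check of this density (not part of the original text; it assumes SciPy is available and uses an arbitrary 2×2 covariance matrix chosen for illustration), one can compare the formula with a library implementation:

```python
import numpy as np
from scipy.stats import multivariate_normal

# an arbitrary positive definite covariance matrix (illustrative choice)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
x = np.array([0.4, -1.2])
r = 2

# the density phi(x) computed directly from the displayed formula
quad = x @ np.linalg.inv(Sigma) @ x
phi = np.exp(-0.5 * quad) / ((2.0 * np.pi) ** (r / 2) * np.linalg.det(Sigma) ** 0.5)

# reference value from SciPy's implementation of the same density
ref = multivariate_normal(mean=np.zeros(r), cov=Sigma).pdf(x)
```

The two values should agree up to floating-point error, which confirms the normalizing constant (2π)<sup>*r*/2</sup>|Σ|<sup>1/2</sup>.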

The characteristic function f<sup>*Y*</sup>(**t**) of a random vector *Y* such that L(*Y*) = Φ<sub>Σ</sub> has the form

$$\mathbf{f}^{Y}(\mathbf{t}) \equiv \mathbf{E} \exp\left\{i\mathbf{t}^{\top}Y\right\} = \exp\left\{-\frac{1}{2}\mathbf{t}^{\top}\boldsymbol{\Sigma}\mathbf{t}\right\}, \quad \mathbf{t} \in \mathbb{R}^{r}.\tag{2}$$
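Identity (2) is easy to verify by simulation (a sketch with an arbitrary Σ and a fixed point **t**; Monte Carlo accuracy here is a few parts per thousand):

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
t = np.array([0.3, -0.7])

# empirical characteristic function E exp{i t'Y} for Y ~ N(0, Sigma)
Y = rng.multivariate_normal(np.zeros(2), Sigma, size=200_000)
empirical = np.exp(1j * (Y @ t)).mean()

# right-hand side of (2)
theoretical = np.exp(-0.5 * t @ Sigma @ t)
```

The imaginary part of the empirical average vanishes asymptotically because the distribution is symmetric about the origin.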

Consider a sequence {*S<sub>n</sub>*}<sub>*n*≥1</sub> of random elements taking values in ℝ*<sup>r</sup>*. Let Ξ(ℝ*<sup>r</sup>*) be the set of all nonsingular linear operators acting from ℝ*<sup>r</sup>* to ℝ*<sup>r</sup>*. The identity operator acting from ℝ*<sup>r</sup>* to ℝ*<sup>r</sup>* will be denoted *I<sub>r</sub>*. Assume that there exist sequences {*B<sub>n</sub>*}<sub>*n*≥1</sub> of operators from Ξ(ℝ*<sup>r</sup>*) and {*a<sub>n</sub>*}<sub>*n*≥1</sub> of elements from ℝ*<sup>r</sup>* such that

$$Y_n \equiv B_n^{-1}(S_n - a_n) \Rightarrow Y \quad (n \to \infty), \tag{3}$$

where *Y* is a random element whose distribution with respect to P will be denoted *H*, *H* = L(*Y*).

Along with {*S<sub>n</sub>*}<sub>*n*≥1</sub>, consider a sequence of integer-valued positive random variables {*N<sub>n</sub>*}<sub>*n*≥1</sub> such that for each *n* ≥ 1 the random variable *N<sub>n</sub>* is independent of the sequence {*S<sub>k</sub>*}<sub>*k*≥1</sub>. Let *c<sub>n</sub>* ∈ ℝ*<sup>r</sup>*, *D<sub>n</sub>* ∈ Ξ(ℝ*<sup>r</sup>*), *n* ≥ 1. Now, we will formulate sufficient conditions for the weak convergence of the distributions of the random elements *Z<sub>n</sub>* = *D<sub>n</sub>*<sup>−1</sup>(*S*<sub>*N<sub>n</sub>*</sub> − *c<sub>n</sub>*) as *n* → ∞.

For *g* ∈ ℝ*<sup>r</sup>*, denote *W<sub>n</sub>*(*g*) = *D<sub>n</sub>*<sup>−1</sup>(*B*<sub>*N<sub>n</sub>*</sub>*g* + *a*<sub>*N<sub>n</sub>*</sub> − *c<sub>n</sub>*). In [13, 14], the following theorem was proved, which establishes sufficient conditions for the weak convergence of multivariate random sequences with independent random indices under operator normalization.

Theorem 1 [14]. *Let* ∥*D<sub>n</sub>*<sup>−1</sup>∥ → ∞ *as n* → ∞ *and let the sequence of random variables* {∥*D<sub>n</sub>*<sup>−1</sup>*B*<sub>*N<sub>n</sub>*</sub>∥}<sub>*n*≥1</sub> *be tight. Assume that there exist a random element Y with distribution H and an r-dimensional random field W*(*g*), *g* ∈ ℝ*<sup>r</sup>*, *such that* (3) *holds and*

$$W\_n(\mathbf{g}) \Rightarrow W(\mathbf{g}) \qquad \qquad (n \to \infty)$$

for *H*-almost all *g* ∈ ℝ*<sup>r</sup>*. Then the random field *W*(*g*) is measurable, depends on *g* linearly, and

$$Z\_n \Rightarrow W(Y) \qquad \qquad (n \to \infty),$$

where the random field *W*(·) and the random element *Y* are independent.
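A classical one-dimensional instance of this transfer scheme is the geometric random sum (the parameters below are purely illustrative): if the summands are independent ±1 signs and *N* is geometric with mean 1/*p*, then √*p*·*S<sub>N</sub>* converges to a Laplace law, whose excess kurtosis is 3, even though each summand has variance 1 and the non-randomly indexed sums are asymptotically normal. A simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 1e-3                 # geometric parameter, E N = 1/p = 1000
reps = 100_000

N = rng.geometric(p, size=reps)       # random number of summands
# S_N = sum of N independent +/-1 signs, simulated as 2*Binomial(N, 1/2) - N
S = 2.0 * rng.binomial(N, 0.5) - N
Z = np.sqrt(p) * S                    # normalized geometric random sum

var = Z.var()
excess_kurtosis = (Z ** 4).mean() / (Z ** 2).mean() ** 2 - 3.0
# var is close to 1; excess_kurtosis is close to 3 (Laplace), not 0 (normal)
```

The binomial trick avoids generating roughly 10<sup>8</sup> individual summands while producing exactly the same distribution.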

Now, consider an auxiliary statement dealing with the identifiability of a special family of mixtures of multivariate normal distributions. Let *U* be a nonnegative random variable. The symbol EΦ<sub>*U*Σ</sub>(·) will denote the distribution which, for each Borel set *A* in ℝ*<sup>r</sup>*, is defined as


$$\mathbf{E}\Phi\_{U\Sigma}(\mathcal{A}) = \int\_0^\infty \Phi\_{\mathfrak{u}\Sigma}(\mathcal{A}) d\mathbf{P}(U < \mathfrak{u})\,.$$

Let 𝒰 be the set of all nonnegative random variables.

It is easy to see that if *Y* is a random vector such that L(*Y*) = Φ<sub>Σ</sub> independent of *U*, then EΦ<sub>*U*Σ</sub> = L(√*U* · *Y*).
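This representation is convenient for simulation. For example (an illustrative sketch, not from the original text: *U* is taken exponential with mean 1, whose Laplace–Stieltjes transform is ψ(*s*) = 1/(1 + *s*), so the mixture is an elliptically contoured Laplace law and its characteristic function is ψ(½ **t**<sup>⊤</sup>Σ**t**)):

```python
import numpy as np

rng = np.random.default_rng(0)
Sigma = np.array([[1.0, 0.3],
                  [0.3, 2.0]])
t = np.array([0.5, -0.4])
n = 200_000

U = rng.exponential(1.0, size=n)                         # mixing variable
Y = rng.multivariate_normal(np.zeros(2), Sigma, size=n)  # Y ~ N(0, Sigma)
Z = np.sqrt(U)[:, None] * Y                              # Z ~ E Phi_{U Sigma}

# empirical characteristic function of Z versus psi(t' Sigma t / 2)
empirical = np.exp(1j * (Z @ t)).mean()
s = 0.5 * t @ Sigma @ t
theoretical = 1.0 / (1.0 + s)
```

The two values agree within Monte Carlo error, illustrating how the mixing law alone controls the departure from normality.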

Lemma 1. *Whatever the nonsingular covariance matrix* Σ *is, the family of distributions* {EΦ<sub>*U*Σ</sub>(·) : *U* ∈ 𝒰} *is identifiable in the sense that if U*<sub>1</sub> ∈ 𝒰, *U*<sub>2</sub> ∈ 𝒰, *and*

$$\mathbf{E}\Phi\_{U\_1\Sigma}(\mathbf{A}) = \mathbf{E}\Phi\_{U\_2\Sigma}(\mathbf{A})\tag{4}$$

for any set *A* ∈ B(ℝ*<sup>r</sup>*), then *U*<sub>1</sub> =<sup>*d*</sup> *U*<sub>2</sub>.


The proof of this lemma is very simple. If *U* ∈ 𝒰, then the characteristic function v<sup>(*U*)</sup>(**t**) corresponding to the distribution EΦ<sub>*U*Σ</sub>(·) has the form

$$\begin{split} \mathbf{v}^{(U)}(\mathbf{t}) &= \int_{0}^{\infty} \exp\left\{-\frac{1}{2} \mathbf{t}^{\top} (u\Sigma)\mathbf{t}\right\} d\mathbf{P}(U < u) = \int_{0}^{\infty} \exp\left\{-\frac{1}{2} u\,\mathbf{t}^{\top} \Sigma \mathbf{t}\right\} d\mathbf{P}(U < u) \\ &= \int_{0}^{\infty} \exp\left\{-us\right\} d\mathbf{P}(U < u), \qquad s = \frac{1}{2} \mathbf{t}^{\top} \Sigma \mathbf{t}, \quad \mathbf{t} \in \mathbb{R}^{r}. \end{split} \tag{5}$$

But the right-hand side of (5) is the Laplace–Stieltjes transform of the random variable *U* evaluated at the point *s*. From (4), it follows that v<sup>(*U*<sub>1</sub>)</sup>(**t**) ≡ v<sup>(*U*<sub>2</sub>)</sup>(**t**), whence by virtue of (5) the Laplace–Stieltjes transforms of the random variables *U*<sub>1</sub> and *U*<sub>2</sub> coincide, whence, in turn, it follows that *U*<sub>1</sub> =<sup>*d*</sup> *U*<sub>2</sub>. The lemma is proved.

Remark 1. When proving Lemma 1, we established a simple but useful by-product result: if *ψ*(*s*) is the Laplace–Stieltjes transform of the random variable *U*, then the characteristic function v<sup>(*U*)</sup>(**t**) corresponding to the distribution EΦ<sub>*U*Σ</sub> has the form

$$\mathbf{v}^{(U)}(\mathbf{t}) = \psi\left(\frac{1}{2}\mathbf{t}^{\top}\Sigma\mathbf{t}\right), \quad \mathbf{t} \in \mathbb{R}^{r}.\tag{6}$$

**3. General theorems**

First, consider the case where the random vectors {*S<sub>n</sub>*}<sub>*n*≥1</sub> are formed as growing sums of independent random variables. Namely, let *X*<sub>1</sub>, *X*<sub>2</sub>, … be independent ℝ*<sup>r</sup>*-valued random vectors, and for *n* ∈ ℕ let

$$S_n = X_1 + \dots + X_n.$$

Consider a sequence of integer-valued positive random variables {*N<sub>n</sub>*}<sub>*n*≥1</sub> such that for each *n* ≥ 1 the random variable *N<sub>n</sub>* is independent of the sequence {*S<sub>k</sub>*}<sub>*k*≥1</sub>. Let {*b<sub>n</sub>*}<sub>*n*≥1</sub> be an infinitely increasing sequence of positive numbers such that

$$\mathcal{L}\left(\frac{S_n}{\sqrt{b_n}}\right) \Rightarrow \Phi_{\Sigma} \tag{7}$$

as *n* → ∞, where Σ is some positive definite matrix.

Let {*d<sub>n</sub>*}<sub>*n*≥1</sub> be an infinitely increasing sequence of positive numbers. As *Z<sub>n</sub>*, take the scalar normalized random vector …
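The mixed Poisson sample-size mechanism mentioned in the introduction can be sketched numerically (an illustrative simulation, not from the original text: the random rate is exponential, so the sample size is geometric and the normalized random sum approaches a Laplace-type law):

```python
import numpy as np

rng = np.random.default_rng(0)
m = 500                  # mean number of summands
reps = 100_000

# deterministic sample size: ordinary CLT behavior, excess kurtosis near 0
S_det = 2.0 * rng.binomial(m, 0.5, size=reps) - m
Z_det = S_det / np.sqrt(m)

# mixed Poisson sample size: N ~ Poisson(Lambda) with Lambda ~ Exp(mean m)
Lam = rng.exponential(m, size=reps)
N = rng.poisson(Lam)
S_rand = 2.0 * rng.binomial(N, 0.5) - N
Z_rand = S_rand / np.sqrt(m)

def excess_kurtosis(z):
    return (z ** 4).mean() / (z ** 2).mean() ** 2 - 3.0
# excess_kurtosis(Z_det) is near 0, excess_kurtosis(Z_rand) is near 3:
# randomizing the sample size alone produces heavy tails
```

Both experiments use summands with all moments finite; only the randomness of the sample size separates the light-tailed limit from the heavy-tailed one, which is the point of the approach developed in this chapter.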
