This statement immediately follows from Theorem 2 with the account of Lemma 1.

**4. Some remarks on the heavy-tailedness of scale mixtures of normals**

The one-dimensional marginals of the multivariate limit law in Theorems 2 and 3 are scale mixtures of normals with zero means of the form EΦ(*x*/*U*), *x* ∈ ℝ, where Φ(*x*) is the standard normal distribution function and *U* is a nonnegative random variable. It turns out, although this is by no means evident, that these distributions are *always* leptokurtic, having a sharper vertex and heavier tails than the normal law itself.

It is easy to see that

$$\mathbf{E}\Phi(x/U) = \mathbf{P}(X \cdot U < x), \quad x \in \mathbb{R},$$

where *X* is a standard normal variable independent of *U*. First, as a measure of leptokurtosity, consider the excess coefficient, which is traditionally used in (descriptive) statistics. Recall that for a random variable *Y* with E*Y*⁴ < ∞ the excess coefficient (kurtosis) *κ*(*Y*) is defined as

$$\kappa(Y) = \mathbf{E}\left(\frac{Y - \mathbf{E}Y}{\sqrt{\mathbf{D}Y}}\right)^{4}.$$

If P(*X* < *x*) = Φ(*x*), then *κ*(*X*) = 3. Densities with sharper vertices (and, respectively, heavier tails) than the normal density have *κ* > 3, whereas *κ* < 3 corresponds to densities with flatter vertices.

Lemma 2. *Let X and U be independent random variables with finite fourth moments; moreover, let* E*X* = 0 *and* P(*U* ≥ 0) = 1*. Then*

$$\kappa(XU) \ge \kappa(X).$$

*Furthermore, κ*(*XU*) = *κ*(*X*) *if and only if* P(*U* = const) = 1*.*

For the proof see [10].

So, if *X* is a standard normal random variable and *U* is a nonnegative random variable with E*U*⁴ < ∞ independent of *X*, then *κ*(*X* · *U*) ≥ 3, and *κ*(*X* · *U*) = 3 if and only if *U* is non-random.
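To see Lemma 2 at work, here is a minimal Monte Carlo sketch (an illustrative addition, not part of the original argument); the lognormal mixing variable *U* is an arbitrary choice satisfying the assumptions of the lemma:

```python
import numpy as np

rng = np.random.default_rng(1)

def kurtosis(y):
    # kappa(Y) = E((Y - EY) / sqrt(DY))**4, the excess coefficient.
    y = np.asarray(y)
    c = y - y.mean()
    return np.mean(c**4) / np.var(y) ** 2

n = 10**6
x = rng.standard_normal(n)             # X standard normal: kappa(X) = 3
u = rng.lognormal(sigma=0.5, size=n)   # U >= 0 with finite fourth moment

print(kurtosis(x))       # approx 3.0
print(kurtosis(x * u))   # > 3, as Lemma 2 guarantees
```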

Using the Jensen inequality, we can easily obtain one more inequality directly connecting the tails of normal mixtures with the tails of the normal distribution.

Lemma 3. *Assume that the random variable U satisfies the normalization condition* E*U*⁻¹ = 1*. Then*

$$1 - \mathbf{E}\Phi(x/U) \ge 1 - \Phi(x), \quad x > 0.$$
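The argument behind Lemma 3 is short; the following reconstruction (a sketch, not the chapter's own proof) spells it out. For fixed *x* > 0 the function *u* ↦ Φ(*xu*) is concave on (0, ∞), since (d²/d*u*²)Φ(*xu*) = −*x*³*u*φ(*xu*) < 0, where φ is the standard normal density. Hence, by Jensen's inequality applied to the random argument *U*⁻¹,

$$\mathbf{E}\Phi(x/U) = \mathbf{E}\Phi\left(xU^{-1}\right) \le \Phi\left(x\,\mathbf{E}U^{-1}\right) = \Phi(x), \qquad x > 0,$$

and subtracting both sides from 1 yields the statement of the lemma.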

From Lemma 3 it follows that if *X* is the standard normal random variable and *U* is a nonnegative random variable independent of *X* with E*U*⁻¹ = 1, then for any *x* ≥ 0

$$\mathbf{P}(|X \cdot U| \ge x) \ge \mathbf{P}(|X| \ge x) = 2[1 - \Phi(x)],$$

that is, scale mixtures of normal laws are always more leptokurtic and have heavier tails than normal laws themselves.

**5.1 Convergence of the distributions of random sums of random vectors to multivariate stable laws**

Let Σ be a positive definite (*r* × *r*)-matrix and let *α* ∈ (0, 2]. A random vector *Z*<sub>*α*,Σ</sub> is said to have the (centered) elliptically contoured stable distribution *G*<sub>*α*,Σ</sub> with characteristic exponent *α* if its characteristic function 𝔤<sub>*α*,Σ</sub>(**t**) has the form

$$\mathfrak{g}_{\alpha,\Sigma}(\mathbf{t}) \equiv \mathbf{E} \exp\left\{i\mathbf{t}^{\top} Z_{\alpha,\Sigma}\right\} = \exp\left\{-\left(\mathbf{t}^{\top} \Sigma \mathbf{t}\right)^{\alpha/2}\right\}, \quad \mathbf{t} \in \mathbb{R}^{r}.$$

Univariate stable distributions are popular examples of heavy-tailed distributions. Their moments of orders *δ* ≥ *α* do not exist (the only exception is the normal law corresponding to *α* = 2). Stable laws, and only they, can be limit distributions for sums of a non-random number of independent identically distributed random variables with infinite variance under linear normalization. Here it will be shown that they can also be limiting for *random* sums of random vectors with *finite covariance matrices*. The result of this subsection generalizes the main theorem of [19] to the multivariate case.

By *ζ*<sub>*α*</sub> we will denote a positive random variable with the one-sided stable distribution corresponding to the characteristic function

$$\mathfrak{g}_{\alpha}(t) = \exp\left\{-|t|^{\alpha} \exp\left\{-\frac{1}{2}i\pi\alpha\,\mathrm{sign}\,t\right\}\right\}, \qquad t \in \mathbb{R},$$

with 0 < *α* ≤ 1 (for more details, see [15] or [4]).

Let *α* ∈ (0, 2]. It is known that if *Y* is a random vector such that L(*Y*) = Φ<sub>Σ</sub>, independent of the random variable *ζ*<sub>*α*/2</sub>, then

$$Z_{\alpha,\Sigma} \stackrel{d}{=} \sqrt{\zeta_{\alpha/2}} \cdot Y \tag{17}$$

(see Proposition 2.5.2 in [4]). In other words,

$$G_{\alpha,\Sigma} = \mathbf{E} \Phi_{\zeta_{\alpha/2}\Sigma}. \tag{18}$$
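For the reader's convenience, here is the conditioning computation that links (17) and (18) (a reconstructed sketch; the scale normalization of *ζ*<sub>*α*/2</sub> is a convention, and any constant factor it produces is absorbed into Σ). Conditionally on *ζ*<sub>*α*/2</sub> = *z*, the vector √*z* · *Y* is normal with covariance matrix *z*Σ, so

$$\mathbf{E}\exp\left\{i\mathbf{t}^{\top}\sqrt{\zeta_{\alpha/2}}\,Y\right\} = \mathbf{E}\exp\left\{-\tfrac{1}{2}\,\zeta_{\alpha/2}\,\mathbf{t}^{\top}\Sigma\mathbf{t}\right\},$$

which is precisely the characteristic function of the scale mixture **E**Φ<sub>*ζ*<sub>*α*/2</sub>Σ</sub>, so that (17) and (18) describe one and the same law; evaluating the right-hand side via the one-sided stable Laplace transform turns it into exp{−*c*(**t**<sup>⊤</sup>Σ**t**)<sup>*α*/2</sup>} with a positive constant *c* depending on the chosen normalization of *ζ*<sub>*α*/2</sub>.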

As in Section 3, let *X*₁, *X*₂, … be independent ℝ^*r*-valued random vectors. For *n* ∈ ℕ, denote *S*<sub>*n*</sub> = *X*₁ + … + *X*<sub>*n*</sub>. Consider a sequence of integer-valued positive random variables {*N*<sub>*n*</sub>}<sub>*n*≥1</sub> such that for each *n* ≥ 1 the random variable *N*<sub>*n*</sub> is independent of the sequence {*S*<sub>*k*</sub>}<sub>*k*≥1</sub>. Let {*b*<sub>*n*</sub>}<sub>*n*≥1</sub> be an infinitely increasing sequence of positive numbers providing convergence (6) with some positive definite matrix Σ. Moreover, in this case, Σ₀ = *c*⁻¹Σ.

Theorem 4. *Let N*<sub>*n*</sub> → ∞ *in probability as n* → ∞*. Assume that the random variables X*₁, *X*₂, … *satisfy condition (7) with an asymptotic covariance matrix* Σ*. Then*

$$L\left(\frac{S_{N_n}}{\sqrt{d_n}}\right) \Rightarrow G_{\alpha,\Sigma} \qquad (n \to \infty)$$

*with some infinitely increasing sequence of positive numbers* {*d*<sub>*n*</sub>}<sub>*n*≥1</sub> *and some α* ∈ (0, 2]*, if and only if*

$$\frac{N_n}{d_n} \Rightarrow \zeta_{\alpha/2}$$

*as n* → ∞.

Proof. This theorem is a direct consequence of Theorem 2 with the account of relations (17) and (18).
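The following self-contained simulation sketch (an illustration added here, not code from the chapter; all names and parameter values are ours) checks the "if" direction of Theorem 4 for Gaussian summands: if *N*<sub>*n*</sub>/*d*<sub>*n*</sub> has the one-sided stable distribution of *ζ*<sub>*α*/2</sub>, the normalized random sum reproduces the law of √*ζ*<sub>*α*/2</sub> · *Y* from (17). The one-sided stable variable is generated by Kanter's representation, and the fact that *S*<sub>*N*</sub> given *N* = *k* is exactly N(0, *k*Σ) for Gaussian summands lets us sample the random sum without looping:

```python
import numpy as np

rng = np.random.default_rng(2)

def positive_stable(delta, size, rng):
    """Kanter's representation of S >= 0 with Laplace transform
    E exp(-s*S) = exp(-s**delta), for 0 < delta < 1."""
    u = rng.uniform(0.0, np.pi, size)
    e = rng.exponential(1.0, size)
    b = (np.sin(delta * u) ** (delta / (1.0 - delta))
         * np.sin((1.0 - delta) * u)
         / np.sin(u) ** (1.0 / (1.0 - delta)))
    return (b / e) ** ((1.0 - delta) / delta)

alpha, r, n_rep, d_n = 1.5, 2, 200_000, 400
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
A = np.linalg.cholesky(Sigma)

def normal_vectors(size):
    # Rows are N(0, Sigma) vectors, since A @ A.T == Sigma.
    return rng.standard_normal((size, r)) @ A.T

# Right-hand side of (17): sqrt(zeta_{alpha/2}) * Y.
zeta = positive_stable(alpha / 2, n_rep, rng)
Z = np.sqrt(zeta)[:, None] * normal_vectors(n_rep)

# Random sum S_{N_n}/sqrt(d_n) with N_n = ceil(d_n * zeta'):
# for Gaussian summands, S_N given N = k is exactly N(0, k*Sigma).
N = np.ceil(d_n * positive_stable(alpha / 2, n_rep, rng)).astype(np.int64)
W = np.sqrt(N / d_n)[:, None] * normal_vectors(n_rep)

for q in (0.5, 0.9, 0.99):   # the two samples should agree in law
    print(q, np.quantile(Z[:, 0], q).round(3), np.quantile(W[:, 0], q).round(3))
```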

Consider now the distribution defined by the characteristic function

$$f_{\alpha,\Sigma}(\mathbf{t}) = \frac{1}{1 + \left(\mathbf{t}^{\top}\Sigma\mathbf{t}\right)^{\alpha/2}}, \qquad \mathbf{t} \in \mathbb{R}^{r}. \tag{19}$$

In [25], the distribution with the characteristic function (19) was called *the r-variate Linnik distribution*. For the properties of the multivariate Linnik distributions, see [25, 26]. The distribution (19) will be called *the* (*centered*) *elliptically contoured multivariate Linnik distribution*.

The *r*-variate Linnik distribution can also be defined in another way. For this purpose, recall that the distribution of a nonnegative random variable *M*<sub>*δ*</sub> whose Laplace transform is

$$\psi_{\delta}(s) \equiv \mathbf{E}e^{-sM_{\delta}} = \frac{1}{1 + s^{\delta}}, \qquad s \ge 0, \tag{20}$$

where 0 < *δ* ≤ 1, is called *the Mittag-Leffler distribution*. It is another example of a heavy-tailed geometrically stable distribution; for more details, see, for example, [17, 18] and the references therein. The Mittag-Leffler distributions are of serious theoretical interest in problems related to thinned (or rarefied) homogeneous flows of events, such as renewal processes, and to anomalous diffusion or relaxation phenomena; see [27, 28] and the references therein.
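Sampling from the Mittag-Leffler law is easy via a well-known product representation (again an illustrative aside, not material from the chapter): if *E* is standard exponential and *S*<sub>*δ*</sub> is positive strictly stable with Laplace transform e^{−s^δ}, independent of *E*, then *M*<sub>*δ*</sub> = *E*^{1/δ} *S*<sub>*δ*</sub> has exactly the Laplace transform (20). Reusing `positive_stable` from the previous sketch:

```python
def mittag_leffler(delta, size, rng):
    """M_delta with E exp(-s*M) = 1/(1 + s**delta), via M = E**(1/delta) * S."""
    e = rng.exponential(1.0, size)
    return e ** (1.0 / delta) * positive_stable(delta, size, rng)

# Numerical check of (20) at a few points s:
m = mittag_leffler(0.7, 10**6, rng)
for s in (0.5, 1.0, 2.0):
    print(s, round(float(np.exp(-s * m).mean()), 4), round(1 / (1 + s**0.7), 4))
```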

In [18], it was demonstrated that

$$L_{1,\alpha} \stackrel{d}{=} Y_{1}\sqrt{2M_{\alpha/2}}, \tag{21}$$

where *Y*₁ is a random variable with the standard univariate normal distribution independent of the random variable *M*<sub>*α*/2</sub> with the Mittag-Leffler distribution with parameter *α*/2.

Now let *Y* be a random vector such that L(*Y*) = Φ<sub>Σ</sub>, where Σ is a positive definite (*r* × *r*)-matrix, independent of the random variable *M*<sub>*α*/2</sub>. By analogy with (21), introduce the random vector *L*<sub>*r*,*α*,Σ</sub> as

$$L_{r,\alpha,\Sigma} \stackrel{d}{=} \sqrt{2M_{\alpha/2}} \cdot Y.$$

Then, in accordance with what has been said in Section 2,

$$L(L_{r,\alpha,\Sigma}) = \mathbf{E}\Phi_{2M_{\alpha/2}\Sigma}. \tag{22}$$

Using Remark 1, we can easily make sure that the two definitions of the multivariate Linnik distribution coincide. Indeed, with the account of (20), according to Remark 1, the characteristic function of the random vector *L*<sub>*r*,*α*,Σ</sub> defined by (22) has the form

$$\mathbf{E}\exp\left\{i\mathbf{t}^{\top}L_{r,\alpha,\Sigma}\right\} = \psi_{\alpha/2}\left(\mathbf{t}^{\top}\Sigma\mathbf{t}\right) = \frac{1}{1 + \left(\mathbf{t}^{\top}\Sigma\mathbf{t}\right)^{\alpha/2}} = f_{\alpha,\Sigma}(\mathbf{t}), \qquad \mathbf{t} \in \mathbb{R}^{r},$$

which coincides with Anderson's definition (19).
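Continuing the simulation sketch above (same assumed helpers `positive_stable`, `mittag_leffler`, `normal_vectors` and parameters `alpha`, `Sigma`, `n_rep`, `rng`), one can sample *L*<sub>*r*,*α*,Σ</sub> by (22) and compare its empirical characteristic function with (19) at a test point:

```python
# Elliptically contoured multivariate Linnik vector via the mixture (22).
m = mittag_leffler(alpha / 2, n_rep, rng)
L_vec = np.sqrt(2.0 * m)[:, None] * normal_vectors(n_rep)

t = np.array([0.3, -0.2])
ecf = np.exp(1j * L_vec @ t).mean().real                    # empirical ch.f.
print(ecf, 1.0 / (1.0 + (t @ Sigma @ t) ** (alpha / 2)))    # ~ equal by (19)
```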

Our definition (22), together with Theorem 2, opens the way to formulate a theorem stating that the multivariate Linnik distribution can be limiting not only for geometric random sums of independent identically distributed random vectors with infinite second moments [29], but also for random sums of independent random vectors with finite covariance matrices.

Theorem 5. *Let N*<sub>*n*</sub> → ∞ *in probability as n* → ∞*. Assume that the random variables X*₁, *X*₂, … *satisfy condition (7) with an asymptotic covariance matrix* Σ*. Then*

$$L\left(\frac{S_{N_n}}{\sqrt{d_n}}\right) \Rightarrow L\left(L_{r,\alpha,\Sigma}\right) \qquad (n \to \infty)$$

*with some infinitely increasing sequence of positive numbers* {*d*<sub>*n*</sub>}<sub>*n*≥1</sub> *and some α* ∈ (0, 2]*, if and only if*

$$\frac{N_n}{d_n} \Rightarrow 2M_{\alpha/2}$$

*as n* → ∞.
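A final numerical illustration in the same running sketch (ours, under the same assumed helpers and parameters): replacing the stable mixing variable by 2*M*<sub>*α*/2</sub> in the random index turns the normalized random sum into an approximately multivariate Linnik vector, as Theorem 2 together with (22) predicts.

```python
# N_n/d_n ~ 2*M_{alpha/2}  ==>  S_{N_n}/sqrt(d_n) is approximately Linnik.
N = np.ceil(d_n * 2.0 * mittag_leffler(alpha / 2, n_rep, rng)).astype(np.int64)
W = np.sqrt(N / d_n)[:, None] * normal_vectors(n_rep)
print(np.exp(1j * W @ t).mean().real,                    # empirical ch.f.
      1.0 / (1.0 + (t @ Sigma @ t) ** (alpha / 2)))      # target value (19)
```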
