**5.2 Convergence of the distributions of random sums of random vectors with finite covariance matrices to multivariate elliptically contoured Linnik distributions**

In 1953, Yu. V. Linnik [20] introduced the class of univariate symmetric probability distributions defined by the characteristic functions

$$f_\alpha^{(L)}(t) = \frac{1}{1+|t|^\alpha}, \quad t \in \mathbb{R},$$

where *α* ∈ (0, 2]. Later, the distributions of this class were called *Linnik distributions* [21] or *α-Laplace distributions* [22]. Here the first term will be used since it has become conventional. With *α* = 2, the Linnik distribution turns into the Laplace distribution corresponding to the density

$$f^{\Lambda}(x) = \frac{1}{2}e^{-|x|}, \qquad x \in \mathbb{R}.$$

A random variable with the Linnik distribution with parameter *α* will be denoted $L_{1,\alpha}$.

The Linnik distributions possess many interesting analytic properties (see, e.g., [17, 18] and the references therein), but perhaps most often they are recalled as examples of geometric stable distributions, which are often used as heavy-tailed models of statistical regularities in financial data [23, 24].

The multivariate Linnik distribution was introduced by D. N. Anderson in [25] where it was proved that the function

$$f_{\alpha,\Sigma}^{(L)}(\mathbf{t}) = \frac{1}{1 + \left(\mathbf{t}^\top \Sigma \mathbf{t}\right)^{\alpha/2}}, \quad \mathbf{t} \in \mathbb{R}^r, \quad \alpha \in (0, 2), \tag{19}$$

is the characteristic function of an *r*-variate probability distribution, where $\Sigma$ is a positive definite $(r \times r)$-matrix. In [25], the distribution corresponding to the characteristic function (19) was called *the r-variate Linnik distribution*. For the properties of the multivariate Linnik distributions, see [25, 26].

*From Asymptotic Normality to Heavy-Tailedness via Limit Theorems for Random Sums… DOI: http://dx.doi.org/10.5772/intechopen.89659*

The *r*-variate Linnik distribution can also be defined in another way. For this purpose, recall that the distribution of a nonnegative random variable $M_\delta$ whose Laplace transform is

$$\psi_\delta(s) \equiv \mathrm{E}e^{-sM_\delta} = \frac{1}{1+s^\delta}, \qquad s \ge 0, \tag{20}$$

where 0 < *δ* ≤ 1, is called *the Mittag-Leffler distribution*. It is another example of a heavy-tailed geometrically stable distribution; for more details see, for example, [17, 18] and the references therein. The Mittag-Leffler distributions are of serious theoretical interest in problems related to thinned (or rarefied) homogeneous flows of events such as renewal processes, as well as to anomalous diffusion or relaxation phenomena; see [27, 28] and the references therein. In [18], it was demonstrated that

$$L_{1,\alpha} \stackrel{d}{=} Y_1 \cdot \sqrt{2M_{\alpha/2}}, \tag{21}$$

where $Y_1$ is a random variable with the standard univariate normal distribution, independent of the random variable $M_{\alpha/2}$ with the Mittag-Leffler distribution with parameter $\alpha/2$.
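As a numerical aside (ours, not part of the original text), the Mittag-Leffler variable in (20) admits the well-known representation $M_\delta = W^{1/\delta} S_\delta$, where $W$ is standard exponential and $S_\delta$ is a one-sided $\delta$-stable variable with Laplace transform $e^{-s^\delta}$, so that $\mathrm{E}e^{-sM_\delta} = \mathrm{E}e^{-s^\delta W} = 1/(1+s^\delta)$. Combined with (21), this gives a simple sampler for the univariate Linnik law. A minimal NumPy sketch (function names are ours), using Kanter's method for $S_\delta$:

```python
import numpy as np

def positive_stable(delta, size, rng):
    """Kanter's representation of a one-sided delta-stable variable S
    with Laplace transform E exp(-s*S) = exp(-s**delta), 0 < delta < 1."""
    u = rng.uniform(0.0, np.pi, size)
    e = rng.exponential(1.0, size)
    return (np.sin(delta * u) / np.sin(u) ** (1.0 / delta)
            * (np.sin((1.0 - delta) * u) / e) ** ((1.0 - delta) / delta))

def mittag_leffler(delta, size, rng):
    """M_delta = W**(1/delta) * S_delta has Laplace transform
    1/(1 + s**delta), cf. (20)."""
    w = rng.exponential(1.0, size)
    return w ** (1.0 / delta) * positive_stable(delta, size, rng)

def linnik(alpha, size, rng):
    """Univariate Linnik variable via the mixture representation (21)."""
    m = mittag_leffler(alpha / 2.0, size, rng)
    return rng.standard_normal(size) * np.sqrt(2.0 * m)

rng = np.random.default_rng(1)
x = linnik(1.0, 200_000, rng)
# empirical characteristic function at t = 1; should approach 1/(1+|1|**1) = 0.5
print(np.cos(x).mean())
```

The Monte Carlo average of $\cos(tL_{1,\alpha})$ estimates the characteristic function $1/(1+|t|^\alpha)$, which provides a quick sanity check of the sampler.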

Now let $Y$ be a random vector such that $\mathcal{L}(Y) = \Phi_\Sigma$, where $\Sigma$ is a positive definite $(r \times r)$-matrix, independent of the random variable $M_{\alpha/2}$. By analogy with (21), introduce the random vector $L_{r,\alpha,\Sigma}$ as

$$L_{r,\alpha,\Sigma} = \sqrt{2M_{\alpha/2}} \cdot Y.$$

Then, in accordance with what has been said in Section 2,

$$\mathcal{L}(L_{r,\alpha,\Sigma}) = \mathrm{E}\Phi_{2M_{\alpha/2}\Sigma}. \tag{22}$$

The distribution (22) will be called *the (centered) elliptically contoured multivariate Linnik distribution*.

Using Remark 1, we can easily make sure that the two definitions of the multivariate Linnik distribution coincide. Indeed, taking (20) into account, according to Remark 1, the characteristic function of the random vector $L_{r,\alpha,\Sigma}$ defined by (22) has the form

$$\mathrm{E}\exp\left\{i\mathbf{t}^{\top}L_{r,\alpha,\Sigma}\right\} = \psi_{\alpha/2}\left(\mathbf{t}^{\top}\Sigma\mathbf{t}\right) = \frac{1}{1+\left(\mathbf{t}^{\top}\Sigma\mathbf{t}\right)^{\alpha/2}} = f_{\alpha,\Sigma}^{(L)}(\mathbf{t}), \qquad \mathbf{t}\in\mathbb{R}^{r},$$

that coincides with Anderson's definition (19).
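The mixture representation (22) also gives a direct way to sample $L_{r,\alpha,\Sigma}$ and to check Anderson's characteristic function (19) by Monte Carlo. The following self-contained NumPy sketch (function names and parameter values are ours) draws $Y \sim \Phi_\Sigma$ via a Cholesky factor and scales it by $\sqrt{2M_{\alpha/2}}$, with the Mittag-Leffler variable generated through the standard representation $M_\delta = W^{1/\delta} S_\delta$ (Kanter's method for the one-sided stable component):

```python
import numpy as np

def mittag_leffler(delta, size, rng):
    """M_delta with Laplace transform 1/(1 + s**delta), via
    M = W**(1/delta) * S, W ~ Exp(1), S one-sided delta-stable (Kanter)."""
    u = rng.uniform(0.0, np.pi, size)
    e = rng.exponential(1.0, size)
    s = (np.sin(delta * u) / np.sin(u) ** (1.0 / delta)
         * (np.sin((1.0 - delta) * u) / e) ** ((1.0 - delta) / delta))
    return rng.exponential(1.0, size) ** (1.0 / delta) * s

def multivariate_linnik(alpha, sigma, size, rng):
    """L_{r,alpha,Sigma} = sqrt(2 M_{alpha/2}) * Y, Y ~ N(0, Sigma), cf. (22)."""
    r = sigma.shape[0]
    y = rng.standard_normal((size, r)) @ np.linalg.cholesky(sigma).T
    m = mittag_leffler(alpha / 2.0, size, rng)
    return np.sqrt(2.0 * m)[:, None] * y

rng = np.random.default_rng(7)
alpha = 1.0
sigma = np.array([[2.0, 1.0], [1.0, 2.0]])
t = np.array([0.5, 0.0])
L = multivariate_linnik(alpha, sigma, 200_000, rng)
# empirical characteristic function vs. (19): 1/(1 + (t' Sigma t)**(alpha/2))
emp = np.cos(L @ t).mean()
ref = 1.0 / (1.0 + (t @ sigma @ t) ** (alpha / 2.0))
print(emp, ref)
```

By symmetry the imaginary part of the characteristic function vanishes, so the average of $\cos(\mathbf{t}^\top L)$ alone estimates $f_{\alpha,\Sigma}^{(L)}(\mathbf{t})$.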

Our definition (22), together with Theorem 2, opens the way to a theorem stating that the multivariate Linnik distribution can be limiting not only for geometric random sums of independent identically distributed random vectors with infinite second moments [29], but also for random sums of independent random vectors with finite covariance matrices.

Theorem 5. *Let $N_n \to \infty$ in probability as $n \to \infty$. Assume that the random variables $X_1, X_2, \ldots$ satisfy condition (7) with an asymptotic covariance matrix $\Sigma$. Then*


$$\mathcal{L}\left(\frac{S_{N_n}}{\sqrt{d_n}}\right) \Rightarrow \mathcal{L}(L_{r,\alpha,\Sigma}) \qquad (n \to \infty)$$

with some infinitely increasing sequence of positive numbers $\{d_n\}_{n \ge 1}$ and some $\alpha \in (0, 2]$, if and only if

$$\frac{N_n}{d_n} \Rightarrow 2M_{\alpha/2}$$

as $n \to \infty$.

Proof. This theorem is a direct consequence of Theorem 2, taking relation (22) into account.
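As a closing illustration (ours, not from the original text), Theorem 5 can be checked by simulation under two simplifying assumptions: the summands are i.i.d. $N(0, \Sigma)$, so that condition (7) holds with asymptotic covariance matrix $\Sigma$, and the indices are constructed as $N_n = \lceil 2M_{\alpha/2} d_n \rceil$, which satisfies $N_n/d_n \Rightarrow 2M_{\alpha/2}$. For Gaussian summands the conditional law of $S_N$ given $N$ is exactly $N(0, N\Sigma)$, which lets us draw the normalized sum without actually adding a heavy-tailed number of terms:

```python
import numpy as np

def mittag_leffler(delta, size, rng):
    """M_delta with Laplace transform 1/(1 + s**delta), via
    M = W**(1/delta) * S, W ~ Exp(1), S one-sided delta-stable (Kanter)."""
    u = rng.uniform(0.0, np.pi, size)
    e = rng.exponential(1.0, size)
    s = (np.sin(delta * u) / np.sin(u) ** (1.0 / delta)
         * (np.sin((1.0 - delta) * u) / e) ** ((1.0 - delta) / delta))
    return rng.exponential(1.0, size) ** (1.0 / delta) * s

rng = np.random.default_rng(3)
alpha, d_n, size = 1.0, 10_000, 200_000
sigma = np.eye(2)

# N_n = ceil(2 * M_{alpha/2} * d_n), so that N_n / d_n => 2 M_{alpha/2}
n_idx = np.ceil(2.0 * mittag_leffler(alpha / 2.0, size, rng) * d_n)

# For i.i.d. N(0, Sigma) summands, S_N given N is exactly N(0, N * Sigma):
# draw S_{N_n} / sqrt(d_n) directly as sqrt(N_n / d_n) * Z, Z ~ N(0, Sigma)
z = rng.standard_normal((size, 2))
s_norm = np.sqrt(n_idx / d_n)[:, None] * z

t = np.array([1.0, 0.0])
emp = np.cos(s_norm @ t).mean()
ref = 1.0 / (1.0 + (t @ sigma @ t) ** (alpha / 2.0))   # Linnik ch.f. at t
print(emp, ref)
```

The empirical characteristic function of the normalized random sum should approach that of the multivariate Linnik distribution, here $1/(1 + (\mathbf{t}^\top\Sigma\mathbf{t})^{\alpha/2}) = 0.5$ for the chosen $\mathbf{t}$, in line with the theorem.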
