*1.5.2 Values of Xs in terms of PCs*

The inverse relation is

$$\mathbf{x} = \mathbf{A}'^{-1}\mathbf{c} = \mathbf{B}\mathbf{c},\tag{19}$$

where

$$\mathbf{B} = \mathbf{A}'^{-1},\tag{20}$$

where **B** is the matrix of *loadings* of the $X\_v$ on the PCs $C\_j$. Actually, **A** is an orthonormal matrix (meaning that its columns are of length one and are pairwise orthogonal), so $\mathbf{A}^{-1} = \mathbf{A}'$. Thus, $\mathbf{B} = \mathbf{A}$. Therefore,

$$\mathbf{x} = \mathbf{A}'^{-1}\mathbf{c} = \mathbf{A}\mathbf{c}.\tag{21}$$
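The inverse relation (21) can be checked numerically. The sketch below uses synthetic (hypothetical) data and numpy only: the columns of **A** are the orthonormal eigenvectors of the sample covariance matrix, the scores are $\mathbf{c} = \mathbf{A}'\mathbf{x}$, and $\mathbf{A}\mathbf{c}$ recovers $\mathbf{x}$ exactly.

```python
import numpy as np

# Minimal sketch with synthetic data: verify x = A c, where c = A' x and
# the columns of A are orthonormal eigenvectors of the covariance matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))          # 100 observations of p = 4 variables

S = np.cov(X, rowvar=False)            # sample covariance matrix
eigvals, A = np.linalg.eigh(S)         # columns of A are orthonormal eigenvectors

x = X[0]                               # one observation vector
c = A.T @ x                            # scores on the principal components
x_back = A @ c                         # inverse relation (21): x = A c
print(np.allclose(x, x_back))          # True, since A'^{-1} = A for orthonormal A
```

Because **A** is orthonormal, no matrix inversion is ever needed in practice: the transpose does the job in both directions.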

Letting $\mathbf{a}^{(v)\prime}$ be the $v$th row of the matrix **A**, that is,

$$\mathbf{a}^{(v)'} = (a\_{v1}, a\_{v2}, \dots, a\_{vp}),\tag{22}$$

we have, for $v = 1, 2, \dots, p$, the representation of each variable $X\_v$ in terms of the variables $C\_1, C\_2, \dots, C\_p$ that are the principal components,

$$X\_v = a\_{v1}C\_1 + a\_{v2}C\_2 + \cdots + a\_{vp}C\_p.\tag{23}$$

In terms of the first *k* PCs, this is

$$X\_v = a\_{v1}C\_1 + a\_{v2}C\_2 + \cdots + a\_{vk}C\_k + \varepsilon\_v,\tag{24}$$

where the error $\varepsilon\_v$ is

$$\varepsilon\_v = a\_{v,k+1}C\_{k+1} + a\_{v,k+2}C\_{k+2} + \cdots + a\_{vp}C\_p.\tag{25}$$
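The split of each $X\_v$ into a first-$k$-PC part and an error term can be illustrated with a short numerical sketch; the data here are hypothetical and the choice $k = 2$ is arbitrary.

```python
import numpy as np

# Sketch with synthetic data: split each variable X_v into its first-k-PC
# part, as in (24), and the error term (25) built from the remaining PCs.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
Xc = X - X.mean(axis=0)                    # center the variables

S = np.cov(Xc, rowvar=False)
eigvals, A = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]          # sort PCs by decreasing eigenvalue
A = A[:, order]

C = Xc @ A                                 # PC scores, one column per C_j
k = 2
approx = C[:, :k] @ A[:, :k].T             # a_{v1}C_1 + ... + a_{vk}C_k, all v
error = C[:, k:] @ A[:, k:].T              # eps_v from the trailing p - k PCs
print(np.allclose(Xc, approx + error))     # True: (24) plus (25) recovers X_v
```

Dropping the `error` term leaves the usual rank-$k$ PCA approximation of the data, whose per-variable residual variance is the sum of the trailing eigenvalue contributions.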

The covariance matrix can be represented in terms of its *principal idempotents* $\mathbf{a}\_j\mathbf{a}\_j'$ as

$$\mathbf{S} = \sum\_{j=1}^{p} \lambda\_j \mathbf{a}\_j \mathbf{a}\_j'. \tag{26}$$

It follows from this representation that the best rank-$k$ approximation to **S** is the eigenvalue-weighted sum of the first $k$ principal idempotents,

$$\mathbf{S}^{(k)} = \sum\_{j=1}^{k} \lambda\_j \mathbf{a}\_j \mathbf{a}\_j'. \tag{27}$$

The weights are all non-negative, recalling that the eigenvalues of a positive semi-definite matrix, such as a covariance matrix, are non-negative.
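The spectral decomposition (26) and its rank-$k$ truncation (27) can be sketched as follows, again on synthetic data, with the hypothetical choice $k = 2$.

```python
import numpy as np

# Sketch with synthetic data: rebuild S from its principal idempotents,
# as in (26), and form the rank-k approximation S^(k) of (27).
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 4))
S = np.cov(X, rowvar=False)

eigvals, A = np.linalg.eigh(S)
order = np.argsort(eigvals)[::-1]          # decreasing eigenvalue order
eigvals, A = eigvals[order], A[:, order]

# Sum of all p eigenvalue-weighted idempotents reproduces S exactly.
S_full = sum(lam * np.outer(a, a) for lam, a in zip(eigvals, A.T))
print(np.allclose(S, S_full))              # True

k = 2
S_k = sum(lam * np.outer(a, a)             # rank-k approximation (27)
          for lam, a in zip(eigvals[:k], A.T[:k]))
print(np.linalg.matrix_rank(S_k))          # 2
```

Each term $\lambda_j \mathbf{a}_j\mathbf{a}_j'$ is a rank-one matrix, so truncating the sum after $k$ terms yields a matrix of rank at most $k$.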
