60 Applied Linear Algebra in Action

solving algebraic equations are rarely implemented, because they are numerically unstable. In fact, the coefficients of the characteristic polynomial are burdened with rounding errors, and due to numerical instability these errors cause large errors in the computed eigenvalues. Because of that, the characteristic polynomial has mainly theoretical significance. Methods based on the direct application of the characteristic polynomial are used in practice only when the characteristic polynomial is well conditioned. For some structured matrices we can also apply a method based on the characteristic polynomial, but without calculating its coefficients directly. The following example describes a class of such matrices.

**Example 2.1.1.** An example of structured matrices for which the characteristic polynomial can be used to determine the eigenvalues are Toeplitz matrices. A Toeplitz matrix, denoted *T<sub>n</sub>*, is a matrix with constant diagonals. If the Toeplitz matrix is symmetric and positive definite, the recursive relation *p<sub>n</sub>*(*λ*) = *p<sub>n−1</sub>*(*λ*)*β<sub>n−1</sub>*(*λ*) holds, where *p<sub>n</sub>* and *p<sub>n−1</sub>* are the characteristic polynomials of the matrices *T<sub>n</sub>* and *T<sub>n−1</sub>*, respectively, and *β<sub>n−1</sub>* is the Schur–Szegő parameter of the Yule–Walker system. This recursive relation makes it possible to work with the characteristic polynomial without computing its coefficients individually. More information can be found in [1].

**Definition 2.1.1.** The eigenvectors corresponding to *λ* are the nonzero vectors in the solution space of (*A* − *λI*)**x** = **0**. We call this solution space the **eigenspace** of *A* corresponding to *λ*.

The following definition introduces two important terms: the geometric multiplicity and the algebraic multiplicity of an eigenvalue *λ*<sub>0</sub>.

**Definition 2.1.2.** If *λ*<sub>0</sub> is an eigenvalue of an *n* × *n* matrix *A*, then the dimension of the eigenspace corresponding to *λ*<sub>0</sub> is called the **geometric multiplicity** of *λ*<sub>0</sub>, and the number of times that *λ* − *λ*<sub>0</sub> appears as a factor in the characteristic polynomial of *A* is called the **algebraic multiplicity** of *λ*<sub>0</sub>.

Eigenvalues and eigenvectors have some specific properties, which are easy to prove. The following are some of the obvious properties of the eigenvalues of a matrix *A* and the corresponding eigenvectors:

**Theorem 2.1.5.**

**a.** If *μ* ≠ 0 is a complex number, *λ* is an eigenvalue of the matrix *A*, and **x** ≠ **0** is a corresponding eigenvector, then *μ***x** is also a corresponding eigenvector.

**b.** If *k* is a positive integer, *λ* is an eigenvalue of the matrix *A*, and **x** ≠ **0** is a corresponding eigenvector, then *λ<sup>k</sup>* is an eigenvalue of *A<sup>k</sup>* and **x** is a corresponding eigenvector.

**c.** The matrices *A* and *A<sup>T</sup>* have the same eigenvalues.

In linear algebra, invertible matrices are important. From the eigenvalue problem we can easily conclude whether the matrix *A* is invertible or not. What is more, the eigenvalues of *A*<sup>−1</sup> can be read immediately from the eigenvalues of an invertible matrix *A*: they are their reciprocals. Because of that, in the following theorem we summarize some properties of invertible matrices.

**Theorem 2.1.6.** If *A* is an *n* × *n* matrix, then the following are equivalent:

**a.** *A* is invertible.

**b.** *λ* = 0 is not an eigenvalue of *A*.
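The instability of the characteristic-polynomial route discussed above can be illustrated with a short NumPy sketch. The diagonal test matrix is an assumed example chosen for illustration: its eigenvalues 1, …, 20 make its characteristic polynomial the classically ill-conditioned Wilkinson polynomial.

```python
import numpy as np

# Assumed illustrative matrix: diagonal with eigenvalues 1..20,
# whose characteristic polynomial is the ill-conditioned
# Wilkinson polynomial.
A = np.diag(np.arange(1.0, 21.0))

# Route 1: form the coefficients of the characteristic polynomial,
# then find its roots.
coeffs = np.poly(A)                      # coefficients of det(lambda*I - A)
roots = np.sort(np.roots(coeffs).real)   # eigenvalue estimates via root-finding

# Route 2: direct eigensolver, which never forms the coefficients.
eigs = np.sort(np.linalg.eigvals(A).real)

exact = np.arange(1.0, 21.0)
print("max error via char. polynomial :", np.max(np.abs(roots - exact)))
print("max error via direct eigensolver:", np.max(np.abs(eigs - exact)))
```

Even though the matrix itself is as well behaved as possible, merely rounding the polynomial's coefficients to double precision moves some roots far from the true eigenvalues, while the direct eigensolver recovers them essentially exactly.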


The problem of finding a basis of *ℝ<sup>n</sup>* consisting of eigenvectors of a given matrix is very important in linear algebra. Because of that, in this section we will consider the following two equivalent problems:

**The Eigenvector Problem.** Given an *n* × *n* matrix *A*, does there exist a basis for *ℝ<sup>n</sup>* consisting of eigenvectors of *A*?

**The Diagonalization Problem (Matrix Form).** Given an *n* × *n* matrix *A*, does there exist an invertible matrix *P* such that *P*<sup>−1</sup>*AP* is a diagonal matrix?
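Both questions can be answered at once by a matrix *P* whose columns are eigenvectors of *A*. A minimal NumPy sketch, using an assumed symmetric example matrix (symmetry guarantees that an eigenvector basis exists):

```python
import numpy as np

# Assumed example: a symmetric matrix, which always has a basis
# of eigenvectors and is therefore diagonalizable.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# The columns of P are eigenvectors of A; they form a basis of R^n.
eigenvalues, P = np.linalg.eig(A)

# The similarity transformation P^{-1} A P is a diagonal matrix
# with the eigenvalues on its diagonal.
D = np.linalg.inv(P) @ A @ P
print(np.round(D, 10))   # off-diagonal entries vanish
print(eigenvalues)       # eigenvalues 1 and 3, in some order
```

The same matrix *P* thus solves the eigenvector problem (its columns are the basis) and the diagonalization problem (it is the required similarity transformation).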

The latter problem suggests the following terminology.

**Definition 2.1.3.** Two square matrices *A* and *B* are called **similar** if there is an invertible matrix *P* such that *B* = *P*<sup>−1</sup>*AP*. The transition from the matrix *A* to the matrix *B* is called a **similarity transformation**.

The importance of similar matrices can be seen in the following theorem:

**Theorem 2.1.7.** Similar matrices *A* and *B* have the same characteristic polynomial. Consequently, they have the same eigenvalues, and each eigenvalue λ<sub>0</sub> has the same geometric multiplicity and the same algebraic multiplicity in both matrices.
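Theorem 2.1.7 can be checked numerically. In this sketch the matrices *A* and *P* are assumed examples; *P* only needs to be invertible:

```python
import numpy as np

# Assumed example matrices; P is invertible (det P = 1).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
P = np.array([[1.0, 2.0],
              [0.0, 1.0]])

B = np.linalg.inv(P) @ A @ P   # similarity transformation

# Same characteristic polynomial => identical coefficient vectors...
print(np.poly(A))   # coefficients of det(lambda*I - A)
print(np.poly(B))

# ...and therefore the same eigenvalues (here 2 and 5).
print(np.sort(np.linalg.eigvals(A)))
print(np.sort(np.linalg.eigvals(B)))
```

Here *A* has trace 7 and determinant 10, so both matrices share the characteristic polynomial λ² − 7λ + 10 with roots 2 and 5.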

Based on the previous definitions, we can define the term *diagonalizable* as follows:

**Definition 2.1.4.** A square matrix *A* is called **diagonalizable** if it can be transformed into a diagonal matrix by a similarity transformation, i.e., if there exists an invertible matrix *P* such that *P*<sup>−1</sup>*AP* is diagonal.
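Not every square matrix is diagonalizable. A small sketch with an assumed example, a 2 × 2 shear matrix whose single eigenvalue has algebraic multiplicity 2 but geometric multiplicity 1, so no basis of eigenvectors exists:

```python
import numpy as np

# Assumed example: a shear matrix with the single eigenvalue 1,
# whose characteristic polynomial is (lambda - 1)^2, i.e.
# algebraic multiplicity 2.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

lam = 1.0
# Geometric multiplicity = dim of the nullspace of (A - lambda*I)
#                        = n - rank(A - lambda*I).
geometric = 2 - np.linalg.matrix_rank(A - lam * np.eye(2))
print(geometric)   # 1, which is less than 2, so A is not diagonalizable
```

Because the geometric multiplicity (1) is smaller than the algebraic multiplicity (2), the eigenvectors of *A* span only a line, and no invertible *P* can bring *A* to diagonal form.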

The following theorem shows that the above two problems are indeed equivalent.
