Eigenvalues and Eigenvectors

Consider a square matrix A of order n and the set of all n-dimensional vectors. The matrix A is a linear operator on this space of vectors. This means that A operates on each vector producing another vector, and that the following linearity property holds for any vectors x, y and any scalars α, β:

A(αx + βy) = αAx + βAy

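As a quick numerical check of the linearity property, here is a minimal numpy sketch; the particular matrix, vectors, and scalars are arbitrary illustrative choices, not taken from the text:

```python
import numpy as np

# Hypothetical 2x2 matrix and vectors chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
x = np.array([1.0, -1.0])
y = np.array([0.5, 2.0])
alpha, beta = 3.0, -2.0

# Linearity: A(alpha*x + beta*y) equals alpha*(A x) + beta*(A y).
lhs = A @ (alpha * x + beta * y)
rhs = alpha * (A @ x) + beta * (A @ y)
assert np.allclose(lhs, rhs)
```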
Consider now the set of nonzero vectors x such that the following property holds for some scalar λ:

Ax = λx

Any vector for which the above property holds is called an eigenvector of the matrix A, and the corresponding value of λ is called an eigenvalue.

To determine the eigenvectors of a matrix and the corresponding eigenvalues, consider that the equation Ax = λx can be written as follows:

(A − λI)x = 0

which can, in turn, be written as a system of linear equations:

$$\begin{aligned}
(a_{1,1}-\lambda)x_1 + a_{1,2}x_2 + \cdots + a_{1,n}x_n &= 0\\
a_{2,1}x_1 + (a_{2,2}-\lambda)x_2 + \cdots + a_{2,n}x_n &= 0\\
&\;\;\vdots\\
a_{n,1}x_1 + a_{n,2}x_2 + \cdots + (a_{n,n}-\lambda)x_n &= 0
\end{aligned}$$

This system of equations has nontrivial solutions only if the matrix A − λI is singular. To determine the eigenvectors and the eigenvalues of the matrix A we must therefore solve the equation

det(A − λI) = 0

The expansion of this determinant yields a polynomial φ(λ) of degree n, known as the characteristic polynomial of the matrix A. The equation φ(λ) = 0 is known as the characteristic equation of the matrix A. In general, this equation will have n roots λ_s, s = 1, …, n, which are the eigenvalues of the matrix A. To each of these eigenvalues corresponds a solution of the system of linear equations, as illustrated below:
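The agreement between the roots of the characteristic polynomial and the eigenvalues can be checked numerically. In the sketch below (a worked example with an arbitrary 2×2 matrix), `numpy.poly` returns the coefficients of the characteristic polynomial det(A − λI), and its roots match the eigenvalues computed directly:

```python
import numpy as np

# Arbitrary 2x2 example: det(A - lambda*I) = lambda^2 - 7*lambda + 10,
# whose roots are 5 and 2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

coeffs = np.poly(A)          # characteristic polynomial coefficients, highest degree first
roots = np.roots(coeffs)     # roots of the characteristic equation
eigs = np.linalg.eigvals(A)  # eigenvalues computed directly

assert np.allclose(sorted(roots), sorted(eigs))
```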

$$\begin{pmatrix}
a_{1,1}-\lambda_s & a_{1,2} & \cdots & a_{1,n}\\
a_{2,1} & a_{2,2}-\lambda_s & \cdots & a_{2,n}\\
\vdots & \vdots & \ddots & \vdots\\
a_{n,1} & a_{n,2} & \cdots & a_{n,n}-\lambda_s
\end{pmatrix}
\begin{pmatrix}
x_{1s}\\ x_{2s}\\ \vdots\\ x_{ns}
\end{pmatrix} = 0$$

Each solution represents the eigenvector x_s corresponding to the eigenvalue λ_s. As we will see in Chapter 12, the determination of eigenvalues and eigenvectors is the basis for principal component analysis.
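One way to recover the eigenvector for a known eigenvalue is to find a nontrivial solution of the singular system (A − λ_s I)x = 0, i.e. a null-space vector of A − λ_s I. The sketch below does this via the singular value decomposition (an assumption of this example, not a method prescribed by the text), reusing the 2×2 matrix whose eigenvalues are 5 and 2:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0  # one eigenvalue of A (root of lambda^2 - 7*lambda + 10)

# The null space of the singular matrix A - lam*I is spanned by the right
# singular vector associated with its zero singular value.
_, s, Vt = np.linalg.svd(A - lam * np.eye(2))
x = Vt[-1]  # null-space direction: an eigenvector for lam

assert np.isclose(s[-1], 0.0)       # A - lam*I is indeed singular
assert np.allclose(A @ x, lam * x)  # x satisfies A x = lam x
```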