On-Line Geometric Modeling Notes
EIGENVALUES AND EIGENVECTORS
In engineering applications, eigenvalue problems are among the most
important problems connected with matrices. In this section we give
the basic definitions of eigenvalues and eigenvectors and some of the
basic results of their use.
What are Eigenvalues and Eigenvectors?
Let $A$ be an $n \times n$ matrix and consider the vector equation

$$A \vec{x} = \lambda \vec{x}$$

where $\lambda$ is a scalar value.
It is clear that if $\vec{x} = \vec{0}$, we have a solution for any value of $\lambda$. A value of $\lambda$ for which the equation has a solution $\vec{x}$ with $\vec{x} \neq \vec{0}$ is called an eigenvalue or
characteristic value of the matrix $A$. The corresponding solutions $\vec{x}$ are called eigenvectors or
characteristic vectors of $A$.
In the problem above, we are looking for vectors that,
when multiplied by the matrix $A$, give a scalar multiple of themselves.
The set of eigenvalues of $A$ is commonly called the spectrum of $A$,
and the largest of the absolute values of the eigenvalues is
called the spectral radius of $A$.
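As a concrete illustration, here is a minimal sketch using NumPy (the library and the example matrix are assumptions, not part of these notes) that computes the eigenvalues and eigenvectors of a small matrix and checks the defining relation $A \vec{x} = \lambda \vec{x}$:

```python
# A minimal sketch, assuming NumPy and a hypothetical 2x2 example matrix.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# eigvals[i] is an eigenvalue; eigvecs[:, i] is the corresponding eigenvector.
eigvals, eigvecs = np.linalg.eig(A)

# Check the defining equation A x = lambda x for each eigenpair.
for i in range(len(eigvals)):
    lam = eigvals[i]
    x = eigvecs[:, i]
    assert np.allclose(A @ x, lam * x)

# The spectral radius is the largest of the absolute values of the eigenvalues.
spectral_radius = max(abs(eigvals))
```

For this matrix the eigenvalues turn out to be $5$ and $2$, so the spectral radius is $5$.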
How Do We Calculate the Eigenvalues?
It is easy to see that the equation $A \vec{x} = \lambda \vec{x}$
can be rewritten as

$$( A - \lambda I ) \vec{x} = \vec{0}$$

where $I$ is the identity matrix.
A homogeneous matrix equation of this form has a nonzero solution
only if the determinant of the matrix is zero (see
Cramer's Rule)
- that is, if

$$\det ( A - \lambda I ) = 0$$
Since this equation is a
polynomial of degree $n$ in $\lambda$, commonly called the characteristic
polynomial, we only need to find the roots of this
polynomial to find the eigenvalues.
We note that to get a complete set of eigenvalues, one may have to
extend the scope of this discussion into the field of complex numbers.
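The root-finding view can be sketched directly (again assuming NumPy and the same hypothetical example matrix): build the coefficients of the characteristic polynomial $\det(A - \lambda I)$ and take its roots.

```python
# A sketch, assuming NumPy: eigenvalues as roots of the characteristic polynomial.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# np.poly applied to a square matrix returns the coefficients of its
# characteristic polynomial; here lambda^2 - 7*lambda + 10.
coeffs = np.poly(A)

# The roots of the characteristic polynomial are the eigenvalues.
roots = np.roots(coeffs)
assert np.allclose(sorted(roots.real), [2.0, 5.0])
```

In general the roots may be complex even for a real matrix, which is why the discussion may need to move into the field of complex numbers.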
How Do We Calculate the Eigenvectors?
The eigenvalues must be determined first. Once these are known, the
corresponding eigenvectors can be calculated directly from the linear
system

$$( A - \lambda I ) \vec{x} = \vec{0}$$

It should be noted that if $\vec{x}$ is an eigenvector, then so is $\alpha \vec{x}$
for any nonzero scalar $\alpha$.
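One way to sketch this step (assuming NumPy and the same hypothetical example matrix) is to find a nonzero vector in the null space of $A - \lambda I$, which can be read off from its singular value decomposition:

```python
# A sketch, assuming NumPy: recover an eigenvector from (A - lambda*I) x = 0.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
lam = 5.0  # a known eigenvalue of A (root of lambda^2 - 7*lambda + 10)

M = A - lam * np.eye(2)
# The right-singular vector belonging to the (near-)zero singular value
# spans the null space of M.
_, s, Vt = np.linalg.svd(M)
x = Vt[-1]

assert np.allclose(A @ x, lam * x)
# Any nonzero scalar multiple alpha*x is also an eigenvector.
assert np.allclose(A @ (3.0 * x), lam * (3.0 * x))
```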
Right Eigenvectors
Given an eigenvalue $\lambda$, the eigenvector $\vec{x}$
that satisfies

$$A \vec{x} = \lambda \vec{x}$$

is sometimes called a (right) eigenvector for the
matrix $A$ corresponding to the eigenvalue $\lambda$.
If $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the eigenvalues and
$\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_n$ are the corresponding right eigenvectors,
then it is easy to
see that the set of right eigenvectors forms a basis of a
vector space.
If this vector space is of dimension $n$, then we can construct
an $n \times n$ matrix $X$ whose columns are the components of the
right eigenvectors, which has the property that

$$A X = X \Lambda$$

where $\Lambda$ is the diagonal matrix
whose diagonal elements are the eigenvalues.
By appropriate numbering of the eigenvalues and eigenvectors,
it is possible to arrange the columns of the matrix $X$
so that

$$\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$$
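The relation $A X = X \Lambda$ can be checked numerically (a sketch assuming NumPy and the same hypothetical example matrix):

```python
# A sketch, assuming NumPy: verify A X = X Lambda, where the columns of X
# are right eigenvectors and Lambda is the diagonal matrix of eigenvalues.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, X = np.linalg.eig(A)   # columns of X are right eigenvectors
Lam = np.diag(eigvals)          # diagonal matrix of eigenvalues

assert np.allclose(A @ X, X @ Lam)
```

Note the order matters: $X \Lambda$ scales the columns of $X$, while $\Lambda X$ would scale its rows.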
Left Eigenvectors
A vector $\vec{y}$ so that

$$\vec{y}^T A = \lambda \vec{y}^T$$

is called a left eigenvector for $A$
corresponding to the eigenvalue $\lambda$.
If $\lambda_1, \lambda_2, \ldots, \lambda_n$ are the eigenvalues and
$\vec{y}_1, \vec{y}_2, \ldots, \vec{y}_n$ are the corresponding left eigenvectors,
then it is easy to
see that the set of left eigenvectors forms a basis of a
vector space.
If this vector space is of dimension $n$, then we can construct
an $n \times n$ matrix $Y$ whose rows are the components of the
left eigenvectors, which has the property that

$$Y A = \Lambda Y$$
It is possible to choose the left eigenvectors $\vec{y}_i$
and
right eigenvectors $\vec{x}_j$
so that

$$\vec{y}_i^T \vec{x}_j = \delta_{ij}$$

This is easily done if we define $Y = X^{-1}$ and define the
components of the left
eigenvectors to be the elements of the respective rows of $X^{-1}$.
Beginning with

$$A X = X \Lambda$$

and multiplying both sides
on the left by $X^{-1}$, we obtain

$$X^{-1} A X = \Lambda$$

and multiplying on the right by $X^{-1}$, we have

$$X^{-1} A = \Lambda X^{-1}$$

which implies that any row of $X^{-1}$ satisfies the properties of a
left eigenvector.
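This construction can be sketched numerically (assuming NumPy and the same hypothetical example matrix): the rows of $X^{-1}$ are left eigenvectors, and they are bi-orthogonal to the columns of $X$.

```python
# A sketch, assuming NumPy: the rows of X^{-1} are left eigenvectors of A.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals, X = np.linalg.eig(A)
Y = np.linalg.inv(X)            # rows of Y are left eigenvectors

# Each row y_i satisfies y_i^T A = lambda_i y_i^T.
for i, lam in enumerate(eigvals):
    y = Y[i]
    assert np.allclose(y @ A, lam * y)

# Bi-orthogonality: y_i . x_j = delta_ij, i.e. Y X = I.
assert np.allclose(Y @ X, np.eye(2))
```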
Diagonalization of a Matrix
Given an $n \times n$ matrix $A$, we say that $A$ is diagonalizable if
there is a matrix $X$ so that

$$X^{-1} A X = \Lambda$$

where $\Lambda$ is a diagonal matrix.
It is clear from the above discussions that if all the eigenvalues are
real and distinct, then we can use the matrix of right eigenvectors
as $X$.
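The diagonalization can be verified directly (a sketch assuming NumPy and a hypothetical matrix with real, distinct eigenvalues):

```python
# A sketch, assuming NumPy: diagonalize a matrix with real, distinct
# eigenvalues using its matrix of right eigenvectors.
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])     # eigenvalues 5 and 2: real and distinct

eigvals, X = np.linalg.eig(A)

# X^{-1} A X is the diagonal matrix of eigenvalues.
D = np.linalg.inv(X) @ A @ X
assert np.allclose(D, np.diag(eigvals))
```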
Ken Joy
2000-11-28