| a  b |
| c  d |  =  ad - bc
There are many interesting but not obvious properties of determinants. It is true that
det A = a_{i1} C_{i1} + a_{i2} C_{i2} + ··· + a_{in} C_{in}

for any 1 ≤ i ≤ n. It is also true that det A = det A^T, so that we have

det A = a_{1j} C_{1j} + a_{2j} C_{2j} + ··· + a_{nj} C_{nj}

for any 1 ≤ j ≤ n.
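These row and column expansions can be checked with a short pure-Python sketch. The function name `det` and the list-of-lists matrix representation are illustrative assumptions, not notation from the text; the code simply applies cofactor expansion along the first row, recursively:

```python
def det(A):
    """Determinant by cofactor expansion along the first row.

    A is a square matrix given as a list of rows. The base case is a
    1x1 matrix [[a]], whose determinant is a.
    """
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor M_{1j}: delete row 0 and column j; the cofactor
        # carries the sign (-1)^(0+j).
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

# 2x2 check against the formula ad - bc:
assert det([[1, 2], [3, 4]]) == 1 * 4 - 2 * 3
```

Expanding along a different row or column would give the same value, which is exactly the property stated above.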
If A and B are matrices of the same order, then det AB = (det A)(det B) , and the determinant
of any identity matrix is 1. Perhaps the most important property of the determinant is the fact that a
matrix is invertible if and only if its determinant is not zero.
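The multiplicative property, the determinant of the identity, and the invertibility test can be illustrated for 2 × 2 matrices. The helper names `det2` and `matmul2` below are illustrative, not from the text:

```python
def det2(M):
    # Determinant of a 2x2 matrix [[a, b], [c, d]] is ad - bc.
    (a, b), (c, d) = M
    return a * d - b * c

def matmul2(X, Y):
    # Product of two 2x2 matrices.
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 6]]
assert det2(matmul2(A, B)) == det2(A) * det2(B)  # det AB = (det A)(det B)
assert det2([[1, 0], [0, 1]]) == 1               # det I = 1
assert det2(A) != 0  # nonzero determinant, so A is invertible
```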
197.8 Eigenvalues and Eigenvectors
If A is a square matrix, and Av = λv for a scalar λ and a nonzero vector v, then λ is an eigenvalue of A and v is an eigenvector of A that corresponds to λ. Any nonzero linear combination of eigenvectors corresponding to the same eigenvalue λ is also an eigenvector corresponding to λ.
The collection of all eigenvectors corresponding to a given eigenvalue is thus a subspace, called
an eigenspace of A. A collection of eigenvectors corresponding to different eigenvalues is
necessarily linearly independent. It follows that a matrix of order n can have at most n distinct
eigenvalues. In fact, the eigenvalues of A are the roots of the nth-degree polynomial
equation
det (A - λI) = 0
called the characteristic equation of A . (Eigenvalues and eigenvectors are frequently called
characteristic values and characteristic vectors.)
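For a 2 × 2 matrix the characteristic equation is a quadratic that can be solved directly. The sketch below assumes real eigenvalues (a nonnegative discriminant); the function name is illustrative:

```python
import math

def eigenvalues_2x2(A):
    """Eigenvalues of a 2x2 matrix from det(A - lambda*I) = 0.

    For A = [[a, b], [c, d]] the characteristic equation expands to
    lambda^2 - (a + d)*lambda + (ad - bc) = 0, solved here by the
    quadratic formula (assumes the discriminant is nonnegative).
    """
    (a, b), (c, d) = A
    trace = a + d
    det = a * d - b * c
    disc = math.sqrt(trace * trace - 4 * det)
    return (trace + disc) / 2, (trace - disc) / 2

# [[2, 1], [1, 2]] has trace 4 and determinant 3, hence eigenvalues 3 and 1.
lam1, lam2 = eigenvalues_2x2([[2, 1], [1, 2]])
```

One can verify directly that Av = λv here: for v = (1, 1), A v = (3, 3) = 3v.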
If the nth-order matrix A has a linearly independent collection of n eigenvectors, then A is said to have a full set of eigenvectors. In this case there is a set of eigenvectors of A that is a basis for R^n or, in the complex case, C^n. If A has n distinct eigenvalues, then, of course, A has a full set
of eigenvectors. If there are fewer than n distinct eigenvalues, then A may or may not have a full
set of eigenvectors. If there is a full set of eigenvectors, then
D = S^{-1} A S   or   A = S D S^{-1}
where D is a diagonal matrix with the eigenvalues of A on the diagonal, and S is a matrix whose
columns are the full set of eigenvectors. If A is symmetric, then A has n real eigenvalues (counted with multiplicity), and eigenvectors corresponding to distinct eigenvalues are orthogonal. There is thus an orthonormal collection of eigenvectors that spans R^n, and we have
A = Q D Q^T   and   D = Q^T A Q
where Q is a real orthogonal matrix and D is diagonal. For the complex case, if A is Hermitian, we
have
A = U D U^H   and   D = U^H A U
where U is a unitary matrix and D is a real diagonal matrix. (A Hermitian matrix likewise has n real eigenvalues, counted with multiplicity.)
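The real symmetric case can be checked numerically for a small example. Below, Q is built from the orthonormal eigenvectors (1, 1)/√2 and (1, -1)/√2 of the symmetric matrix [[2, 1], [1, 2]], whose eigenvalues are 3 and 1; the helper names are illustrative assumptions:

```python
import math

def matmul(X, Y):
    # Product of two square matrices given as lists of rows.
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(X):
    return [list(col) for col in zip(*X)]

s = 1 / math.sqrt(2)
A = [[2, 1], [1, 2]]            # symmetric, eigenvalues 3 and 1
Q = [[s, s], [s, -s]]           # columns are orthonormal eigenvectors
D = matmul(transpose(Q), matmul(A, Q))  # should be diag(3, 1)

assert abs(D[0][0] - 3) < 1e-12 and abs(D[1][1] - 1) < 1e-12
assert abs(D[0][1]) < 1e-12 and abs(D[1][0]) < 1e-12
```

Up to floating-point roundoff, Q^T A Q is the diagonal matrix of eigenvalues, as the identities above state.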
© 1998 by CRC PRESS LLC