
Math 308 List of Definitions

W.R. Casper
December 7, 2013

Introduction

The time of the final is nigh (i.e., soon, imminent, close; not to be confused
with neigh, which is what a horse says; also not to be confused with whatever
it is a fox says). For this final exam, you are expected to know the definitions
very precisely. It's important to note that knowing the definition is not
the same as knowing something equivalent to the definition. For example,
consider the following two statements:
(a) An eigenvalue of A is a number λ such that the homogeneous system of
equations (A − λI)x = 0 has a nontrivial solution.
(b) An eigenvalue of A is a number λ such that det(A − λI) = 0.
Both of these statements are true: the first is how we defined an eigenvalue.
The second is equivalent to the first, but the fact that it is equivalent is a
theorem, not a definition. Therefore statement (a) is what one should write
down for the definition of an eigenvalue. While reading the following list of
definitions, be sure to keep this in mind. Also note that this list contains
only the new definitions since the midterm exam. For the definitions prior
to the midterm, please consult the midterm review sheet.

List of Definitions

Definition 1. Let A be an n × n matrix. The rank of A is the dimension of
the range R(A).

Definition 2. Let A be an n × n matrix. The nullity of A is the dimension
of the nullspace N(A).
Definition 3. The inner product of two vectors v, w in Rⁿ is defined to be

    v · w = vᵀw = v₁w₁ + v₂w₂ + ··· + vₙwₙ.

This is also sometimes called the dot product of v and w, and may also be
denoted as ⟨v, w⟩.
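As a quick numerical sanity check (not part of the course material), the formula can be evaluated directly; this is a minimal plain-Python sketch, with arbitrary example vectors:

```python
# Inner (dot) product of two vectors in R^n: the sum of v_i * w_i.
def dot(v, w):
    assert len(v) == len(w), "vectors must live in the same R^n"
    return sum(vi * wi for vi, wi in zip(v, w))

v = [1, 2, 3]    # example vectors (arbitrary choices)
w = [4, -1, 2]
print(dot(v, w))  # 1*4 + 2*(-1) + 3*2 = 8
```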

Definition 4. The norm ∥v∥ of a vector v in Rⁿ is defined by ∥v∥ = √(v · v).
This is also called the magnitude of v.

Definition 5. A vector v is called a unit vector if ∥v∥ = 1.
Definition 6. We say that two vectors v, w in Rⁿ are orthogonal if their
inner product v · w = 0.
Definition 7. Let S = {v₁, . . . , vᵣ} be a set of vectors in Rⁿ. We say that
S is an orthogonal set if vᵢ · vⱼ = 0 for all i, j with i ≠ j. We say that S is
an orthonormal set if it is an orthogonal set and all the vectors in S are unit
vectors.
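Both conditions are easy to test mechanically; the following plain-Python sketch (with made-up example sets) checks pairwise orthogonality and unit norms:

```python
import math

def dot(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

def is_orthonormal(S, tol=1e-12):
    # Orthogonal set: v_i . v_j = 0 for i != j; orthonormal: additionally ||v_i|| = 1.
    for i in range(len(S)):
        for j in range(i + 1, len(S)):
            if abs(dot(S[i], S[j])) > tol:
                return False
    return all(abs(math.sqrt(dot(v, v)) - 1) <= tol for v in S)

# The standard basis vectors of R^3 form an orthonormal set.
e = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(is_orthonormal(e))                        # True
print(is_orthonormal([[1, 1, 0], [1, -1, 0]]))  # orthogonal, but not unit vectors: False
```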
Definition 8. Let V be a subspace of Rⁿ and B = {v₁, . . . , vᵣ} be a basis
for V. If B is also an orthogonal set, we call B an orthogonal basis for V. If
B is an orthonormal set, we call B an orthonormal basis for V.
Definition 9. Let V and W be subspaces of Rⁿ and Rᵐ, respectively. A
linear function f from V to W is a function from V to W that satisfies the
following two properties:
f(u + v) = f(u) + f(v) for every two vectors u, v in V;
f(cv) = cf(v) for every vector v in V and every constant c.
A linear function may also be called a linear map or linear transformation.
Definition 10. Suppose that V and W are subspaces of Rⁿ and Rᵐ, respectively,
and that f is a linear function from V to W. We define the null space
and range of f similarly to the way we defined them for matrices. We define
the null space N(f) of f to be

    N(f) = {v : v is in V and f(v) = 0}.

We also define the range R(f) of f to be

    R(f) = {w : w is in W and w = f(v) for some v in V}.
Definition 11. A linear transformation f is called an orthogonal transformation
if it preserves the norms of vectors. In other words, f is called orthogonal if

    ∥f(v)∥ = ∥v∥ for every vector v in its domain.
Definition 12. An n × n square matrix A is called an orthogonal matrix
if ∥Av∥ = ∥v∥ for all v in Rⁿ. An alternative definition of an orthogonal
matrix that is popular in the literature is a matrix A satisfying the equation
AᵀA = AAᵀ = I (in other words, an invertible matrix whose inverse is equal
to its transpose). These turn out to be equivalent definitions, and in this
case either will be acceptable for the final.
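Both characterizations can be checked numerically on a concrete example; 2 × 2 rotation matrices are a standard family of orthogonal matrices. A sketch, with angle and vector chosen arbitrarily:

```python
import math

theta = 0.7  # an arbitrary angle; every 2x2 rotation matrix is orthogonal
A = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def matvec(A, v):
    return [sum(aij * vj for aij, vj in zip(row, v)) for row in A]

def norm(v):
    return math.sqrt(sum(vi * vi for vi in v))

v = [3.0, -4.0]  # example vector with ||v|| = 5
print(abs(norm(matvec(A, v)) - norm(v)) < 1e-12)  # True: ||Av|| = ||v||

# Equivalent characterization: A^T A = I.
AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
print(all(abs(AtA[i][j] - (1 if i == j else 0)) < 1e-12
          for i in range(2) for j in range(2)))   # True
```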
Definition 13. A least-squares solution to Ax = b is a vector v such that
∥Av − b∥ ≤ ∥Aw − b∥ for all w in Rⁿ.
Definition 14. Let W be a subspace of Rⁿ, and v be a vector in Rⁿ (not
necessarily in W). A least-squares approximation to v is a vector u in W such
that

    ∥u − v∥ ≤ ∥w − v∥ for all w in W.
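One standard way to compute a least-squares solution (a theorem from the course, not the definition itself) is to solve the normal equations AᵀA x = Aᵀb. A minimal plain-Python sketch for a made-up overdetermined 3 × 2 system:

```python
# Find x1, x2 minimizing ||A x - b|| via the normal equations A^T A x = A^T b.
A = [[1, 0], [1, 1], [1, 2]]  # e.g. fitting a line b ~ x1 + x2 * t at t = 0, 1, 2
b = [1, 2, 2]

# Form A^T A (2x2) and A^T b (length-2 vector).
AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)] for i in range(2)]
Atb = [sum(A[k][i] * b[k] for k in range(3)) for i in range(2)]

# Solve the 2x2 system by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
x1 = (Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det
x2 = (AtA[0][0] * Atb[1] - Atb[0] * AtA[1][0]) / det
print(x1, x2)  # x1 = 7/6, x2 = 1/2
```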
Definition 15. Let A be an n × n matrix. An eigenvalue of A is a scalar λ
such that the linear homogeneous system of equations

    (A − λI)x = 0

has a nontrivial solution.
Definition 16. Let A be an n × n matrix, and λ an eigenvalue of A. An
eigenvector of A with eigenvalue λ is a non-zero vector v such that x = v is a
solution to (A − λI)x = 0. (Notice that 0 by definition is not an eigenvector!)
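For a 2 × 2 matrix the eigenvalues can be found by hand, since det(A − tI) = t² − tr(A)t + det(A) is just a quadratic. The following sketch, with an example matrix chosen for illustration, also verifies the definition directly:

```python
import math

# Eigenvalues of a 2x2 matrix from its characteristic polynomial
# det(A - t I) = t^2 - tr(A) t + det(A).
A = [[2, 1],
     [1, 2]]   # an example symmetric matrix

tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = tr * tr - 4 * det           # assumed nonnegative here (real eigenvalues)
lam1 = (tr + math.sqrt(disc)) / 2
lam2 = (tr - math.sqrt(disc)) / 2
print(lam1, lam2)                  # 3.0 1.0

# Check the definition: (A - lam1 I) v = 0 has the nontrivial solution v = (1, 1).
v = [1, 1]
Av = [A[0][0] * v[0] + A[0][1] * v[1], A[1][0] * v[0] + A[1][1] * v[1]]
print(Av == [lam1 * v[0], lam1 * v[1]])  # True: v is an eigenvector for lam1
```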

Definition 17. Let A be an n × n matrix. The r, sth minor matrix Mᵣₛ is the
(n − 1) × (n − 1) matrix formed by deleting the rth row and sth column of
A. The value Aᵣₛ = (−1)ʳ⁺ˢ det(Mᵣₛ) is called the r, sth cofactor of A.
Definition 18. Let A be an n × n matrix whose i, jth entry is aᵢⱼ, i.e.

    A = [ a₁₁ a₁₂ … a₁ₙ
          a₂₁ a₂₂ … a₂ₙ
           ⋮   ⋮  ⋱  ⋮
          aₙ₁ aₙ₂ … aₙₙ ].

We define the determinant of A recursively as follows. If A is a 1 × 1 matrix,
i.e. A = [a₁₁], then we define det(A) = a₁₁. More generally, we define

    det(A) = Σⱼ₌₁ⁿ a₁ⱼ A₁ⱼ = Σⱼ₌₁ⁿ (−1)ʲ⁺¹ a₁ⱼ det(M₁ⱼ),

where M₁ⱼ represents the 1, jth minor matrix of A, and A₁ⱼ = (−1)ʲ⁺¹ det(M₁ⱼ)
represents the 1, jth cofactor of A.
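This recursive definition translates directly into code; the following plain-Python sketch (fine for small n, with a made-up example matrix) mirrors the expansion along the first row:

```python
# Determinant by cofactor expansion along the first row,
# following the recursive definition (0-indexed, so the sign is (-1)**j).
def minor(A, r, s):
    # Delete row r and column s.
    return [row[:s] + row[s+1:] for i, row in enumerate(A) if i != r]

def det(A):
    n = len(A)
    if n == 1:
        return A[0][0]
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(n))

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]   # an example matrix (with 9 in the corner, det would be 0)
print(det(A))      # -3
```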
Definition 19. Let A be an n × n matrix. The characteristic polynomial of
A is a polynomial, denoted χ_A(t), and defined by

    χ_A(t) = det(A − tI).

The equation χ_A(t) = 0 is called the characteristic equation of A.
Definition 20. Let A be an n × n matrix, and λ an eigenvalue of A. The
characteristic polynomial χ_A(t) of A can be written in the factored form

    χ_A(t) = a(t − r₁)(t − r₂) · · · (t − rₙ)

for some values a, r₁, r₂, . . . , rₙ. The algebraic multiplicity of λ is the number
of rⱼ equal to λ in this factorization.
Definition 21. Let A be an n × n matrix and λ an eigenvalue of A. The
eigenspace of λ is denoted E_λ, and is defined to be the null space of A − λI.
(In other words, it is the set containing 0 and all the eigenvectors with
eigenvalue λ.)

Definition 22. Let A be an n × n matrix and λ an eigenvalue of A. The
geometric multiplicity of λ is defined to be the dimension of E_λ.

Definition 23. Let A be an n × n matrix. Then A is called defective if for
some eigenvalue λ of A, the algebraic multiplicity of λ is different from the
geometric multiplicity of λ.
Definition 24. An n × n matrix A is said to be similar to C if there exists an
invertible n × n matrix B such that C = BAB⁻¹.
Definition 25. Let A be an n × n matrix whose i, jth entry is aᵢⱼ, i.e.

    A = [ a₁₁ a₁₂ … a₁ₙ
          a₂₁ a₂₂ … a₂ₙ
           ⋮   ⋮  ⋱  ⋮
          aₙ₁ aₙ₂ … aₙₙ ].

Then A is called diagonal if aᵢⱼ = 0 whenever i ≠ j.


Definition 26. Let A be an n × n matrix. Then A is called diagonalizable
if A is similar to a diagonal matrix.
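A concrete illustration, with the matrix and its eigenvectors worked out by hand for this example: A = [[2, 1], [0, 3]] has eigenvalues 2 and 3 with eigenvectors (1, 0) and (1, 1), and conjugating by the matrix of eigenvectors produces a diagonal matrix.

```python
# Diagonalizing A = [[2, 1], [0, 3]] (eigenvalues 2 and 3,
# eigenvectors (1, 0) and (1, 1), found by hand for this example).
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[2, 1], [0, 3]]
P = [[1, 1], [0, 1]]      # columns are the eigenvectors
Pinv = [[1, -1], [0, 1]]  # inverse of P, computed by hand

D = matmul(Pinv, matmul(A, P))
print(D)  # [[2, 0], [0, 3]]: A is similar to a diagonal matrix, hence diagonalizable
```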
