

5.2 Orthogonal Complements and Projections


Orthogonal complements

Definition. Let W be a subspace of R^n and let x ∈ R^n.

(a) x is orthogonal to W (written x ⊥ W) if x · w = 0 for every w ∈ W.

(b) W⊥ = the orthogonal complement of W = the set of all vectors in R^n that are orthogonal to W:

    W⊥ = { x ∈ R^n : x ⊥ W } = { x ∈ R^n : x · w = 0 for every w ∈ W }.

Theorem (5.9). Let W be a subspace of R^n. Then:

(a) W⊥ is also a subspace of R^n.
(b) W ∩ W⊥ = {0}.
(c) If W = span(w1, w2, ..., wk), then W⊥ = { v ∈ R^n : v · wi = 0 for i = 1, 2, ..., k }.

Proof.

(a) W⊥ is not empty: 0 · w = 0 for every w ∈ W, so 0 ∈ W⊥.

W⊥ is closed under addition: if x, y ∈ W⊥, then x · w = 0 and y · w = 0 for every w ∈ W, so

    (x + y) · w = x · w + y · w = 0 + 0 = 0,

and hence x + y ∈ W⊥.

W⊥ is closed under scalar multiplication: if k ∈ R and x ∈ W⊥, then for every w ∈ W,

    (kx) · w = k (x · w) = k · 0 = 0,

and hence kx ∈ W⊥.

(b) The trivial subspace {0} is contained in every subspace of R^n, so 0 ∈ W and 0 ∈ W⊥, giving {0} ⊆ W ∩ W⊥. Conversely, if x ∈ W ∩ W⊥, then x ∈ W⊥ is orthogonal to every vector of W, in particular to x itself; thus x · x = 0 and therefore x = 0. Hence W ∩ W⊥ = {0}.

(c) Suppose W = span(w1, w2, ..., wk). If v ∈ W⊥, then v · w = 0 for every w ∈ W; in particular v · wi = 0 for i = 1, 2, ..., k. Conversely, if v · wi = 0 for each i, then for any w = c1 w1 + c2 w2 + ... + ck wk in W,

    v · w = c1 (v · w1) + c2 (v · w2) + ... + ck (v · wk) = 0,

so v ∈ W⊥. Therefore W⊥ = { v ∈ R^n : v · wi = 0 for i = 1, 2, ..., k }. ∎
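Part (c) is what makes W⊥ computable in practice: membership can be tested with finitely many dot products. A minimal sketch in Python, assuming NumPy is available (the function name is my own, not from the text):

    import numpy as np

    def in_orthogonal_complement(v, spanning_vectors, tol=1e-12):
        """Test whether v ∈ W⊥, where W = span(spanning_vectors), via Theorem 5.9(c)."""
        return all(abs(np.dot(v, w)) < tol for w in spanning_vectors)

    # W = span{(1, 1, 0), (-1, 1, 1)}, a plane in R^3 (it reappears in a later example)
    W_span = [np.array([1, 1, 0]), np.array([-1, 1, 1])]
    print(in_orthogonal_complement(np.array([1, -1, 2]), W_span))  # True
    print(in_orthogonal_complement(np.array([1, 0, 0]), W_span))   # False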

Notation. Let A be an m × n matrix. Then:

(a) RS(A) = row(A) = the row space of A.
(b) CS(A) = col(A) = the column space of A.
(c) NS(A) = null(A) = the null space of A.

Theorem (5.10). Let A be an m × n matrix. Then:

(a) (RS A)⊥ = NS(A).
(b) (CS A)⊥ = NS(A^T).

Proof.

(a) x ∈ (RS A)⊥ ⟺ x is orthogonal to every vector of RS(A) ⟺ x is orthogonal to every row of A (by Theorem 5.9(c), since the rows span RS(A)) ⟺ Ax = 0 ⟺ x ∈ NS(A).

(b) (CS A)⊥ = (RS A^T)⊥ = NS(A^T), by part (a). ∎
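Theorem 5.10 is easy to spot-check numerically, assuming NumPy and SciPy: null_space returns a basis of the null space, and every row (respectively column) of A must be orthogonal to it.

    import numpy as np
    from scipy.linalg import null_space

    rng = np.random.default_rng(0)
    A = rng.integers(-5, 5, size=(4, 6)).astype(float)

    N = null_space(A)      # columns form a basis of NS(A)
    NT = null_space(A.T)   # columns form a basis of NS(A^T)

    # (RS A)⊥ = NS(A): every row of A is orthogonal to every null-space vector
    assert np.allclose(A @ N, 0)
    # (CS A)⊥ = NS(A^T): every column of A is orthogonal to NS(A^T)
    assert np.allclose(A.T @ NT, 0)
    # the dimensions also agree: dim NS(A) = n - rank(A) = dim (RS A)⊥
    assert N.shape[1] == A.shape[1] - np.linalg.matrix_rank(A)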

Example 1. Given that

    A = [ 1   1   3   1   6 ]        A^T = [ 1   2   3   4 ]
        [ 2  -1   0   1  -1 ]              [ 1  -1  -2   1 ]
        [ 3  -2  -1   2  -1 ]              [ 3   0  -1   6 ]
        [ 4   1   6   1   3 ]              [ 1   1   2   1 ]
                                           [ 6  -1  -1   3 ]

with

    RREF(A) = [ 1  0  1  0  -1 ]     RREF(A^T) = [ 1  0  0   1 ]
              [ 0  1  2  0   3 ]                 [ 0  1  0   6 ]
              [ 0  0  0  1   4 ]                 [ 0  0  1  -3 ]
              [ 0  0  0  0   0 ]                 [ 0  0  0   0 ]
                                                 [ 0  0  0   0 ]

a. Determine the dependency equation(s) of the columns of A.

Solution. Row operations preserve the dependency relations among columns, so linear dependence or independence among the columns of RREF(A) determines the corresponding dependence or independence among the columns of A. All the columns of RREF(A) containing leading 1s are linearly independent; that is, c1, c2 and c4 of RREF(A) are linearly independent. The columns c3 and c5 are linearly dependent on c1, c2 and c4 as follows:

    c3 = 1 c1 + 2 c2 + 0 c4
    c5 = -1 c1 + 3 c2 + 4 c4.

Therefore the column dependency equations in A are:

    col3(A) = col1(A) + 2 col2(A)
    col5(A) = -col1(A) + 3 col2(A) + 4 col4(A).

Check:

    col1(A) + 2 col2(A) = (1, 2, 3, 4) + 2 (1, -1, -2, 1) = (3, 0, -1, 6) = col3(A).

    -col1(A) + 3 col2(A) + 4 col4(A) = -(1, 2, 3, 4) + 3 (1, -1, -2, 1) + 4 (1, 1, 2, 1)
                                     = (6, -1, -1, 3) = col5(A).

b. Determine the dependency equation(s) of the rows of A.

Solution. In RREF(A^T) we have c4 = c1 + 6 c2 - 3 c3, so

    col4(A^T) = col1(A^T) + 6 col2(A^T) - 3 col3(A^T),

that is,

    row4(A) = row1(A) + 6 row2(A) - 3 row3(A).

Check:

    row1(A) + 6 row2(A) - 3 row3(A) = (1, 1, 3, 1, 6) + 6 (2, -1, 0, 1, -1) - 3 (3, -2, -1, 2, -1)
                                    = (1, 1, 3, 1, 6) + (12, -6, 0, 6, -6) - (9, -6, -3, 6, -3)
                                    = (4, 1, 6, 1, 3) = row4(A).
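The dependency equations of parts a and b can also be read off mechanically, assuming SymPy: rref() reports the pivot columns, and each non-pivot column of the reduced form lists the coefficients of its dependency equation.

    from sympy import Matrix

    A = Matrix([[1, 1, 3, 1, 6],
                [2, -1, 0, 1, -1],
                [3, -2, -1, 2, -1],
                [4, 1, 6, 1, 3]])

    R, pivots = A.rref()
    print(pivots)       # (0, 1, 3): pivots in columns 1, 2, 4 (0-indexed)
    print(R.col(2).T)   # [1, 2, 0, 0]:  col3(A) = col1(A) + 2 col2(A)
    print(R.col(4).T)   # [-1, 3, 4, 0]: col5(A) = -col1(A) + 3 col2(A) + 4 col4(A)

    RT, _ = A.T.rref()
    print(RT.col(3).T)  # [1, 6, -3, 0, 0]: row4(A) = row1(A) + 6 row2(A) - 3 row3(A)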

c. Determine a basis in RREF(A) for RS(A).

Solution. RS(A) = RS(RREF(A)), so rows 1, 2 and 3 of RREF(A) (the nonzero rows) form a basis for RS(A) in RREF(A).

d. Determine a basis in A for RS(A).

Solution. Columns 1, 2 and 3 of RREF(A^T) contain the leading 1s, so col1(A^T), col2(A^T), col3(A^T) are linearly independent and form a basis for CS(A^T) in A^T. Hence row1(A), row2(A), row3(A) form a basis for RS(A) in A, since RS(A) = CS(A^T).

e. Determine a basis in A for CS(A).

Solution. The leading 1s of RREF(A) lie in columns 1, 2 and 4, so col1(A), col2(A), col4(A) form a basis for CS(A) in A.

f. Determine a basis in RREF(A^T) for CS(A).

Solution. CS(A) = RS(A^T) = RS(RREF(A^T)), so rows 1, 2 and 3 of RREF(A^T) form a basis for CS(A) in RREF(A^T).

g. Determine a basis for NS(A).

Solution. From RREF(A), the variables x3 and x5 are free, with x1 = -x3 + x5, x2 = -2x3 - 3x5 and x4 = -4x5. Hence

    NS(A) = { (-x3 + x5, -2x3 - 3x5, x3, -4x5, x5) : x3, x5 ∈ R }
          = span( (-1, -2, 1, 0, 0), (1, -3, 0, -4, 1) ).

h. Determine a basis for NS(A^T).

Solution. From RREF(A^T), the variable x4 is free, with x1 = -x4, x2 = -6x4 and x3 = 3x4. Hence

    NS(A^T) = { (-x4, -6x4, 3x4, x4) : x4 ∈ R } = span( (-1, -6, 3, 1) ).

i. Show that (RS A)⊥ = NS(A).

Solution. It is enough to show that each basis vector of RS(A) is orthogonal to each basis vector of NS(A). Using the basis of RS(A) in RREF(A):

    (1, 0, 1, 0, -1) · (-1, -2, 1, 0, 0) = -1 + 1 = 0,    (1, 0, 1, 0, -1) · (1, -3, 0, -4, 1) = 1 - 1 = 0,
    (0, 1, 2, 0, 3) · (-1, -2, 1, 0, 0) = -2 + 2 = 0,     (0, 1, 2, 0, 3) · (1, -3, 0, -4, 1) = -3 + 3 = 0,
    (0, 0, 0, 1, 4) · (-1, -2, 1, 0, 0) = 0,              (0, 0, 0, 1, 4) · (1, -3, 0, -4, 1) = -4 + 4 = 0;

or, using the basis of RS(A) in A:

    row1(A) · (-1, -2, 1, 0, 0) = -1 - 2 + 3 = 0,         row1(A) · (1, -3, 0, -4, 1) = 1 - 3 - 4 + 6 = 0,
    row2(A) · (-1, -2, 1, 0, 0) = -2 + 2 + 0 = 0,         row2(A) · (1, -3, 0, -4, 1) = 2 + 3 - 4 - 1 = 0,
    row3(A) · (-1, -2, 1, 0, 0) = -3 + 4 - 1 = 0,         row3(A) · (1, -3, 0, -4, 1) = 3 + 6 - 8 - 1 = 0.

j. Show that (CS A)⊥ = NS(A^T).

Solution. It is enough to show that each basis vector of CS(A) is orthogonal to the basis vector of NS(A^T). Using the basis of CS(A) in RREF(A^T):

    (1, 0, 0, 1) · (-1, -6, 3, 1) = -1 + 1 = 0,
    (0, 1, 0, 6) · (-1, -6, 3, 1) = -6 + 6 = 0,
    (0, 0, 1, -3) · (-1, -6, 3, 1) = 3 - 3 = 0;

or, using the basis of CS(A) in A:

    col1(A) · (-1, -6, 3, 1) = -1 - 12 + 9 + 4 = 0,
    col2(A) · (-1, -6, 3, 1) = -1 + 6 - 6 + 1 = 0,
    col4(A) · (-1, -6, 3, 1) = -1 - 6 + 6 + 1 = 0.
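Parts g through j, rechecked with exact arithmetic, assuming SymPy:

    from sympy import Matrix

    A = Matrix([[1, 1, 3, 1, 6],
                [2, -1, 0, 1, -1],
                [3, -2, -1, 2, -1],
                [4, 1, 6, 1, 3]])

    NS = A.nullspace()     # part g: (-1, -2, 1, 0, 0) and (1, -3, 0, -4, 1)
    NST = A.T.nullspace()  # part h: (-1, -6, 3, 1)

    # part i: every row of A is orthogonal to every basis vector of NS(A)
    assert all(A.row(i).dot(x) == 0 for i in range(4) for x in NS)
    # part j: every column of A is orthogonal to the basis vector of NS(A^T)
    assert all(A.col(j).dot(y) == 0 for j in range(5) for y in NST)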

Example 2. Let W be the subspace of R^5 spanned by

    w1 = (1, -3, 5, 0, 5),   w2 = (-1, 1, 2, -2, 3),   w3 = (0, -1, 4, -1, 5).

Find a basis for W⊥.

Solution. Let

    A = [ w1  w2  w3 ] = [  1  -1   0 ]
                         [ -3   1  -1 ]
                         [  5   2   4 ]
                         [  0  -2  -1 ]
                         [  5   3   5 ]

Now W = CS(A) = RS(A^T), so

    W⊥ = (CS A)⊥ = NS(A^T) = NS(RREF(A^T)).

Therefore,

    A^T = [  1  -3  5   0  5 ]
          [ -1   1  2  -2  3 ]
          [  0  -1  4  -1  5 ]

so that

    RREF(A^T) = [ 1  0  0  3  4 ]
                [ 0  1  0  1  3 ]
                [ 0  0  1  0  2 ]

Here x4 and x5 are free, with x1 = -3x4 - 4x5, x2 = -x4 - 3x5 and x3 = -2x5, so

    W⊥ = { (-3x4 - 4x5, -x4 - 3x5, -2x5, x4, x5) : x4, x5 ∈ R }
        = span( (-3, -1, 0, 1, 0), (-4, -3, -2, 0, 1) ).
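The same computation, assuming SymPy:

    from sympy import Matrix

    A = Matrix([[1, -1, 0],
                [-3, 1, -1],
                [5, 2, 4],
                [0, -2, -1],
                [5, 3, 5]])     # columns are w1, w2, w3

    print(A.T.rref()[0])    # the RREF(A^T) displayed above
    print(A.T.nullspace())  # [(-3, -1, 0, 1, 0), (-4, -3, -2, 0, 1)], a basis for W⊥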

Orthogonal projections

Definition. Let v and u ≠ 0 be vectors in R^n. Then:

the component of v parallel to u = the projection of v onto u

    = proj_u(v) = ( (v · u) / (u · u) ) u,

and the component of v orthogonal to u

    = perp_u(v) = v - proj_u(v).

Remark. Since v = proj_u(v) + perp_u(v), it follows that perp_u(v) = v - proj_u(v). If W = span(u), then w = proj_u(v) ∈ W and w⊥ = perp_u(v) ∈ W⊥. Therefore there is a decomposition of v into the sum v = w + w⊥ such that w ∈ W and w⊥ ∈ W⊥.

Definition. Let S = { u1, u2, ..., uk } be an orthogonal basis for the subspace W of R^n. For any v in R^n,

the component of v in W

    = proj_W(v) = ( (v · u1) / (u1 · u1) ) u1 + ( (v · u2) / (u2 · u2) ) u2 + ... + ( (v · uk) / (uk · uk) ) uk
                = proj_u1(v) + proj_u2(v) + ... + proj_uk(v),

and the component of v orthogonal to W

    = perp_W(v) = proj_W⊥(v) = v - proj_W(v).
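In code, assuming NumPy (the function names are mine, not the text's):

    import numpy as np

    def proj(u, v):
        """proj_u(v): the component of v parallel to the nonzero vector u."""
        return (np.dot(v, u) / np.dot(u, u)) * u

    def proj_W(basis, v):
        """proj_W(v) for an ORTHOGONAL basis of W; the formula fails otherwise."""
        return sum(proj(u, v) for u in basis)

    def perp_W(basis, v):
        """perp_W(v): the component of v orthogonal to W."""
        return v - proj_W(basis, v)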

Example (1.) Let

    P = { (x, y, z) : x - y + 2z = 0 },

a plane in R^3, and let v = (3, -1, 2). Find the orthogonal projection of v onto P and the component of v orthogonal to P.

Solution (1.) A normal vector to P is n = (1, -1, 2). Therefore

the component of v orthogonal to P

    = perp_P(v) = proj_n(v) = ( (v · n) / (n · n) ) n = ( (3 + 1 + 4) / 6 ) (1, -1, 2)
    = (4/3) (1, -1, 2) = (4/3, -4/3, 8/3),

and the component of v in P

    = proj_P(v) = v - perp_P(v) = (3, -1, 2) - (4/3, -4/3, 8/3) = (5/3, 1/3, -2/3).

Solution (2.) The vectors u1 = (1, 1, 0) and u2 = (-1, 1, 1) both satisfy x - y + 2z = 0, and u1 · u2 = -1 + 1 + 0 = 0, so P = span(u1, u2); hence P has an orthogonal basis { u1, u2 } (see Example 5.3, pages 367-368). Therefore

the component of v in P

    = proj_P(v) = proj_u1(v) + proj_u2(v) = ( (v · u1) / (u1 · u1) ) u1 + ( (v · u2) / (u2 · u2) ) u2
    = ( (3 - 1 + 0) / 2 ) (1, 1, 0) + ( (-3 - 1 + 2) / 3 ) (-1, 1, 1) = (1, 1, 0) + (-2/3) (-1, 1, 1)
    = (5/3, 1/3, -2/3),

and the component of v orthogonal to P

    = perp_P(v) = v - proj_P(v) = (3, -1, 2) - (5/3, 1/3, -2/3) = (4/3, -4/3, 8/3).

Check:

    w + w⊥ = proj_P(v) + perp_P(v) = (5/3, 1/3, -2/3) + (4/3, -4/3, 8/3) = (3, -1, 2) = v.
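Both solutions, repeated numerically with the helper functions defined after the projection definition above:

    v = np.array([3.0, -1.0, 2.0])
    n = np.array([1.0, -1.0, 2.0])                                  # normal of P
    u1, u2 = np.array([1.0, 1.0, 0.0]), np.array([-1.0, 1.0, 1.0])  # orthogonal basis of P

    w_perp = proj(n, v)          # Solution (1): perp_P(v) = proj_n(v)
    w = proj_W([u1, u2], v)      # Solution (2): proj_P(v)
    print(w)                     # ≈ [ 1.6667  0.3333 -0.6667] = (5/3, 1/3, -2/3)
    print(w_perp)                # ≈ [ 1.3333 -1.3333  2.6667] = (4/3, -4/3, 8/3)
    assert np.allclose(w + w_perp, v)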

Theorem (5.11) (The Orthogonal Decomposition Theorem). Let W be a subspace of R^n and let v ∈ R^n. Then there are unique vectors w in W and w⊥ in W⊥ such that v = w + w⊥.

Proof.

(a) Existence. We must produce w ∈ W and w⊥ ∈ W⊥ with v = w + w⊥. Let S = { u1, u2, ..., uk } be an orthogonal basis for the subspace W, and let

    w = proj_W(v) = ( (v · u1) / (u1 · u1) ) u1 + ... + ( (v · uk) / (uk · uk) ) uk = Σ_{i=1..k} ( (v · ui) / (ui · ui) ) ui

and

    w⊥ = perp_W(v) = v - proj_W(v).

Then w + w⊥ = proj_W(v) + ( v - proj_W(v) ) = v.

w ∈ W: w = proj_W(v) is a linear combination of u1, u2, ..., uk, so w ∈ span(S) = W.

w⊥ ∈ W⊥: for each i,

    w⊥ · ui = ( v - proj_W(v) ) · ui
            = v · ui - Σ_{j=1..k} ( (v · uj) / (uj · uj) ) (uj · ui)
            = v · ui - ( (v · ui) / (ui · ui) ) (ui · ui)        [since ui · uj = 0 for i ≠ j, S being orthogonal]
            = v · ui - v · ui = 0,

which implies that w⊥ is orthogonal to every vector of S, and hence w⊥ ∈ (span S)⊥ = W⊥ by Theorem 5.9(c).

(b) Uniqueness. Suppose v = a + b is any such decomposition, with a ∈ W and b ∈ W⊥. Then

    w + w⊥ = a + b,   so   w - a = b - w⊥,

where w - a ∈ W and b - w⊥ ∈ W⊥. Since W ∩ W⊥ = {0} by Theorem 5.9(b),

    w - a = b - w⊥ = 0,

and therefore a = w and b = w⊥. ∎
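A sketch of the decomposition in code, assuming SymPy; GramSchmidt (Gram-Schmidt is Section 5.3 material) supplies the orthogonal basis that the proof requires, here for the W of Example 2:

    from sympy import Matrix, GramSchmidt

    vecs = [Matrix([1, -3, 5, 0, 5]),
            Matrix([-1, 1, 2, -2, 3]),
            Matrix([0, -1, 4, -1, 5])]
    U = GramSchmidt(vecs)          # orthogonal basis S = {u1, u2, u3} for W

    v = Matrix([1, 2, 3, 4, 5])    # an arbitrary v in R^5
    w = Matrix.zeros(5, 1)         # w = proj_W(v) = Σ ((v·ui)/(ui·ui)) ui
    for u in U:
        w += (v.dot(u) / u.dot(u)) * u
    w_perp = v - w                 # w⊥ = perp_W(v)

    assert w + w_perp == v                       # v = w + w⊥
    assert all(w_perp.dot(u) == 0 for u in U)    # w⊥ ∈ W⊥, by Theorem 5.9(c)

Theorem (5.12)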
