
Math 206, Spring 2016

Assignment 6 Solutions

Due: March 4, 2016

Part A.
(1) Compute the matrix associated to T^{-1}, where T is the linear transformation associated to
\[
A = \begin{pmatrix} 1 & 0 & 1 \\ 1 & 1 & 1 \\ -1 & 1 & 1 \end{pmatrix}.
\]

Solution. According to a theorem in class, to compute A^{-1} we can augment A by I and then row-reduce. After the row reduction, the coefficient side should transform to the identity matrix, and the augmented side will return A^{-1}. Hence we compute

\[
\left(\begin{array}{ccc|ccc}
1 & 0 & 1 & 1 & 0 & 0 \\
1 & 1 & 1 & 0 & 1 & 0 \\
-1 & 1 & 1 & 0 & 0 & 1
\end{array}\right)
\xrightarrow{\substack{-\rho_1+\rho_2 \\ \rho_1+\rho_3}}
\left(\begin{array}{ccc|ccc}
1 & 0 & 1 & 1 & 0 & 0 \\
0 & 1 & 0 & -1 & 1 & 0 \\
0 & 1 & 2 & 1 & 0 & 1
\end{array}\right)
\xrightarrow{-\rho_2+\rho_3}
\left(\begin{array}{ccc|ccc}
1 & 0 & 1 & 1 & 0 & 0 \\
0 & 1 & 0 & -1 & 1 & 0 \\
0 & 0 & 2 & 2 & -1 & 1
\end{array}\right)
\]
\[
\xrightarrow{\frac{1}{2}\rho_3}
\left(\begin{array}{ccc|ccc}
1 & 0 & 1 & 1 & 0 & 0 \\
0 & 1 & 0 & -1 & 1 & 0 \\
0 & 0 & 1 & 1 & -1/2 & 1/2
\end{array}\right)
\xrightarrow{-\rho_3+\rho_1}
\left(\begin{array}{ccc|ccc}
1 & 0 & 0 & 0 & 1/2 & -1/2 \\
0 & 1 & 0 & -1 & 1 & 0 \\
0 & 0 & 1 & 1 & -1/2 & 1/2
\end{array}\right).
\]

Hence we conclude that the matrix associated to T^{-1} is
\[
A^{-1} = \begin{pmatrix} 0 & 1/2 & -1/2 \\ -1 & 1 & 0 \\ 1 & -1/2 & 1/2 \end{pmatrix}.
\]
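[Note: not part of the assignment, but a quick way to double-check a hand computation like this is to verify the products A A^{-1} and A^{-1} A numerically. The minimal sketch below assumes NumPy is available and simply re-enters the matrices written above.]

    import numpy as np

    # The matrix A from part (1) and the inverse obtained by row reduction.
    A = np.array([[1.0, 0.0, 1.0],
                  [1.0, 1.0, 1.0],
                  [-1.0, 1.0, 1.0]])
    A_inv = np.array([[0.0, 0.5, -0.5],
                      [-1.0, 1.0, 0.0],
                      [1.0, -0.5, 0.5]])

    # Both products should be the 3x3 identity matrix.
    print(np.allclose(A @ A_inv, np.eye(3)))      # True
    print(np.allclose(A_inv @ A, np.eye(3)))      # True

    # NumPy's built-in inverse should agree with the hand computation.
    print(np.allclose(np.linalg.inv(A), A_inv))   # True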

(2) Let A ∈ R^{n×n} be given. If there exists B ∈ R^{n×n} that satisfies
AB = BA = I,
then we say that A is invertible, and that B is an inverse of A. Prove that if A is invertible, then its
inverse is unique. In other words, prove that if there exist matrices B, C ∈ R^{n×n} so that
AB = BA = I

and

AC = CA = I,

then B = C. (You may use the fact that if X ∈ R^{n×n} is given, then IX = XI = X, and the fact that
matrix multiplication is associative.)
[It's worth pointing out that since we know the inverse of an invertible matrix is unique, we can
use notation like A^{-1} to (unambiguously) denote this inverse. If there were more than one inverse for a
matrix A, then the notation A^{-1} wouldn't be specific enough to be useful.]


Solution. There are a few ways to prove this. Here's a purely algebraic way that revolves around the
rules that matrix multiplication obeys. We have
B = BI          (by our given fact)
  = B(AC)       (since we assume I = AC)
  = (BA)C       (by associativity)
  = IC          (since we assume BA = I)
  = C           (by our given fact).

There are other ways to prove this as well. Here's a proof that one of your classmates came up
with which is absolutely beautiful. First, note that since AB = I = AC, we have that AB = AC.
This means the two matrices AB and AC agree column-by-column: for each 1 ≤ i ≤ n, we have
Col_i(AB) = Col_i(AC). Now by the definition of matrix multiplication, this becomes
A Col_i(B) = Col_i(AB) = Col_i(AC) = A Col_i(C).
Now observe that the transformation T : R^n → R^n associated to A is injective. Hence if A Col_i(B) = A Col_i(C), then it
must be the case that Col_i(B) = Col_i(C). But this means that the matrices B and C agree column-by-column, and hence we have B = C. [This proof is awesome because it's not really using the fact
that B and C act as multiplicative inverses, but merely the fact that they give the same product when
multiplied on the left by A. Hence it's really a proof for a stronger result: if A ∈ R^{n×n} corresponds to
an invertible transformation and B, C ∈ R^{n×n} and AB = AC, then B = C. So multiplying on the left
by A is not only injective on vectors, it's actually injective on matrices as well!]
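[Again not part of the assignment: the stronger statement is easy to see in action numerically. The sketch below assumes NumPy; it recovers B from A and the product AB alone by solving AX = AB column-by-column, which is exactly what the injectivity argument above says is possible. The particular B used here is an arbitrary choice for illustration.]

    import numpy as np

    # An invertible A (the matrix from part (1)) and an arbitrary B.
    A = np.array([[1.0, 0.0, 1.0],
                  [1.0, 1.0, 1.0],
                  [-1.0, 1.0, 1.0]])
    B = np.array([[2.0, 0.0, 1.0],
                  [3.0, 1.0, 4.0],
                  [1.0, 5.0, 9.0]])

    # Knowing only A and the product AB, solve A X = AB for X.
    # np.linalg.solve handles the matrix right-hand side column-by-column.
    AB = A @ B
    X = np.linalg.solve(A, AB)

    # Since left-multiplication by the invertible A is injective, X must equal B.
    print(np.allclose(X, B))   # True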

(3) Suppose that A, B ∈ R^{n×n} are invertible. Prove that AB is invertible, and that its inverse is B^{-1}A^{-1}.
[Note: this is called the socks and shoes theorem. It's really useful, and is something that you should
memorize.]
Solution. We can prove that AB is invertible by checking that it has a multiplicative inverse; fortunately, the theorem statement tells us precisely what we should suspect is its inverse, so now we only
have to carry through the calculation. Hence we only need to check that (AB)(B^{-1}A^{-1}) = I and that
(B^{-1}A^{-1})(AB) = I. Observe
(AB)(B^{-1}A^{-1}) = A(B(B^{-1}A^{-1}))     (associativity of matrix multiplication)
                   = A((BB^{-1})A^{-1})     (associativity of matrix multiplication)
                   = A(IA^{-1})             (since B^{-1} is the inverse of B)
                   = AA^{-1}                (since IX = XI = X for all X ∈ R^{n×n})
                   = I                      (since A^{-1} is the inverse of A).
For the other product, we get
(B^{-1}A^{-1})(AB) = ((B^{-1}A^{-1})A)B     (associativity of matrix multiplication)
                   = (B^{-1}(A^{-1}A))B     (associativity of matrix multiplication)
                   = (B^{-1}I)B             (since A^{-1} is the inverse of A)
                   = B^{-1}B                (since IX = XI = X for all X ∈ R^{n×n})
                   = I                      (since B^{-1} is the inverse of B).

[Note: instead of performing the last calculation, we could use the theorem from class that says for a
square matrix, a right inverse must also be a left inverse.]
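[One more optional numerical illustration, assuming NumPy: for generic random matrices A and B, the inverse of AB computed directly agrees with B^{-1}A^{-1}.]

    import numpy as np

    # Random 4x4 matrices are invertible with probability 1.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    B = rng.standard_normal((4, 4))

    lhs = np.linalg.inv(A @ B)                   # (AB)^{-1} computed directly
    rhs = np.linalg.inv(B) @ np.linalg.inv(A)    # B^{-1} A^{-1}

    print(np.allclose(lhs, rhs))                    # True
    print(np.allclose((A @ B) @ rhs, np.eye(4)))    # True: rhs is a right inverse of AB
    print(np.allclose(rhs @ (A @ B), np.eye(4)))    # True: and a left inverse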

