By Cramer's rule, for the system $AX = B$ with

$$B = \begin{pmatrix} s_1 \\ s_2 \\ s_3 \end{pmatrix}, \quad X = \begin{pmatrix} x_1 \\ x_2 \\ x_3 \end{pmatrix}, \quad D = \begin{vmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \\ a_3 & b_3 & c_3 \end{vmatrix},$$

$$D_{x_1} = \begin{vmatrix} s_1 & b_1 & c_1 \\ s_2 & b_2 & c_2 \\ s_3 & b_3 & c_3 \end{vmatrix}, \quad D_{x_2} = \begin{vmatrix} a_1 & s_1 & c_1 \\ a_2 & s_2 & c_2 \\ a_3 & s_3 & c_3 \end{vmatrix}, \quad D_{x_3} = \begin{vmatrix} a_1 & b_1 & s_1 \\ a_2 & b_2 & s_2 \\ a_3 & b_3 & s_3 \end{vmatrix}.$$
For one solution, $D \neq 0$, and then $x_i = D_{x_i}/D$. For no solution, $D = 0$ and at least one of $D_{x_1}, D_{x_2}, D_{x_3} \neq 0$. For infinitely many solutions, $D_{x_1} = D_{x_2} = D_{x_3} = D = 0$.
This rule can be extended to $n \times n$ matrices. Therefore, for the given equation

$$\begin{pmatrix} k & 1 \\ 2 & k \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix},$$

there is one solution when

$$D = \begin{vmatrix} k & 1 \\ 2 & k \end{vmatrix} = k^2 - 2 \neq 0, \qquad \text{i.e. } k \neq \pm\sqrt{2}.$$

From the constant matrix it is easy to ascertain that infinitely many solutions are not possible for this case, and no solution is possible when $k = \pm\sqrt{2}$.
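The one-solution / no-solution / infinitely-many classification can be checked mechanically. A minimal pure-Python sketch, assuming the system is $\begin{pmatrix} k & 1 \\ 2 & k\end{pmatrix}(x, y)^T = (1, 1)^T$ as read off the problem (helper names are mine):

```python
def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def classify(k):
    """Classify the system [[k, 1], [2, k]] (x, y)^T = (1, 1)^T by Cramer's rule."""
    A = [[k, 1], [2, k]]
    s = [1, 1]
    D = det2(A)
    Dx = det2([[s[0], A[0][1]], [s[1], A[1][1]]])  # column 1 replaced by s
    Dy = det2([[A[0][0], s[0]], [A[1][0], s[1]]])  # column 2 replaced by s
    if D != 0:
        return ("one solution", (Dx / D, Dy / D))
    if Dx == 0 and Dy == 0:
        return ("infinitely many solutions", None)
    return ("no solution", None)

print(classify(3))  # D = 7, unique solution x = 2/7, y = 1/7
```

For $k = \pm\sqrt{2}$ the determinant vanishes only up to floating-point rounding, so exact arithmetic (e.g. `fractions.Fraction`) is the safer choice near the degenerate values.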
Problem 2
The rules of Problem 1 can be applied here, extended to an $n \times n$ matrix:
$$AX = B \implies X = A^{-1}B, \qquad A^{-1} = \frac{\mathrm{Adj}(A)}{|A|}.$$
Thus for every invertible matrix $A$ we have a unique solution. For the special case $B = 0$: if $A$ is invertible, the only solution is $X = 0$; when $A$ is non-invertible, we'll have to resort to elimination and pivots, and then the rank of the matrix. Depending on the rank of the matrix, we have free columns and pivot columns. We can assign values to the free variables and then solve for the pivot variables to get solutions.
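A minimal sketch of the adjugate route to $A^{-1}$ for a 3×3 matrix, in pure Python (the example matrix is mine, not from the problem; for a singular matrix the function falls back to raising, matching the elimination discussion above):

```python
def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    a, b, c = m[0]
    d, e, f = m[1]
    g, h, i = m[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def adjugate3(m):
    """Adjugate = transpose of the cofactor matrix of a 3x3 matrix."""
    cof = [[0.0] * 3 for _ in range(3)]
    for r in range(3):
        for c in range(3):
            minor = [[m[i][j] for j in range(3) if j != c]
                     for i in range(3) if i != r]
            cof[r][c] = (-1) ** (r + c) * (minor[0][0] * minor[1][1]
                                           - minor[0][1] * minor[1][0])
    return [[cof[c][r] for c in range(3)] for r in range(3)]  # transpose

def inverse3(m):
    """Inverse via A^{-1} = Adj(A) / |A|."""
    d = det3(m)
    if d == 0:
        raise ValueError("matrix is singular; fall back to elimination")
    adj = adjugate3(m)
    return [[adj[r][c] / d for c in range(3)] for r in range(3)]

A = [[2, 0, 1], [1, 1, 0], [0, 1, 3]]  # det = 7, so invertible
Ainv = inverse3(A)
```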
Problem 3
The columns are linearly independent. When $m = n$, the columns are also independent, the adjoint of the matrix is non-zero, and so is the determinant. Let $c_1, c_2, \ldots, c_n$ be the columns of the matrix. Then
$$c_1 x_1 + c_2 x_2 + \cdots + c_n x_n = 0$$
only when $x_k = 0 \ \forall k$. This implies that the columns of $A$ are linearly independent.
Problem 4
The sets $\{(0,1,1),\ (1,0,0),\ (0,1,0)\}$, $\{(1,1,1,0),\ (0,1,0,1),\ (1,0,1,0),\ (0,1,1,1)\}$ and $\{(1,0,0),\ (0,0,1),\ (a,b,1)\}$ are linearly independent (the last one provided $b \neq 0$); the set $\{(3,0,0),\ (0,0,0),\ (5,1,0),\ (2,5,2)\}$ is linearly dependent, since it contains the zero vector.
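Independence claims like these can be verified by checking whether the rank of the matrix whose rows are the given vectors equals the number of vectors. A pure-Python sketch (function names are mine):

```python
def rank(rows, eps=1e-9):
    """Rank of a matrix (list of row vectors) via Gaussian elimination."""
    m = [list(map(float, row)) for row in rows]
    nrows, ncols = len(m), len(m[0])
    r = 0
    for col in range(ncols):
        # find a pivot row for this column among the unused rows
        pivot = next((i for i in range(r, nrows) if abs(m[i][col]) > eps), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(r + 1, nrows):
            f = m[i][col] / m[r][col]
            m[i] = [x - f * y for x, y in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    """Vectors are linearly independent iff rank equals their count."""
    return rank(vectors) == len(vectors)

print(independent([(0, 1, 1), (1, 0, 0), (0, 1, 0)]))             # True
print(independent([(3, 0, 0), (0, 0, 0), (5, 1, 0), (2, 5, 2)]))  # False
```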
Problem 5
The dotted region in the following figure represents the constrained space.
Problem 6
For a matrix $A$, the set of vectors $B$ satisfying $AB = 0$ defines the null space of $A$.
Let
$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix}, \qquad B = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix},$$
and let $A_k = (a_{k1}, a_{k2}, \ldots, a_{kn})$ denote the $k$-th row of $A$. We know that
$$D = AB = \begin{pmatrix} A_1 \cdot B \\ A_2 \cdot B \\ \vdots \\ A_n \cdot B \end{pmatrix},$$
so $D = 0 \implies A_k \cdot B = 0$ for every $k$ $\implies$ any vector from the null space is orthogonal to the row space.
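A quick numeric illustration of this conclusion (the rank-1 matrix and the null-space vector below are my own example, not from the problem):

```python
# A rank-1 matrix: every row is a multiple of (1, 2, 1).
A = [[1, 2, 1], [2, 4, 2], [3, 6, 3]]
# x = (2, -1, 0) is in the null space of A.
x = [2, -1, 0]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# A applied to x: each entry is (row of A) . x, so Ax = 0 means
# x is orthogonal to every row, hence to the whole row space.
Ax = [dot(row, x) for row in A]
print(Ax)  # [0, 0, 0]
```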
Problem 7
Problem 8
For the first matrix, the eigenvalues are $\lambda = 3, 1, 0$, with first eigenvector $V = (1, 0, 0)^T$. The trace is 4 and the sum of the eigenvalues is 4; thus proved. $\det(A) = 0$ and the product of the eigenvalues is 0; thus proved.
For the second matrix, the eigenvalues are $\lambda = 2, 2, -2$, with eigenvector matrix
$$V = \begin{pmatrix} 0.7071 & 0 & 0.7071 \\ 0.7071 & 0 & -0.7071 \\ 0 & 1.0000 & 0 \end{pmatrix}.$$
The trace is 2 and the sum of the eigenvalues is 2; thus proved. $\det(A) = -8$ and the product of the eigenvalues is $-8$; thus proved.
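Both identities used here (trace = sum of eigenvalues, determinant = product of eigenvalues) can be spot-checked for a symmetric 2×2 matrix, where the eigenvalues have a closed form; the example coefficients are mine, not the matrices from the problem:

```python
import math

def eig2_sym(a, b, d):
    """Eigenvalues of the symmetric 2x2 matrix [[a, b], [b, d]]."""
    mean = (a + d) / 2
    r = math.sqrt(((a - d) / 2) ** 2 + b * b)
    return mean - r, mean + r

a, b, d = 2.0, 1.0, 3.0
l1, l2 = eig2_sym(a, b, d)
trace, det = a + d, a * d - b * b

print(l1 + l2, trace)  # both 5.0
print(l1 * l2, det)    # both ~5.0 (floating point)
```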
Problem 9
a) We have the characteristic equation $(\lambda - 2)(\lambda^2 - 4\lambda + 5) = 0$. Two of the roots are complex; thus indefinite. b) $\lambda = 1, 2, 3$. The matrix is positive definite.
Problem 10
For
$$A = \begin{pmatrix} 2 & 1 \\ 1 & 0 \end{pmatrix}, \quad \lambda = -0.4142,\ 2.4142, \quad \text{which implies indefinite.}$$
For
$$A = \begin{pmatrix} 2 & -1 & -1 \\ -1 & 2 & -1 \\ -1 & -1 & 2 \end{pmatrix}, \quad \lambda = 0, 3, 3, \quad \text{which implies positive semi-definite.}$$
For
$$A = \begin{pmatrix} 0 & 1 & 2 \\ 1 & 0 & 1 \\ 2 & 1 & 0 \end{pmatrix}, \quad \lambda = -2,\ 1 \pm \sqrt{3}, \quad \text{which implies indefinite.}$$
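Definiteness can also be tested without computing eigenvalues at all, via Sylvester's criterion on the leading principal minors. A pure-Python sketch for the strict positive-definite case (function names are mine; the sample matrices are the ones from this problem as I read them):

```python
def det(m):
    """Determinant via Laplace expansion along the first row (fine for small matrices)."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] *
               det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

def positive_definite(m):
    """Sylvester's criterion: a symmetric matrix is positive definite iff
    every leading principal minor is strictly positive."""
    n = len(m)
    return all(det([row[:k] for row in m[:k]]) > 0 for k in range(1, n + 1))

print(positive_definite([[2, -1, -1], [-1, 2, -1], [-1, -1, 2]]))  # False (only semi-definite)
print(positive_definite([[2, 1], [1, 0]]))                         # False (indefinite)
print(positive_definite([[2, 0], [0, 3]]))                         # True
```

Note the strict criterion cannot distinguish semi-definite from indefinite; for that the eigenvalue test above is still needed.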
Problem 11
a)
$$\nabla f = \begin{pmatrix} 1 \\ 2 \end{pmatrix}$$
No stationary point, and the Hessian is a zero matrix everywhere. Both concave and convex, because the eigenvalues of a zero square matrix are all 0.

b)
$$\nabla f = \begin{pmatrix} 4x_1 - 3x_2 \\ 4x_2 - 3x_1 \end{pmatrix}$$
Stationary point at $(0, 0)$.
$$H = \begin{pmatrix} 4 & -3 \\ -3 & 4 \end{pmatrix}$$
The eigenvalues of this matrix are 1 and 7. Positive definite. Hence a (global) minimum, because the function is quadratic. Convex.

c)
$$\nabla f = \begin{pmatrix} 2x_1 + x_2 \\ x_1 + 2 \end{pmatrix}$$
Stationary point at $(-2, 4)$. Hessian:
$$H = \begin{pmatrix} 2 & 1 \\ 1 & 0 \end{pmatrix}$$
The eigenvalues are $-0.4142$ and $2.4142$, of opposite signs. Thus it is indefinite, implying a saddle point; the function is neither concave nor convex.

d)
$$\nabla f = \begin{pmatrix} 2x_1 + 1 \\ 4x_2 + 1 \\ 1 \end{pmatrix}$$
No stationary point, as the third component of $\nabla f$ can never attain 0.
$$H = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 4 & 0 \\ 0 & 0 & 0 \end{pmatrix}$$
The eigenvalues are $0, 2, 4$. Positive semidefinite. Hence, the function is convex.

e)
$$\nabla f = \begin{pmatrix} 2(x_1 + x_2) \\ 2(x_1 + x_2) \end{pmatrix}$$
Infinitely many stationary points exist: any point of the form $(k, -k)$ has gradient 0.
$$H = \begin{pmatrix} 2 & 2 \\ 2 & 2 \end{pmatrix}$$
Eigenvalues: 0, 4. Positive semidefinite, which implies the function is convex. All minima are global minima, attaining the common value 0.

f)
$$\nabla f = \begin{pmatrix} 4x_1 + 2x_2 + 7 \\ 2x_1 + 6x_2 + 8 \end{pmatrix}$$
Stationary point: $\left(-\frac{13}{10}, -\frac{9}{10}\right)$.
$$H = \begin{pmatrix} 4 & 2 \\ 2 & 6 \end{pmatrix}$$
The eigenvalues are $5 \pm \sqrt{5}$. Positive definite. Hence the stationary point is a local minimum; since the function is quadratic, this minimum is global. Convex.

g) The gradient vanishes at the stationary point $\left(\frac{1}{2}, 1\right)$.
The eigenvalues are $-2.4853$ and $14.4853$, of opposite signs. Thus it is a saddle point, and the function is neither concave nor convex.

h)
$$\nabla f = \begin{pmatrix} 2x_1 + 2 \\ 6x_2 + 6 \end{pmatrix}$$
Stationary point $= (-1, -1)$.
$$H = \begin{pmatrix} 2 & 0 \\ 0 & 6 \end{pmatrix}$$
The eigenvalues are 2 and 6. Positive definite. Hence, a minimum. The Hessian is constant throughout, which indicates that the function is convex everywhere.

i)
$$\nabla f = \begin{pmatrix} 2x_1 - 4x_2 + 1 \\ -4x_1 + 4x_2 + 1 \end{pmatrix}$$
Stationary point: $\left(1, \frac{3}{4}\right)$.
$$H = \begin{pmatrix} 2 & -4 \\ -4 & 4 \end{pmatrix}$$
The eigenvalues are $-1.1231$ and $7.1231$. It is a saddle point, and thus the function is neither concave nor convex.

j)
$$\nabla f = \begin{pmatrix} -2x_1\, e^{-x_1^2 - x_2^2} \\ -2x_2\, e^{-x_1^2 - x_2^2} \end{pmatrix}$$
Stationary point: $(0, 0)$.
$$H_{(0,0)} = \begin{pmatrix} -2 & 0 \\ 0 & -2 \end{pmatrix}$$
The eigenvalues are $-2, -2$; the Hessian at the origin is negative definite. It is a global maximum, since $(0, 0)$ is the only point where the gradient is 0. The function is concave near the origin (though not globally).
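The classification used in parts (a) through (j) can be packaged into one helper: a symmetric 2×2 Hessian has closed-form eigenvalues, and their signs decide the case. A pure-Python sketch (names are mine):

```python
import math

def classify_hessian_2x2(a, b, d, eps=1e-12):
    """Classify a stationary point from the symmetric Hessian [[a, b], [b, d]]."""
    mean = (a + d) / 2
    r = math.sqrt(((a - d) / 2) ** 2 + b * b)
    lo, hi = mean - r, mean + r  # the two (always real) eigenvalues
    if lo > eps:
        return "minimum (positive definite)"
    if hi < -eps:
        return "maximum (negative definite)"
    if lo < -eps and hi > eps:
        return "saddle point (indefinite)"
    return "degenerate (semi-definite): higher-order test needed"

print(classify_hessian_2x2(4, -3, 4))   # part b: minimum
print(classify_hessian_2x2(2, 1, 0))    # part c: saddle point
print(classify_hessian_2x2(2, 2, 2))    # part e: degenerate
print(classify_hessian_2x2(-2, 0, -2))  # part j: maximum
```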
Problem 12
$$f(x) = a_1 x_1 + a_2 x_1^2 + a_3 x_2^2 - a_4 x_1 x_2$$
$$\nabla f = \begin{pmatrix} a_1 + 2a_2 x_1 - a_4 x_2 \\ 2a_3 x_2 - a_4 x_1 \end{pmatrix}$$
$$H = \begin{pmatrix} 2a_2 & -a_4 \\ -a_4 & 2a_3 \end{pmatrix} = \begin{pmatrix} 4 & -2 \\ -2 & 4 \end{pmatrix} \quad (a_2 = a_3 = a_4 = 2)$$
$\lambda = 2, 6$, which implies it is positive definite. Now changing the value of $a_4$ to 4 changes the behaviour of the point to positive semi-definite, with $\lambda = 0, 8$. Changing $a_2$ to $-5$ results in the point becoming a saddle point, and changing $a_3$ to $-5$ results in a saddle point too. Changing the value of $a_1$ doesn't affect the nature of the point, as the Hessian is independent of the value of $a_1$.
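The sensitivity analysis above can be reproduced by recomputing the Hessian eigenvalues for each coefficient change (pure Python; I assume the base values $a_2 = a_3 = a_4 = 2$ read off the worked numbers):

```python
import math

def hessian_eigs(a2, a3, a4):
    """Eigenvalues of H = [[2*a2, -a4], [-a4, 2*a3]] (a1 does not appear in H)."""
    a, b, d = 2 * a2, -a4, 2 * a3
    mean = (a + d) / 2
    r = math.sqrt(((a - d) / 2) ** 2 + b * b)
    return mean - r, mean + r

print(hessian_eigs(2, 2, 2))   # (2.0, 6.0): positive definite
print(hessian_eigs(2, 2, 4))   # (0.0, 8.0): positive semi-definite
print(hessian_eigs(-5, 2, 2))  # mixed signs -> saddle point
print(hessian_eigs(2, -5, 2))  # mixed signs -> saddle point
```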