
Nov. 22, 2003, revised Dec. 27, 2003

Hayashi Econometrics

Solution to Chapter 1 Analytical Exercises


1. (Reproducing the answer on p. 84 of the book) For any $\tilde{\beta}$,
$$
\begin{aligned}
(y - X\tilde{\beta})'(y - X\tilde{\beta})
&= [(y - Xb) + X(b - \tilde{\beta})]'[(y - Xb) + X(b - \tilde{\beta})] \quad \text{(by the add-and-subtract strategy)} \\
&= [(y - Xb)' + (b - \tilde{\beta})'X'][(y - Xb) + X(b - \tilde{\beta})] \\
&= (y - Xb)'(y - Xb) + (b - \tilde{\beta})'X'(y - Xb) + (y - Xb)'X(b - \tilde{\beta}) + (b - \tilde{\beta})'X'X(b - \tilde{\beta}) \\
&= (y - Xb)'(y - Xb) + 2(b - \tilde{\beta})'X'(y - Xb) + (b - \tilde{\beta})'X'X(b - \tilde{\beta}) \\
&\qquad \text{(since } (b - \tilde{\beta})'X'(y - Xb) = (y - Xb)'X(b - \tilde{\beta})\text{)} \\
&= (y - Xb)'(y - Xb) + (b - \tilde{\beta})'X'X(b - \tilde{\beta}) \quad \text{(since } X'(y - Xb) = 0 \text{ by the normal equations)} \\
&\geq (y - Xb)'(y - Xb) \quad \text{(since } (b - \tilde{\beta})'X'X(b - \tilde{\beta}) = z'z = \textstyle\sum_{i=1}^n z_i^2 \geq 0, \text{ where } z \equiv X(b - \tilde{\beta})\text{)}.
\end{aligned}
$$
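The inequality can be spot-checked numerically. The sketch below (the data, sizes, and seed are made up for illustration) computes the OLS coefficient $b$ from the normal equations and confirms that no random trial coefficient vector achieves a smaller sum of squared residuals:

```python
# Illustrative check of Exercise 1: SSR(beta_tilde) >= SSR(b) for any beta_tilde.
import numpy as np

rng = np.random.default_rng(0)
n, K = 50, 3
X = rng.standard_normal((n, K))       # made-up design matrix
y = rng.standard_normal(n)            # made-up dependent variable

b = np.linalg.solve(X.T @ X, X.T @ y) # OLS from the normal equations X'Xb = X'y

def ssr(beta):
    r = y - X @ beta
    return r @ r

# normal equations: X'(y - Xb) = 0
assert np.allclose(X.T @ (y - X @ b), 0)

# SSR at b is no larger than at any random trial coefficient vector
assert all(ssr(b) <= ssr(b + rng.standard_normal(K)) for _ in range(100))
```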

2. (a), (b). If $X$ is an $n \times K$ matrix of full column rank, then $X'X$ is symmetric and invertible. It is very straightforward to show (and indeed you've been asked to show in the text) that $M_X \equiv I_n - X(X'X)^{-1}X'$ is symmetric and idempotent and that $M_X X = 0$. In this question, set $X = \mathbf{1}$ (the $n$-dimensional vector of ones).

(c)
$$
\begin{aligned}
M_{\mathbf{1}} y &= [I_n - \mathbf{1}(\mathbf{1}'\mathbf{1})^{-1}\mathbf{1}']y \\
&= y - \frac{1}{n}\mathbf{1}\mathbf{1}'y \quad \text{(since } \mathbf{1}'\mathbf{1} = n\text{)} \\
&= y - \mathbf{1}\,\frac{1}{n}\sum_{i=1}^n y_i = y - \mathbf{1}\bar{y}.
\end{aligned}
$$

(d) Replace $y$ by $X$ in (c).

3. Special case of the solution to the next exercise.

4. From the normal equations (1.2.3) of the text, we obtain

(a)
$$
\begin{bmatrix} X_1' \\ X_2' \end{bmatrix}
\begin{bmatrix} X_1 & X_2 \end{bmatrix}
\begin{bmatrix} b_1 \\ b_2 \end{bmatrix}
=
\begin{bmatrix} X_1' \\ X_2' \end{bmatrix} y.
$$

Using the rules of multiplication of partitioned matrices, it is straightforward to derive ($*$) and ($**$) from the above:
$$
X_1'X_1 b_1 + X_1'X_2 b_2 = X_1'y, \qquad (*)
$$
$$
X_2'X_1 b_1 + X_2'X_2 b_2 = X_2'y. \qquad (**)
$$

(b) By premultiplying both sides of ($*$) in the question by $X_1(X_1'X_1)^{-1}$, we obtain
$$
X_1(X_1'X_1)^{-1}X_1'X_1 b_1 = -X_1(X_1'X_1)^{-1}X_1'X_2 b_2 + X_1(X_1'X_1)^{-1}X_1'y,
$$
i.e.,
$$
X_1 b_1 = -P_1 X_2 b_2 + P_1 y.
$$
Substitution of this into ($**$) yields
$$
\begin{aligned}
X_2'(-P_1 X_2 b_2 + P_1 y) + X_2'X_2 b_2 &= X_2'y \\
X_2'(I - P_1)X_2 b_2 &= X_2'(I - P_1)y \\
X_2'M_1 X_2 b_2 &= X_2'M_1 y \\
X_2'M_1'M_1 X_2 b_2 &= X_2'M_1'M_1 y \quad \text{(since $M_1$ is symmetric \& idempotent)} \\
\tilde{X}_2'\tilde{X}_2 b_2 &= \tilde{X}_2'\tilde{y}.
\end{aligned}
$$
Therefore,
$$
b_2 = (\tilde{X}_2'\tilde{X}_2)^{-1}\tilde{X}_2'\tilde{y}.
$$
(The matrix $\tilde{X}_2'\tilde{X}_2$ is invertible because $\tilde{X}_2$ is of full column rank. To see that $\tilde{X}_2$ is of full column rank, suppose not. Then there exists a non-zero vector $c$ such that $\tilde{X}_2 c = 0$. But $\tilde{X}_2 c = X_2 c - X_1 d$, where $d \equiv (X_1'X_1)^{-1}X_1'X_2 c$. That is, $X\gamma = 0$ for $\gamma \equiv \begin{bmatrix} -d \\ c \end{bmatrix}$. This is a contradiction because $X = [X_1 \;\; X_2]$ is of full column rank and $\gamma \neq 0$.)

(c) By premultiplying both sides of $y = X_1 b_1 + X_2 b_2 + e$ by $M_1$, we obtain
$$
M_1 y = M_1 X_1 b_1 + M_1 X_2 b_2 + M_1 e.
$$
Since $M_1 X_1 = 0$ and $\tilde{y} \equiv M_1 y$, the above equation can be rewritten as
$$
\tilde{y} = M_1 X_2 b_2 + M_1 e = \tilde{X}_2 b_2 + M_1 e.
$$
$M_1 e = e$ because
$$
M_1 e = (I - P_1)e = e - P_1 e = e - X_1(X_1'X_1)^{-1}X_1'e = e \quad \text{(since $X_1'e = 0$ by the normal equations)}.
$$

(d) From (b), we have
$$
b_2 = (\tilde{X}_2'\tilde{X}_2)^{-1}\tilde{X}_2'\tilde{y} = (\tilde{X}_2'\tilde{X}_2)^{-1}X_2'M_1'M_1 y = (\tilde{X}_2'\tilde{X}_2)^{-1}\tilde{X}_2'y.
$$
Therefore, $b_2$ is the OLS coefficient estimator for the regression of $y$ on $\tilde{X}_2$. The residual vector from the regression is
$$
y - \tilde{X}_2 b_2 = (y - \tilde{y}) + (\tilde{y} - \tilde{X}_2 b_2) = (y - M_1 y) + e \quad \text{(by (c))} \;=\; P_1 y + e.
$$

This does not equal $e$ because $P_1 y$ is not necessarily zero. The SSR from the regression of $y$ on $\tilde{X}_2$ can be written as
$$
(y - \tilde{X}_2 b_2)'(y - \tilde{X}_2 b_2) = (P_1 y + e)'(P_1 y + e) = (P_1 y)'(P_1 y) + e'e \quad \text{(since $P_1 e = X_1(X_1'X_1)^{-1}X_1'e = 0$)}.
$$
This does not equal $e'e$ if $P_1 y$ is not zero.

(e) From (c), $\tilde{y} = \tilde{X}_2 b_2 + e$. So
$$
\tilde{y}'\tilde{y} = (\tilde{X}_2 b_2 + e)'(\tilde{X}_2 b_2 + e) = b_2'\tilde{X}_2'\tilde{X}_2 b_2 + e'e \quad \text{(since $\tilde{X}_2'e = 0$)}.
$$

Since $b_2 = (\tilde{X}_2'\tilde{X}_2)^{-1}\tilde{X}_2'\tilde{y}$, we have
$$
b_2'\tilde{X}_2'\tilde{X}_2 b_2 = \tilde{y}'\tilde{X}_2(X_2'M_1 X_2)^{-1}\tilde{X}_2'\tilde{y}.
$$

(f) (i) Let $\tilde{b}_1$ be the OLS coefficient estimator for the regression of $\tilde{y}$ on $X_1$. Then
$$
\tilde{b}_1 = (X_1'X_1)^{-1}X_1'\tilde{y} = (X_1'X_1)^{-1}X_1'M_1 y = (X_1'X_1)^{-1}(M_1 X_1)'y = 0 \quad \text{(since $M_1 X_1 = 0$)}.
$$

So $SSR_1 = (\tilde{y} - X_1\tilde{b}_1)'(\tilde{y} - X_1\tilde{b}_1) = \tilde{y}'\tilde{y}$.

(ii) Since the residual vector from the regression of $\tilde{y}$ on $\tilde{X}_2$ equals $e$ by (c), $SSR_2 = e'e$.

(iii) From the Frisch-Waugh Theorem, the residuals from the regression of $\tilde{y}$ on $X_1$ and $X_2$ equal those from the regression of $M_1\tilde{y}$ ($= \tilde{y}$) on $M_1 X_2$ ($= \tilde{X}_2$). So $SSR_3 = e'e$.

5. (a) The hint is as good as the answer.

(b) Let $\tilde{\varepsilon} \equiv y - X\tilde{\beta}$, the residuals from the restricted regression. By using the add-and-subtract strategy, we obtain
$$
\tilde{\varepsilon} \equiv y - X\tilde{\beta} = (y - Xb) + X(b - \tilde{\beta}).
$$
So
$$
\begin{aligned}
SSR_R &= [(y - Xb) + X(b - \tilde{\beta})]'[(y - Xb) + X(b - \tilde{\beta})] \\
&= (y - Xb)'(y - Xb) + (b - \tilde{\beta})'X'X(b - \tilde{\beta}) \quad \text{(since $X'(y - Xb) = 0$)}.
\end{aligned}
$$
But $SSR_U = (y - Xb)'(y - Xb)$, so
$$
\begin{aligned}
SSR_R - SSR_U &= (b - \tilde{\beta})'X'X(b - \tilde{\beta}) \\
&= (Rb - r)'[R(X'X)^{-1}R']^{-1}(Rb - r) \quad \text{(using the expression for $\tilde{\beta}$ from (a))} \\
&= \tilde{\lambda}'R(X'X)^{-1}R'\tilde{\lambda} \quad \text{(using the expression for $\tilde{\lambda}$ from (a))} \\
&= \tilde{\varepsilon}'X(X'X)^{-1}X'\tilde{\varepsilon} \quad \text{(by the first-order conditions that $X'(y - X\tilde{\beta}) = R'\tilde{\lambda}$)} \\
&= \tilde{\varepsilon}'P\tilde{\varepsilon}.
\end{aligned}
$$

(c) The $F$-ratio is defined as
$$
F \equiv \frac{(Rb - r)'[R(X'X)^{-1}R']^{-1}(Rb - r)/r}{s^2} \quad \text{(where $r = \#r$, the number of restrictions)} \tag{1.4.9}
$$
Since $(Rb - r)'[R(X'X)^{-1}R']^{-1}(Rb - r) = SSR_R - SSR_U$ as shown above, the $F$-ratio can be rewritten as
$$
F = \frac{(SSR_R - SSR_U)/r}{s^2} = \frac{(SSR_R - SSR_U)/r}{e'e/(n - K)} = \frac{(SSR_R - SSR_U)/r}{SSR_U/(n - K)}.
$$
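This equality of the Wald form and the SSR form of the $F$-ratio can be checked numerically. A minimal sketch (the data, the particular restriction $R\beta = r$, and all variable names are made up for illustration):

```python
# Illustrative check: F computed from (Rb - r)'[R(X'X)^{-1}R']^{-1}(Rb - r)
# equals F computed from (SSR_R - SSR_U), using the restricted estimator
# formula from Exercise 5(a).
import numpy as np

rng = np.random.default_rng(1)
n, K = 60, 4
X = np.column_stack([np.ones(n), rng.standard_normal((n, K - 1))])
y = rng.standard_normal(n)

b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
SSR_U = e @ e
s2 = SSR_U / (n - K)

# made-up restriction: the last two slopes are zero (num_r = #r = 2)
R = np.zeros((2, K)); R[0, 2] = 1.0; R[1, 3] = 1.0
r = np.zeros(2)
num_r = 2

XtX_inv = np.linalg.inv(X.T @ X)
# restricted least squares estimator from Exercise 5(a)
beta_tilde = b - XtX_inv @ R.T @ np.linalg.solve(R @ XtX_inv @ R.T, R @ b - r)
e_tilde = y - X @ beta_tilde
SSR_R = e_tilde @ e_tilde

F_wald = (R @ b - r) @ np.linalg.solve(R @ XtX_inv @ R.T, R @ b - r) / num_r / s2
F_ssr = (SSR_R - SSR_U) / num_r / (SSR_U / (n - K))
assert np.isclose(F_wald, F_ssr)
```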

Therefore, (1.4.9) = (1.4.11).

6. (a) Unrestricted model: $y = X\beta + \varepsilon$, where
$$
y = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}_{(n \times 1)}, \quad
X = \begin{bmatrix} 1 & x_{12} & \cdots & x_{1K} \\ \vdots & \vdots & \ddots & \vdots \\ 1 & x_{n2} & \cdots & x_{nK} \end{bmatrix}_{(n \times K)}, \quad
\beta = \begin{bmatrix} \beta_1 \\ \vdots \\ \beta_K \end{bmatrix}_{(K \times 1)}.
$$

Restricted model: $y = X\beta + \varepsilon$, $R\beta = r$, where
$$
R = \begin{bmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & & & \ddots & \vdots \\ 0 & 0 & \cdots & 0 & 1 \end{bmatrix}_{((K-1) \times K)}, \quad
r = \begin{bmatrix} 0 \\ \vdots \\ 0 \end{bmatrix}_{((K-1) \times 1)}.
$$

Obviously, the restricted OLS estimator of $\beta$ is
$$
\tilde{\beta} = \begin{bmatrix} \bar{y} \\ 0 \\ \vdots \\ 0 \end{bmatrix}_{(K \times 1)}. \quad \text{So} \quad X\tilde{\beta} = \begin{bmatrix} \bar{y} \\ \vdots \\ \bar{y} \end{bmatrix} = \mathbf{1}\bar{y}.
$$

(You can use the formula for the restricted least squares estimator derived in the previous exercise, $\tilde{\beta} = b - (X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}(Rb - r)$, to verify this.) If $SSR_U$ and $SSR_R$ are the minimized sums of squared residuals from the unrestricted and restricted models, they are calculated as
$$
SSR_R = (y - X\tilde{\beta})'(y - X\tilde{\beta}) = \sum_{i=1}^n (y_i - \bar{y})^2,
$$
$$
SSR_U = (y - Xb)'(y - Xb) = e'e = \sum_{i=1}^n e_i^2.
$$
Therefore,
$$
SSR_R - SSR_U = \sum_{i=1}^n (y_i - \bar{y})^2 - \sum_{i=1}^n e_i^2. \tag{A}
$$
On the other hand,
$$
(b - \tilde{\beta})'(X'X)(b - \tilde{\beta}) = (Xb - X\tilde{\beta})'(Xb - X\tilde{\beta}) = \sum_{i=1}^n (\hat{y}_i - \bar{y})^2.
$$
Since $SSR_R - SSR_U = (b - \tilde{\beta})'(X'X)(b - \tilde{\beta})$ (as shown in Exercise 5(b)),
$$
\sum_{i=1}^n (y_i - \bar{y})^2 - \sum_{i=1}^n e_i^2 = \sum_{i=1}^n (\hat{y}_i - \bar{y})^2. \tag{B}
$$
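Equation (B) is easy to verify numerically when the regression includes a constant. A small illustrative sketch (data and sizes are made up):

```python
# Illustrative check of equation (B): with a constant in the regression,
# the total variation about the mean splits into explained plus residual.
import numpy as np

rng = np.random.default_rng(2)
n, K = 40, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, K - 1))])
y = rng.standard_normal(n)

b = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ b
e = y - y_hat
ybar = y.mean()

lhs = np.sum((y - ybar) ** 2) - np.sum(e ** 2)   # SSR_R - SSR_U
rhs = np.sum((y_hat - ybar) ** 2)                # explained variation
assert np.isclose(lhs, rhs)
```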

(b)
$$
\begin{aligned}
F &= \frac{(SSR_R - SSR_U)/(K - 1)}{\sum_{i=1}^n e_i^2/(n - K)} \quad \text{(by Exercise 5(c))} \\
&= \frac{\left(\sum_{i=1}^n (y_i - \bar{y})^2 - \sum_{i=1}^n e_i^2\right)/(K - 1)}{\sum_{i=1}^n e_i^2/(n - K)} \quad \text{(by equation (A) above)} \\
&= \frac{\sum_{i=1}^n (\hat{y}_i - \bar{y})^2/(K - 1)}{\sum_{i=1}^n e_i^2/(n - K)} \quad \text{(by equation (B) above)} \\
&= \frac{R^2/(K - 1)}{(1 - R^2)/(n - K)} \quad \text{(by dividing both numerator \& denominator by $\sum_{i=1}^n (y_i - \bar{y})^2$ and using the definition of $R^2$)}.
\end{aligned}
$$
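The final equality can likewise be spot-checked: computing $F$ from the two SSRs and from $R^2$ gives the same number. A hedged sketch with made-up data:

```python
# Illustrative check: the F statistic for "all slopes are zero", computed
# from the SSRs, matches the R^2 form derived above.
import numpy as np

rng = np.random.default_rng(3)
n, K = 40, 4
X = np.column_stack([np.ones(n), rng.standard_normal((n, K - 1))])
y = rng.standard_normal(n)

b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
SSR_U = e @ e
SSR_R = np.sum((y - y.mean()) ** 2)   # restricted model: constant only

F_ssr = ((SSR_R - SSR_U) / (K - 1)) / (SSR_U / (n - K))
R2 = 1 - SSR_U / SSR_R
F_r2 = (R2 / (K - 1)) / ((1 - R2) / (n - K))
assert np.isclose(F_ssr, F_r2)
```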

7. (Reproducing the answer on pp. 84-85 of the book)

(a) $\hat{\beta}_{GLS} - \beta = A\varepsilon$ where $A \equiv (X'V^{-1}X)^{-1}X'V^{-1}$, and $b - \hat{\beta}_{GLS} = B\varepsilon$ where $B \equiv (X'X)^{-1}X' - (X'V^{-1}X)^{-1}X'V^{-1}$. So
$$
\operatorname{Cov}(\hat{\beta}_{GLS} - \beta,\; b - \hat{\beta}_{GLS}) = \operatorname{Cov}(A\varepsilon, B\varepsilon) = A\operatorname{Var}(\varepsilon)B' = \sigma^2 AVB'.
$$
It is straightforward to show that $AVB' = 0$.

(b) For the choice of $H$ indicated in the hint,
$$
\operatorname{Var}(\hat{\beta}) - \operatorname{Var}(\hat{\beta}_{GLS}) = -CV_q^{-1}C'.
$$
If $C \neq 0$, then there exists a nonzero vector $z$ such that $C'z \equiv v \neq 0$. For such $z$,
$$
z'[\operatorname{Var}(\hat{\beta}) - \operatorname{Var}(\hat{\beta}_{GLS})]z = -v'V_q^{-1}v < 0 \quad \text{(since $V_q$ is positive definite)},
$$
which is a contradiction because $\hat{\beta}_{GLS}$ is efficient.
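The claim $AVB' = 0$ in (a) can be illustrated numerically for an arbitrary positive definite $V$. Everything below is made up for illustration; $V$ is constructed as $GG' + nI$ simply to guarantee positive definiteness:

```python
# Illustrative check of Exercise 7(a): with A and B defined as in the text,
# AVB' = 0, so Cov(beta_GLS - beta, b - beta_GLS) = 0.
import numpy as np

rng = np.random.default_rng(4)
n, K = 30, 3
X = rng.standard_normal((n, K))
G = rng.standard_normal((n, n))
V = G @ G.T + n * np.eye(n)            # made-up symmetric positive definite V

V_inv = np.linalg.inv(V)
A = np.linalg.inv(X.T @ V_inv @ X) @ X.T @ V_inv   # GLS "hat" map
B = np.linalg.inv(X.T @ X) @ X.T - A               # OLS map minus GLS map

assert np.allclose(A @ V @ B.T, 0)
```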
