
Assignment

1. Suppose that $X_1, X_2, \ldots, X_n$ are i.i.d. with p.m.f. $P(X = 1) = 2(1-\theta)/(2-\theta)$ and $P(X = 2) = \theta/(2-\theta)$. Check whether the UMVUE of $1/\theta$ exists.

2. If $X_1, X_2, \ldots, X_n$ are i.i.d. $\mathrm{Ber}(\theta)$, find the UMVUE for
(a) $\theta^m$, where $m \le n$.
(b) $P\left(\sum_{i=1}^{m} X_i = k\right)$, where $k \le m \le n$.
(c) $P\left(\sum_{i=1}^{n-1} X_i > X_n\right)$.
3. Suppose that $X_1, X_2, \ldots, X_n$ are i.i.d. $\mathrm{Ber}(\theta)$. Prove or disprove the following statements.
(a) $\sum_{i=1}^{n} (X_i - \bar{X})^2$ depends only on the complete sufficient statistic.
(b) $\sum_{i=1}^{n} (X_i - \bar{X})^2$ is the UMVUE for $\theta(1-\theta)$.
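Remark (not part of the assignment): a one-line identity that clarifies Problem 3 is that Bernoulli variables satisfy $X_i^2 = X_i$, so, writing $T = \sum_{i=1}^{n} X_i$,
$$\sum_{i=1}^{n} (X_i - \bar{X})^2 = \sum_{i=1}^{n} X_i^2 - n\bar{X}^2 = T - \frac{T^2}{n}.$$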

4. Suppose that $X_1, X_2, \ldots, X_m$ are i.i.d. $N(\mu, \sigma^2)$, $Y_1, Y_2, \ldots, Y_n$ are i.i.d. $N(\mu, \tau^2)$, and they are independent. If $\sigma$ is known, find the UMVUE for $\mu$.

5. If $X_1, X_2, \ldots, X_n$ are i.i.d. $U(\theta_1 - \theta_2, \theta_1 + \theta_2)$ ($-\infty < \theta_1 < \infty$; $\theta_2 > 0$), find the UMVUE for $\theta_1$, $\theta_2$ and $\theta_1/\theta_2$.

6. Suppose that $X_1, X_2, \ldots, X_n$ are i.i.d. $N(\theta, \sigma^2)$, where $\sigma^2$ is known. Check whether the UMVUE of $e^{t\theta}$ attains the Cramér-Rao lower bound.

7. If $X_1, X_2, \ldots, X_n$ are i.i.d. $U(\theta - 1/2, \theta + 1/2)$, show that
(a) $(X_{(1)} + X_{(n)})/2 \to \theta$ almost surely.
(b) $(X_{(1)} + X_{(n)})/2 \to \theta$ in quadratic mean.

8. Suppose that $X_1, X_2, \ldots, X_n$ are i.i.d. with p.m.f. $P(X = x) = e^{-\theta}\theta^x / \{x!\,(1 - e^{-\theta})\}$ for $x = 1, 2, \ldots$. Find the UMVUE for $\theta$ when $n = 1$ and $n = 2$.

9. Suppose that $X_1, X_2, \ldots, X_n$ are i.i.d. $N(\theta, 1)$, where $\theta \in \Theta = \{0, 1\}$. If a statistic $T = T(X_1, X_2, \ldots, X_n)$ takes only the two values 0 and 1, show that $T$ cannot be unbiased for $\theta$.

10. Suppose that $X_1, X_2, \ldots, X_n$ are i.i.d. $N(\theta, 1)$ truncated at $\alpha$ and $\beta$ (i.e. $f_\theta(x) = C\,e^{-(x-\theta)^2/2}$ if $\alpha \le x \le \beta$). Show that $\bar{X}$ is the unique MLE for $E(X_1)$.
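Remark (not part of the assignment): the convergence claims in Problem 7 are easy to visualize by simulation along a single growing sample path. A minimal Python/NumPy sketch (the value of $\theta$ and the sample sizes are arbitrary choices):

```python
import numpy as np

# Consistency check for the midrange (X_(1) + X_(n))/2 under
# U(theta - 1/2, theta + 1/2) (Problem 7). We extend one sample path and
# watch the estimation error shrink, as almost-sure convergence suggests.

rng = np.random.default_rng(1)
theta = 2.0
x = rng.uniform(theta - 0.5, theta + 0.5, size=100_000)  # one long path

for n in (100, 1_000, 10_000, 100_000):
    midrange = 0.5 * (x[:n].min() + x[:n].max())
    print(f"n={n:>6}  |midrange - theta| = {abs(midrange - theta):.2e}")
```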

11. Suppose that $X_1$ and $X_2$ are i.i.d. $U(\theta, \theta + 1)$, where $\theta \in \Theta = \{\theta_0, \theta_0 + 0.5\}$. Considering the 0-1 loss function, find a minimax estimator for $\theta$. Check whether this minimax estimator is unique.

12. If $X_1, X_2, \ldots, X_n$ are i.i.d. $\mathrm{Ber}(\theta)$, find the minimax estimator for $\theta$ when
(a) $L(\theta, \delta) = (\delta - \theta)^2/\{\theta(1 - \theta)\}$.
(b) $L(\theta, \delta) = (\delta - \theta)^2$.

13. Suppose that $X_1, X_2, \ldots, X_n$ are i.i.d. $N(\theta_1, \sigma^2)$, $Y_1, Y_2, \ldots, Y_n$ are i.i.d. $N(\theta_2, \sigma^2)$, and they are independent. If we consider the squared error loss function, show that $\bar{X} - \bar{Y}$ is minimax for $\theta_1 - \theta_2$.
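Remark (not part of the assignment): for Problem 12(a), the exact risk of $\bar{X}$ under the weighted loss is $R(\theta, \bar{X}) = \mathrm{Var}(\bar{X})/\{\theta(1-\theta)\} = 1/n$ for every $\theta$, and constant risk is the standard stepping stone to minimaxity. The Python sketch below merely tabulates this numerically (the grid of $\theta$ values is arbitrary):

```python
import numpy as np

# Risk of the sample mean under the weighted loss of Problem 12(a):
# L(theta, d) = (d - theta)^2 / (theta * (1 - theta)).
# The exact risk equals Var(Xbar)/(theta(1-theta)) = 1/n for all theta;
# this Monte Carlo tabulation checks that the risk curve is flat.

rng = np.random.default_rng(2)
n, reps = 20, 100_000

for theta in (0.1, 0.3, 0.5, 0.7, 0.9):
    xbar = rng.binomial(n, theta, size=reps) / n
    risk = np.mean((xbar - theta) ** 2 / (theta * (1 - theta)))
    print(f"theta={theta:.1f}  estimated risk={risk:.4f}  (1/n={1 / n:.4f})")
```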

14. Suppose that $X_1, X_2, \ldots, X_n$ are i.i.d. $f_{\theta_1, \theta_2}$, where $(\theta_1, \theta_2) \in \Theta = (0, \infty) \times \{1, 2\}$, $f_{\theta_1, 1} = N(0, \theta_1^2)$ and $f_{\theta_1, 2}(x) = \frac{1}{2\theta_1} e^{-|x|/\theta_1}$. Find the MLE of $\theta_1$ and $\theta_2$ and prove their consistency.
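Remark (not part of the assignment): computationally, Problem 14 separates into profiling $\theta_1$ within each family and then comparing the two maximized log-likelihoods. In the Python sketch below, the closed-form profile MLEs (root mean square for the normal family, mean absolute value for the double exponential family) are standard facts that the problem asks you to derive, so treat them as hypotheses here:

```python
import numpy as np

# Profile-likelihood sketch for Problem 14. For theta2 = 1 the model is
# N(0, theta1^2); for theta2 = 2 it is double exponential with scale theta1.
# Within each family we plug in the profile MLE of theta1 and keep the
# family with the larger maximized log-likelihood.

def mle(x):
    s1 = np.sqrt(np.mean(x ** 2))  # profile MLE of theta1, normal family
    ll1 = np.sum(-0.5 * np.log(2 * np.pi * s1 ** 2) - x ** 2 / (2 * s1 ** 2))
    s2 = np.mean(np.abs(x))        # profile MLE of theta1, Laplace family
    ll2 = np.sum(-np.log(2 * s2) - np.abs(x) / s2)
    return (s1, 1) if ll1 > ll2 else (s2, 2)

rng = np.random.default_rng(3)
x = rng.laplace(0.0, 1.5, size=5_000)  # data generated with theta = (1.5, 2)
print("MLE of (theta1, theta2):", mle(x))
```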

15. If $X \sim P(\theta)$ (Poisson) and $L(\theta, \delta) = (\delta - \theta)^2$, show that for any estimator $\delta$, we have $\sup_{\theta > 0} R(\theta, \delta) = \infty$.

16. Suppose that $X \sim N(\theta, 1)$. Consider a squared error loss function and an improper prior $\pi(\theta) = e^{\theta}$, $-\infty < \theta < \infty$. Show that the mean of the posterior distribution of $\theta$ cannot be minimax.

17. Prove or disprove the following statements.
(a) If an estimator has constant risk (i.e. the value of $R(\theta, \delta)$ does not depend on $\theta$), it is minimax.
(b) A Bayes estimator, if unique, is admissible.
(c) A minimax estimator, if unique, is admissible.
(d) A minimax estimator can always be viewed as a Bayes estimator for a suitable choice of prior.
(e) If an admissible estimator has constant risk, it is minimax.
(f) A minimax estimator can have expected risk smaller than a Bayes estimator.

18. Suppose that $X \sim f_\theta$, where $\theta \in \Theta = \{1, 2\}$, $f_1$ is $U(0, 1)$, and $f_2(x) = 1 + \sin(2k\pi x)$, $0 \le x \le 1$, for $k$ a positive integer. Consider the uniform prior on $\Theta$ and the 0-1 loss function. Show that the Bayes risk does not depend on the value of $k$.

19. Suppose that $X \sim f_\theta$, where $\theta \in \Theta = \{1, 2\}$, $f_1$ is uniform over a $d$-dimensional unit hypercube, and $f_2$ is uniform over the largest hypersphere inscribed in it. Consider the uniform prior on $\Theta$ and the 0-1 loss function. Show that the Bayes risk converges to 0 as $d \to \infty$.

20. Suppose that $X \sim N(\theta\mu, \sigma^2)$, where $\mu$ and $\sigma$ are known, and $\theta \in \Theta = \{1, 2\}$. Consider the uniform prior on $\Theta$ and the 0-1 loss function. Find the Bayes estimator for $\theta$ and check whether the corresponding Bayes risk depends on $\mu$ and $\sigma$. If $X$ and $HX$ have the same distribution for every orthogonal matrix $H$, can you find $\mu$ and $\theta$?
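Remark (not part of the assignment): the geometric fact behind Problem 19 is that the inscribed ball (radius $1/2$) has volume $V(d) = \pi^{d/2}(1/2)^d/\Gamma(d/2 + 1)$, which vanishes rapidly with $d$, so the two uniform densities concentrate on nearly disjoint regions. The short Python sketch below just tabulates $V(d)$:

```python
from math import pi, lgamma, exp, log

# Volume of the largest ball inscribed in the d-dimensional unit cube
# (radius 1/2), relevant to Problem 19:
#   V(d) = pi^(d/2) * (1/2)^d / Gamma(d/2 + 1).
# Log-space evaluation via lgamma avoids overflow for moderate d.

def inscribed_ball_volume(d: int) -> float:
    return exp(0.5 * d * log(pi) - d * log(2) - lgamma(0.5 * d + 1))

for d in (1, 2, 5, 10, 20, 50):
    print(f"d={d:>2}  volume = {inscribed_ball_volume(d):.3e}")
```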

The exam will be held on 18th October (Tuesday) at 2:15 P.M. You will be asked to solve 2 of these 20 problems, chosen at random, so two students may get two different sets of questions. You will get 40 minutes (the time cannot be extended) to solve these two problems.
