
MATH 448.01
Spring 2018
Quiz 4 (Take Home)
Due: April 18th, 2018

1. Let $X_1, X_2, \ldots, X_n$ be iid Geometric($\theta$), where $0 < \theta < 1$. Assume $n \ge 2$. That is,
$$P(X_1 = x) = \theta(1-\theta)^{x-1} \quad \text{for } x = 1, 2, 3, 4, \ldots$$

a. Find a sufficient and complete statistic for $\theta$.

The joint probability mass function factors as
$$f(x_1, x_2, \ldots, x_n; \theta) = \theta^n (1-\theta)^{\sum_{i=1}^n x_i - n} = \left(\frac{\theta}{1-\theta}\right)^n (1-\theta)^{\sum_{i=1}^n x_i} = u\!\left(\sum_{i=1}^n x_i; \theta\right) h(x_1, x_2, \ldots, x_n)$$
where
$$u\!\left(\sum_{i=1}^n x_i; \theta\right) = \left(\frac{\theta}{1-\theta}\right)^n (1-\theta)^{\sum_{i=1}^n x_i} \qquad \text{and} \qquad h(x_1, x_2, \ldots, x_n) = 1.$$
So $T(X_1, X_2, \ldots, X_n) = \sum_{i=1}^n X_i$ is sufficient by the Factorization Theorem. We can also write the probability mass function as
$$f(x; \theta) = \exp\left((x-1)\ln(1-\theta) + \ln\theta\right) = \exp\left(x\ln(1-\theta) + \ln\frac{\theta}{1-\theta}\right),$$
which puts it in the one-parameter exponential class with natural statistic $x$. From this, we can see that $T(X_1, X_2, \ldots, X_n) = \sum_{i=1}^n X_i$ is complete.
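As a quick numerical illustration of the factorization (not part of the original solution), here is a minimal Python sketch assuming numpy is available; the two sample vectors and the grid of $\theta$ values are arbitrary choices. Since $h \equiv 1$, two samples with the same total should give identical likelihoods for every $\theta$.

    import numpy as np

    # Sketch: the geometric likelihood depends on the data only through sum(x).
    def geom_likelihood(x, theta):
        x = np.asarray(x)
        return theta**len(x) * (1 - theta)**(x.sum() - len(x))

    x_a = [1, 4, 2, 5]          # arbitrary sample, total = 12
    x_b = [3, 3, 3, 3]          # different sample, same total = 12
    thetas = np.linspace(0.05, 0.95, 19)
    print(np.allclose(geom_likelihood(x_a, thetas), geom_likelihood(x_b, thetas)))  # True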

b. Find the Minimum Variance Unbiased Estimator (MVUE) for $\theta$.


We know that $\sum_{i=1}^n X_i \overset{d}{=} \mathrm{nb}(n, \theta)$, the negative binomial counting the number of trials needed for $n$ successes, with
$$P\!\left(\sum_{i=1}^n X_i = k\right) = \binom{k-1}{n-1}\theta^n(1-\theta)^{k-n}, \qquad k = n, n+1, \ldots$$
So if $n \ge 2$,
$$E\!\left(\frac{1}{\sum_{i=1}^n X_i - 1}\right) = \sum_{k=n}^{\infty}\frac{1}{k-1}\binom{k-1}{n-1}\theta^n(1-\theta)^{k-n}$$
$$= \sum_{k=n}^{\infty}\frac{\theta}{n-1}\binom{(k-1)-1}{(n-1)-1}\theta^{n-1}(1-\theta)^{(k-1)-(n-1)}$$
$$= \frac{\theta}{n-1}\sum_{l=n-1}^{\infty}\binom{l-1}{(n-1)-1}\theta^{n-1}(1-\theta)^{l-(n-1)}$$
$$= \frac{\theta}{n-1}.$$
Since $\sum_{i=1}^n X_i$ is complete and sufficient, the MVUE is
$$\hat{\theta} = \frac{n-1}{\sum_{i=1}^n X_i - 1}.$$

Can you calculate the variance of the MVUE, and can you find the Cramér-Rao Lower Bound for an unbiased estimator of $\theta$? All moments of $\hat{\theta}$ exist (as long as $n \ge 2$) because
$$\frac{n-1}{\sum_{i=1}^n X_i - 1} \le 1$$
with probability one. So the variance exists even for $n = 2$, unlike the exponential-distribution example, where we had to start at $n = 3$.
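As a sanity check (not part of the quiz), a small Monte Carlo sketch in Python with numpy; the values of $n$, $\theta$, and the number of replications are arbitrary illustrative choices. It estimates the mean of $\hat\theta$ (which should be close to $\theta$) and its finite empirical variance.

    import numpy as np

    # Sketch: Monte Carlo check that (n - 1) / (sum(X) - 1) is unbiased for theta.
    rng = np.random.default_rng(0)
    n, theta, reps = 5, 0.3, 200_000
    sums = rng.geometric(theta, size=(reps, n)).sum(axis=1)  # numpy geometric has support {1, 2, ...}
    theta_hat = (n - 1) / (sums - 1)
    print(theta_hat.mean())   # close to 0.3
    print(theta_hat.var())    # finite empirical variance, as argued above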
2. Let $X_1, X_2, \ldots, X_{2n-1}$ be iid $N(\mu, 1)$, where $\mu \in \mathbb{R}$ ($n \ge 1$). Note this is an odd number of random variables.

a. Show that $\tilde{\mu} = X_{(n)}$ (the sample median) is an unbiased estimator for $\mu$.

See the solution to Non-Textbook, #4 of Homework 3.

b. Determine $E\!\left(X_{(n)} \,\middle|\, \sum_{i=1}^{2n-1} X_i\right)$.
The joint density is
$$f(x_1, x_2, \ldots, x_{2n-1}; \mu) = (2\pi)^{-\frac{2n-1}{2}}\exp\!\left(-\frac{1}{2}\sum_{i=1}^{2n-1}(x_i - \mu)^2\right)$$
$$= (2\pi)^{-\frac{2n-1}{2}}\exp\!\left(-\frac{1}{2}\sum_{i=1}^{2n-1}x_i^2\right)\exp\!\left(\mu\sum_{i=1}^{2n-1}x_i - \frac{(2n-1)\mu^2}{2}\right) = u\!\left(\sum_{i=1}^{2n-1}x_i; \mu\right) h(x_1, \ldots, x_{2n-1})$$
where
$$u\!\left(\sum_{i=1}^{2n-1}x_i; \mu\right) = \exp\!\left(\mu\sum_{i=1}^{2n-1}x_i - \frac{(2n-1)\mu^2}{2}\right)$$
and
$$h(x_1, \ldots, x_{2n-1}) = (2\pi)^{-\frac{2n-1}{2}}\exp\!\left(-\frac{1}{2}\sum_{i=1}^{2n-1}x_i^2\right).$$
That shows $\sum_{i=1}^{2n-1} X_i$ is sufficient for $\mu$. We can write the density of $X_1$ as
$$f(x; \mu) = (2\pi)^{-\frac{1}{2}}\exp\!\left(-\frac{x^2}{2} - \frac{\mu^2}{2} + \mu x\right) = \exp\!\left(\mu x - \frac{x^2}{2} - \frac{\mu^2}{2} - \frac{1}{2}\ln(2\pi)\right).$$
This shows that the density is in the exponential class, which implies that $\sum_{i=1}^{2n-1} X_i$ is also complete. Further, $E\left(\bar{X}\right) = \mu$,
and by part (a),
$$E\!\left(E\!\left(X_{(n)} \,\middle|\, \sum_{i=1}^{2n-1} X_i\right)\right) = E\!\left(X_{(n)}\right) = \mu.$$
Both $\bar{X}$ and $E\!\left(X_{(n)} \mid \sum_{i=1}^{2n-1} X_i\right)$ are functions of $\sum_{i=1}^{2n-1} X_i$, and both are unbiased for $\mu$. By completeness of $\sum_{i=1}^{2n-1} X_i$,
$$P\!\left(E\!\left(X_{(n)} \,\middle|\, \sum_{i=1}^{2n-1} X_i\right) = \bar{X}\right) = 1$$
for all $\mu$. In other words,
$$E\!\left(X_{(n)} \,\middle|\, \sum_{i=1}^{2n-1} X_i\right) = \bar{X}.$$
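A rough numerical illustration of this identity (not part of the original solution): the sketch below, assuming Python with numpy, approximates the conditioning by grouping samples whose totals round to the same value, which is a crude stand-in for conditioning on the exact sum; the sample size, $\mu$, and bin values are arbitrary choices.

    import numpy as np

    # Sketch: given (approximately) the same value of sum(X), the average sample
    # median should match sum(X)/(2n-1), illustrating E(median | sum) = mean.
    rng = np.random.default_rng(1)
    m, mu = 2, 0.5                      # 2n-1 = 3 observations per sample; arbitrary mu
    size = 2 * m - 1
    x = rng.normal(mu, 1.0, size=(500_000, size))
    totals, medians = x.sum(axis=1), np.median(x, axis=1)
    bins = np.round(totals, 1)          # crude conditioning: group samples by rounded total
    for b in (-1.0, 0.0, 1.0, 2.0):     # a few arbitrary bin values
        sel = bins == b
        print(b / size, medians[sel].mean())   # conditional mean of median vs. sample mean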

3. Let $X_1, X_2, \ldots, X_n$ be iid $N(\mu, 1)$ random variables. Define $\theta = P(X_1 \le c)$, where $c$ is some constant. Find the MVUE for $\theta$.

We've previously shown (in Problem 2) that $\sum_{i=1}^n X_i$ is sufficient and complete for $\mu$, so $\bar{X}$ is also sufficient and complete for $\mu$. Note that
$$E\!\left(I_{\{x:\, x \le c\}}(X_1)\right) = P(X_1 \le c) = \theta,$$
so $I_{\{x:\, x \le c\}}(X_1)$ is an unbiased estimator of $\theta$. Observe that for every $(c_1, c_2) \in \mathbb{R}^2$,
$$c_1 X_1 + c_2\bar{X} = \left(c_1 + \frac{c_2}{n}\right)X_1 + \frac{c_2}{n}\sum_{i=2}^n X_i$$
is normally distributed, since it is a linear combination of independent normal random variables. That shows that $\left(X_1, \bar{X}\right)$ has a bivariate normal distribution with parameters
$$\sigma^2_{X_1} = 1, \qquad \sigma^2_{\bar{X}} = \frac{1}{n}, \qquad \mu_{X_1} = \mu_{\bar{X}} = \mu, \qquad \rho = \frac{\mathrm{Cov}\!\left(X_1, \bar{X}\right)}{1\cdot\frac{1}{\sqrt{n}}} = \sqrt{n}\,\mathrm{Cov}\!\left(X_1, \bar{X}\right).$$

And
$$\mathrm{Cov}\!\left(X_1, \bar{X}\right) = E\!\left(X_1\bar{X}\right) - \mu^2 = \frac{1 + \mu^2}{n} + \frac{(n-1)\mu^2}{n} - \mu^2 = \frac{1}{n}.$$
So
$$\rho = \frac{1}{\sqrt{n}}.$$

From a known theorem (see the auxiliary notes for a proof) on the bivariate normal distribution, it follows that given $\bar{X} = t$, $X_1$ has a normal distribution with mean
$$\mu_{X_1} + \rho\,\frac{\sigma_{X_1}}{\sigma_{\bar{X}}}\left(t - \mu_{\bar{X}}\right) = \mu + (t - \mu) = t$$
and variance
$$\sigma^2_{X_1}\left(1 - \rho^2\right) = \frac{n-1}{n}.$$
So therefore
$$E\!\left(I_{\{x:\, x \le c\}}(X_1) \,\middle|\, \bar{X} = t\right) = P\!\left(X_1 \le c \mid \bar{X} = t\right) = P\!\left(\frac{X_1 - t}{\sqrt{\frac{n-1}{n}}} \le \frac{c - t}{\sqrt{\frac{n-1}{n}}} \,\middle|\, \bar{X} = t\right) = \Phi\!\left(\sqrt{\frac{n}{n-1}}\,(c - t)\right).$$
By the Rao-Blackwell and Lehmann-Scheffé Theorems, the MVUE for $\theta$ is
$$\hat{\theta} = \Phi\!\left(\sqrt{\frac{n}{n-1}}\,\left(c - \bar{X}\right)\right).$$
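As a quick sanity check (not part of the original solution), a small Monte Carlo sketch assuming Python with numpy and scipy; the values of $n$, $\mu$, $c$, and the number of replications are arbitrary choices.

    import numpy as np
    from scipy.stats import norm

    # Sketch: Monte Carlo check that Phi(sqrt(n/(n-1)) * (c - Xbar)) is unbiased
    # for theta = P(X1 <= c) = Phi(c - mu).
    rng = np.random.default_rng(2)
    n, mu, c, reps = 4, 0.7, 1.0, 200_000          # arbitrary choices
    xbar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)
    theta_hat = norm.cdf(np.sqrt(n / (n - 1)) * (c - xbar))
    print(theta_hat.mean(), norm.cdf(c - mu))      # the two values should nearly agree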
4. Let $X_1, X_2, \ldots, X_n$ be iid $N(\theta_1, \theta_2)$ random variables, where $\theta_1 \in \mathbb{R}$ and $\theta_2 > 0$ (both unknown).

a. Find joint sufficient and complete statistics for $(\theta_1, \theta_2)$.

$$f(x; \theta_1, \theta_2) = (2\pi\theta_2)^{-\frac{1}{2}}\exp\!\left(-\frac{(x - \theta_1)^2}{2\theta_2}\right) = \exp\!\left(-\frac{x^2}{2\theta_2} + \frac{\theta_1}{\theta_2}x - \frac{\theta_1^2}{2\theta_2} - \frac{1}{2}\ln(2\pi\theta_2)\right)$$
From this, one can see by inspection that this is in the two-parameter exponential class. That implies
$$T_1 = \sum_{i=1}^n X_i, \qquad T_2 = \sum_{i=1}^n X_i^2$$
are jointly sufficient for $(\theta_1, \theta_2)$ and complete. Now consider this transformation:
$$V_1 = \bar{X} = \frac{T_1}{n}, \qquad V_2 = S^2 = \frac{T_2 - \frac{T_1^2}{n}}{n-1}.$$
This is a one-to-one transformation because we can solve for $(T_1, T_2)$. By a theorem we discussed in class, it follows that $\left(\bar{X}, S^2\right)$ are also sufficient and complete for $(\theta_1, \theta_2)$. This pair is more useful for part (b) than the first suggested pair.
b. Using (a), find the MVUE for $\theta_1 - \sqrt{\theta_2}$.

We know that $\bar{X}$ and $S^2$ are independent. We also know that
$$\frac{(n-1)S^2}{\theta_2} \overset{d}{=} \chi^2_{n-1} \overset{d}{=} \mathrm{Gamma}\!\left(\frac{n-1}{2},\, 2\right).$$

We need to find $E(S)$. (And we've done this exercise before, in Chapter 8.) Substituting
$$t = \frac{(n-1)S^2}{\theta_2}, \qquad S = \sqrt{\frac{\theta_2}{n-1}}\sqrt{t},$$
we get
$$E(S) = \sqrt{\frac{\theta_2}{n-1}}\int_0^{\infty}\frac{\sqrt{t}\,t^{\frac{n-1}{2}-1}\exp\!\left(-\frac{t}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)2^{\frac{n-1}{2}}}\,dt$$
$$= \sqrt{\frac{\theta_2}{n-1}}\,\frac{2^{\frac{n}{2}}\Gamma\!\left(\frac{n}{2}\right)}{2^{\frac{n-1}{2}}\Gamma\!\left(\frac{n-1}{2}\right)}\int_0^{\infty}\frac{t^{\frac{n}{2}-1}\exp\!\left(-\frac{t}{2}\right)}{\Gamma\!\left(\frac{n}{2}\right)2^{\frac{n}{2}}}\,dt$$
$$= \sqrt{\frac{2}{n-1}}\,\frac{\Gamma\!\left(\frac{n}{2}\right)}{\Gamma\!\left(\frac{n-1}{2}\right)}\sqrt{\theta_2}.$$

So by independence,
$$E\!\left(\bar{X} - \sqrt{\frac{n-1}{2}}\,\frac{\Gamma\!\left(\frac{n-1}{2}\right)}{\Gamma\!\left(\frac{n}{2}\right)}\,S\right) = \theta_1 - \sqrt{\theta_2}.$$
Since
$$\bar{X} - \sqrt{\frac{n-1}{2}}\,\frac{\Gamma\!\left(\frac{n-1}{2}\right)}{\Gamma\!\left(\frac{n}{2}\right)}\,S$$
is a function of $\left(\bar{X}, S^2\right)$ and it is unbiased, it must be the MVUE for $\theta_1 - \sqrt{\theta_2}$.
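A quick Monte Carlo sanity check (not part of the original solution), assuming Python with numpy; the values of $n$, $\theta_1$, $\theta_2$, and the number of replications are arbitrary choices.

    import numpy as np
    from math import gamma, sqrt

    # Sketch: check that Xbar - sqrt((n-1)/2) * Gamma((n-1)/2)/Gamma(n/2) * S
    # is unbiased for theta1 - sqrt(theta2).
    rng = np.random.default_rng(3)
    n, theta1, theta2, reps = 6, 2.0, 4.0, 200_000           # arbitrary choices
    x = rng.normal(theta1, sqrt(theta2), size=(reps, n))
    xbar, s = x.mean(axis=1), x.std(axis=1, ddof=1)          # ddof=1 gives the sample S
    k = sqrt((n - 1) / 2) * gamma((n - 1) / 2) / gamma(n / 2)
    print((xbar - k * s).mean(), theta1 - sqrt(theta2))      # should nearly agree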

5. Let $X_1, X_2, \ldots, X_n$ be iid with common density
$$f(x; \theta, \lambda) = \begin{cases}\dfrac{1}{\lambda}\exp\!\left(-\dfrac{x-\theta}{\lambda}\right) & x \ge \theta\\[1ex] 0 & x < \theta\end{cases}$$
a. Find joint sufficient and complete statistics for $(\theta, \lambda)$.

$$f(x_1, x_2, \ldots, x_n; \theta, \lambda) = \lambda^{-n}\exp\!\left(-\frac{1}{\lambda}\sum_{i=1}^n x_i + \frac{n\theta}{\lambda}\right)I_{\{x:\, x \ge \theta\}}\!\left(x_{(1)}\right) = u\!\left(x_{(1)}, \sum_{i=1}^n x_i; \theta, \lambda\right) h(x_1, x_2, \ldots, x_n)$$
where
$$u\!\left(x_{(1)}, \sum_{i=1}^n x_i; \theta, \lambda\right) = \lambda^{-n}\exp\!\left(-\frac{1}{\lambda}\sum_{i=1}^n x_i + \frac{n\theta}{\lambda}\right)I_{\{x:\, x \ge \theta\}}\!\left(x_{(1)}\right)$$
and
$$h(x_1, x_2, \ldots, x_n) = 1.$$
So $\left(X_{(1)}, \sum_{i=1}^n X_i\right)$ are jointly sufficient for $(\theta, \lambda)$. To show completeness, consider the one-to-one transformation

$$V_1 = nX_{(1)}$$
and
$$V_2 = \sum_{i=1}^n X_i - nX_{(1)} = \sum_{j=1}^{n-1}(n-j)\left(X_{(j+1)} - X_{(j)}\right).$$
If we can show $(V_1, V_2)$ is complete, then it follows that $\left(X_{(1)}, \sum_{i=1}^n X_i\right)$ is also complete. Now, by the independent spacings theorem,
$$(n-j)\left(X_{(j+1)} - X_{(j)}\right) \overset{d}{=} \exp(\lambda)$$
for $1 \le j \le n-1$, and these $n-1$ random variables are independent. Therefore,
$$V_2 \overset{d}{=} \mathrm{Gamma}(n-1, \lambda).$$
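A simulation sketch of these spacing facts (not part of the original solution), assuming Python with numpy; the values of $n$, $\theta$, $\lambda$, and the number of replications are arbitrary choices. Each normalized spacing should have mean $\lambda$, and $V_2$ should have the Gamma$(n-1, \lambda)$ mean $(n-1)\lambda$.

    import numpy as np

    # Sketch: check the spacings facts for the shifted exponential f(x; theta, lambda).
    rng = np.random.default_rng(4)
    n, theta, lam, reps = 5, 1.5, 2.0, 200_000                   # arbitrary choices
    x = np.sort(theta + rng.exponential(lam, size=(reps, n)), axis=1)
    spacings = np.arange(n - 1, 0, -1) * np.diff(x, axis=1)      # (n-j)*(X_(j+1)-X_(j)), j=1..n-1
    print(spacings.mean(axis=0))                                 # each close to lambda = 2.0
    v2 = x.sum(axis=1) - n * x[:, 0]
    print(v2.mean(), (n - 1) * lam)                              # mean of Gamma(n-1, lambda)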
We also know (from the same theorem) that
$$n\left(X_{(1)} - \theta\right) \overset{d}{=} \exp(\lambda).$$
So
$$P\!\left(nX_{(1)} \le v_1\right) = P\!\left(n\left(X_{(1)} - \theta\right) \le v_1 - n\theta\right) = 1 - \exp\!\left(-\frac{v_1 - n\theta}{\lambda}\right)$$
and
$$g_{V_1}(v_1) = \frac{1}{\lambda}\exp\!\left(-\frac{v_1 - n\theta}{\lambda}\right)$$
for $v_1 \ge n\theta$. So the joint density of $(V_1, V_2)$ is
$$g_{V_1,V_2}(v_1, v_2) = \frac{1}{\lambda}\exp\!\left(-\frac{v_1 - n\theta}{\lambda}\right)\frac{v_2^{n-2}\exp\!\left(-\frac{v_2}{\lambda}\right)}{\lambda^{n-1}\Gamma(n-1)} = \frac{1}{\lambda^n}\,\frac{\exp\!\left(-\frac{v_1 - n\theta}{\lambda}\right)v_2^{n-2}\exp\!\left(-\frac{v_2}{\lambda}\right)}{\Gamma(n-1)}$$
for $v_1 \ge n\theta$ and $v_2 \ge 0$. Now suppose $h: \mathbb{R}^2 \to \mathbb{R}$ is a function such that for all $\theta \in \mathbb{R}$ and all $\lambda > 0$,
$$E\left(h(V_1, V_2)\right) = 0.$$
Written in terms of a double integral, this is

$$\int_{n\theta}^{\infty}\int_0^{\infty}h(v_1, v_2)\,\frac{1}{\lambda^n}\,\frac{\exp\!\left(-\frac{v_1 - n\theta}{\lambda}\right)v_2^{n-2}\exp\!\left(-\frac{v_2}{\lambda}\right)}{\Gamma(n-1)}\,dv_2\,dv_1 = 0$$
or
$$\int_{n\theta}^{\infty}\int_0^{\infty}h(v_1, v_2)\exp\!\left(-\frac{v_1}{\lambda}\right)v_2^{n-2}\exp\!\left(-\frac{v_2}{\lambda}\right)dv_2\,dv_1 = 0.$$

Using the Fundamental Theorem of Calculus and differentiating both sides with respect to $\theta$, we have
$$-n\exp\!\left(-\frac{n\theta}{\lambda}\right)\int_0^{\infty}h(n\theta, v_2)\,v_2^{n-2}\exp\!\left(-\frac{v_2}{\lambda}\right)dv_2 = 0$$
or
$$\int_0^{\infty}h(n\theta, v_2)\,v_2^{n-2}\exp\!\left(-\frac{v_2}{\lambda}\right)dv_2 = 0$$
or
$$\int_0^{\infty}h(n\theta, v_2)\,\frac{v_2^{n-2}\exp\!\left(-\frac{v_2}{\lambda}\right)}{\lambda^{n-1}\Gamma(n-1)}\,dv_2 = 0.$$

The last integral, written in terms of expectations, is
$$E\!\left(h\!\left(n\theta, \sum_{i=1}^{n-1}X_i\right)\right) = 0$$
for all $\lambda > 0$, where here $X_1, \ldots, X_{n-1}$ denote iid exponential($\lambda$) random variables, so that their sum has the same Gamma$(n-1, \lambda)$ density as $V_2$. But since the exponential($\lambda$) density is in the exponential class, $\sum_{i=1}^{n-1}X_i$ is complete over the family of densities indexed by $\lambda > 0$. So therefore
$$P\!\left(h\!\left(n\theta, \sum_{i=1}^{n-1}X_i\right) = 0\right) = 1$$

for all $\lambda > 0$. This is equivalent to
$$h(n\theta, v_2) = 0$$
for all $v_2 > 0$. Since this argument works for any $\theta \in \mathbb{R}$, it follows that
$$h(v_1, v_2) = 0$$
for $v_1 \in \mathbb{R}$ and $v_2 > 0$. Putting this all together, we can conclude that
$$P\left(h(V_1, V_2) = 0\right) = 1$$
for all $\theta \in \mathbb{R}$ and $\lambda > 0$, and thus $(V_1, V_2)$ is complete. In turn, this means $\left(X_{(1)}, \sum_{i=1}^n X_i\right)$ is complete, since it is a function of $(V_1, V_2)$ (a one-to-one correspondence).

b. Find the MVUE for $\theta$ and $\lambda$.

Since $n\left(X_{(1)} - \theta\right) \overset{d}{=} \exp(\lambda)$, we have $E\left(X_{(1)} - \theta\right) = \frac{\lambda}{n}$, so
$$E\!\left(X_{(1)}\right) = \theta + \frac{\lambda}{n}$$
$$E\!\left(\bar{X}\right) = \theta + \lambda$$
Multiply the top equation by negative one and add it to the bottom one to obtain
$$E\!\left(\bar{X}\right) - E\!\left(X_{(1)}\right) = \frac{n-1}{n}\lambda.$$
So the MVUE for $\lambda$ is
$$\hat{\lambda} = \frac{n}{n-1}\bar{X} - \frac{n}{n-1}X_{(1)}.$$
Now multiply the top equation by $n$ and subtract the bottom one to obtain
$$nE\!\left(X_{(1)}\right) - E\!\left(\bar{X}\right) = (n-1)\theta.$$
So the MVUE for $\theta$ is
$$\hat{\theta} = \frac{n}{n-1}X_{(1)} - \frac{1}{n-1}\bar{X}.$$
Both estimators are unbiased functions of the complete and sufficient pair $\left(X_{(1)}, \sum_{i=1}^n X_i\right)$, so they are the MVUEs by the Lehmann-Scheffé Theorem.
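A final Monte Carlo sanity check (not part of the original solution), assuming Python with numpy; the values of $n$, $\theta$, $\lambda$, and the number of replications are arbitrary choices.

    import numpy as np

    # Sketch: check that the two estimators above are unbiased for the shifted
    # exponential with location theta and scale lambda.
    rng = np.random.default_rng(5)
    n, theta, lam, reps = 5, 1.5, 2.0, 200_000                  # arbitrary choices
    x = theta + rng.exponential(lam, size=(reps, n))
    xbar, xmin = x.mean(axis=1), x.min(axis=1)
    lam_hat = n / (n - 1) * (xbar - xmin)
    theta_hat = n / (n - 1) * xmin - 1 / (n - 1) * xbar
    print(lam_hat.mean(), theta_hat.mean())                     # close to 2.0 and 1.5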
