P = p Σ_{i=0}^∞ (1 - p)^{2i} = p / [1 - (1 - p)^2] = p / (2p - p^2) = 1/(2 - p).
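As a quick numerical sanity check (a sketch, not part of the original solution; the function name is mine), the partial sums of the series p Σ_{i≥0} (1 - p)^{2i} can be compared against the closed form 1/(2 - p):

```python
# Numerically verify p * sum_{i>=0} (1-p)^(2i) = 1/(2-p) for a few values of p.
def series_sum(p, terms=10_000):
    """Partial sum of p * (1-p)^(2i) over i = 0, ..., terms-1."""
    return sum(p * (1 - p) ** (2 * i) for i in range(terms))

for p in (0.1, 0.5, 0.9):
    assert abs(series_sum(p) - 1 / (2 - p)) < 1e-9
```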
2. Suppose that all n men at a party throw their hats in the center of the room. Each
man then randomly selects a hat. What's the probability that at least one of the
men gets his own hat? What happens as n → ∞?
Solution: Let Ai be the event that man i gets his own hat, for i = 1, 2, . . . , n.
Then, by inclusion-exclusion,

P(at least one of the men gets his own hat)
= P(A1 ∪ A2 ∪ ··· ∪ An)
= Σ_{i=1}^n P(Ai) - Σ_{i<j} P(Ai ∩ Aj) + Σ_{i<j<k} P(Ai ∩ Aj ∩ Ak) - ··· + (-1)^{n+1} P(A1 ∩ A2 ∩ ··· ∩ An)
= n · (1/n) - C(n, 2) · 1/[n(n - 1)] + C(n, 3) · 1/[n(n - 1)(n - 2)] - ··· + (-1)^{n+1} · 1/n!
= 1 - 1/2! + 1/3! - ··· + (-1)^{n+1} · 1/n!
→ 1 - 1/e as n → ∞.
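The alternating series 1 - 1/2! + 1/3! - ··· ± 1/n! can be checked against a brute-force count over all n! hat assignments (a small sketch; function names are mine):

```python
import math
from itertools import permutations

def match_prob_exact(n):
    """Exact P(at least one man gets his own hat), enumerating all n! assignments."""
    perms = list(permutations(range(n)))
    hits = sum(any(perm[i] == i for i in range(n)) for perm in perms)
    return hits / len(perms)

def match_prob_series(n):
    """Inclusion-exclusion partial sum: sum_{k=1}^n (-1)^(k+1)/k!."""
    return sum((-1) ** (k + 1) / math.factorial(k) for k in range(1, n + 1))

for n in range(1, 8):
    assert abs(match_prob_exact(n) - match_prob_series(n)) < 1e-12

# As n grows, the series approaches 1 - 1/e ~ 0.6321.
assert abs(match_prob_series(20) - (1 - 1 / math.e)) < 1e-12
```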
3. A fair coin is continually tossed. What's the probability that the pattern THHH
occurs before the pattern HHHH?
Solution: When I first did this problem, I did it a very general, beautiful way
that ended up taking me 30 minutes. Then I saw the trivial answer! Namely, the
only way for HHHH to occur first is if you get that pattern on your first four
flips; otherwise, you are guaranteed that THHH will occur first. Therefore, the
answer is 15/16.
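A simulation sketch of the race between the two patterns (the seed is fixed for reproducibility; the function name is mine):

```python
import random

def thhh_wins_first(rng):
    """Toss a fair coin until THHH or HHHH first appears; True if THHH comes first."""
    window = ""
    while True:
        window = (window + rng.choice("HT"))[-4:]  # sliding window of last 4 tosses
        if window == "THHH":
            return True
        if window == "HHHH":
            return False

rng = random.Random(2024)
trials = 100_000
wins = sum(thhh_wins_first(rng) for _ in range(trials))
# Theoretical answer: 15/16 = 0.9375.
assert abs(wins / trials - 15 / 16) < 0.005
```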
4. A gambler has in his pocket a fair coin and a two-headed coin.
(a) He selects one of the coins at random, and when he flips it, it comes up heads.
What's the probability that it's the fair coin?
Solution: Let F, U denote fair and unfair, respectively. We use Bayes' Rule
to find
Pr(F|H) = Pr(H|F)Pr(F) / [Pr(H|F)Pr(F) + Pr(H|U)Pr(U)]
= (1/2)(1/2) / [(1/2)(1/2) + (1)(1/2)]
= 1/3.
(b) Suppose that he flips the coin n times, and it comes up heads each time.
What's the probability that it's fair?
Solution: As above,
Pr(F|HH···H) = Pr(HH···H|F)Pr(F) / [Pr(HH···H|F)Pr(F) + Pr(HH···H|U)Pr(U)]
= (1/2)^n (1/2) / [(1/2)^n (1/2) + (1)(1/2)]
= 1/(2^n + 1).
5. A fair die is rolled 7 times. What's the probability that each face appears at
least once?
Solution: Exactly one face must appear twice. Choose that face (6 ways), choose
which 2 of the 7 rolls show it (C(7, 2) ways), and arrange the other 5 faces among
the remaining 5 rolls (5! ways). Thus
Pr(each face appears at least once) = C(7, 2) · 6 · 5!/6^7 = 15120/279936 ≈ 0.054.
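With only 6^7 = 279,936 outcomes, the probability that all six faces of a die appear in 7 rolls can be verified exhaustively (a sketch):

```python
from itertools import product
from math import comb, factorial

# Count sequences of 7 die rolls in which all six faces appear.
favorable = sum(len(set(rolls)) == 6 for rolls in product(range(1, 7), repeat=7))
total = 6 ** 7

# Counting argument: choose the repeated face, its two positions, arrange the rest.
assert favorable == comb(7, 2) * 6 * factorial(5)   # 15120
assert abs(favorable / total - 0.054) < 0.001
```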
6. If X is a nonnegative continuous random variable, and g is a differentiable function
with g(0) = 0, prove that E[g(X)] = ∫_0^∞ g'(t) Pr(X > t) dt. [We'll also assume that
E[g(X)] is finite.]
Solution: By the Law of the Unconscious Statistician,
E[g(X)] = ∫_0^∞ g(x) f(x) dx
= ∫_0^∞ [g(x) - g(0)] f(x) dx   (since g(0) = 0)
= ∫_0^∞ f(x) ∫_0^x g'(t) dt dx
= ∫_0^∞ ∫_0^x g'(t) f(x) dt dx
= ∫_0^∞ g'(t) ∫_t^∞ f(x) dx dt   (switching the order of integration)
= ∫_0^∞ g'(t) Pr(X > t) dt.
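A numeric sketch of the identity E[g(X)] = ∫_0^∞ g'(t) Pr(X > t) dt, taking X ~ Exp(1) and g(x) = x^2 as an illustrative choice (so both sides equal E[X^2] = 2, and the integrand is 2t·e^{-t}):

```python
import math

def rhs(upper=50.0, steps=200_000):
    """Midpoint-rule approximation of the integral of 2t * exp(-t) over [0, upper]."""
    h = upper / steps
    return sum(2 * (i + 0.5) * h * math.exp(-(i + 0.5) * h) * h
               for i in range(steps))

# For X ~ Exp(1), E[X^2] = 2 exactly; the truncated integral should agree.
assert abs(rhs() - 2.0) < 1e-3
```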
7. Suppose that X1, X2, . . . , Xn are i.i.d. Exp(λ). Find the distributions of the
minimum Y and the maximum Z.
Solution: For y > 0,
P(Y > y) = P(X1 > y, X2 > y, . . . , Xn > y) = Π_{i=1}^n P(Xi > y) = e^{-nλy}.
This implies that the p.d.f. of Y is g(y) = nλe^{-nλy} for y > 0; and so Y ∼ Exp(nλ).
Similarly, for z > 0,
P(Z ≤ z) = Π_{i=1}^n P(Xi ≤ z) = [1 - e^{-λz}]^n.
This implies that the p.d.f. of Z is h(z) = nλe^{-λz}[1 - e^{-λz}]^{n-1} for z > 0.
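A simulation sketch of both facts (the minimum of n i.i.d. Exp(λ) is Exp(nλ), and the maximum has c.d.f. (1 - e^{-λz})^n); the seed and the choices λ = 2, n = 5 are mine:

```python
import math
import random

rng = random.Random(7)
lam, n, trials = 2.0, 5, 100_000

mins = [min(rng.expovariate(lam) for _ in range(n)) for _ in range(trials)]
maxs = [max(rng.expovariate(lam) for _ in range(n)) for _ in range(trials)]

# Y = min should be Exp(n*lam): mean 1/(n*lam) = 0.1.
assert abs(sum(mins) / trials - 1 / (n * lam)) < 0.005

# Z = max: P(Z <= z) = (1 - exp(-lam*z))^n; check the c.d.f. at z = 1.
emp_cdf = sum(z <= 1.0 for z in maxs) / trials
assert abs(emp_cdf - (1 - math.exp(-lam)) ** n) < 0.01
```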
8. Calculate the m.g.f. of the Unif(a, b) distribution and use it to calculate the mean
and variance.
Solution: If X ∼ Unif(a, b), then the m.g.f. is
MX(t) = E[e^{tX}] = ∫_a^b e^{tx}/(b - a) dx = (e^{bt} - e^{at}) / [t(b - a)], for t ≠ 0.
Differentiating (and taking the limit as t → 0, e.g., via a Taylor expansion of the
m.g.f.) gives E[X] = (a + b)/2. Similarly (but more tediously), you can calculate
E[X^2] and then Var(X) = (b - a)^2/12.
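The moments can be recovered numerically from the m.g.f. by finite-difference derivatives at t = 0 (a sketch; the endpoints a = 2, b = 5 and step size are my choices):

```python
import math

a, b = 2.0, 5.0

def mgf(t):
    """M_X(t) for X ~ Unif(a, b); the removable singularity at t = 0 equals 1."""
    if t == 0:
        return 1.0
    return (math.exp(b * t) - math.exp(a * t)) / (t * (b - a))

h = 1e-3
m1 = (mgf(h) - mgf(-h)) / (2 * h)               # ~ E[X]
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h ** 2   # ~ E[X^2]

assert abs(m1 - (a + b) / 2) < 1e-4                    # mean (a+b)/2 = 3.5
assert abs((m2 - m1 ** 2) - (b - a) ** 2 / 12) < 1e-3  # variance (b-a)^2/12 = 0.75
```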
9. Show that the sum of i.i.d. exponential random variables is a gamma random
variable.
Solution: Let X1, X2, . . . , Xn be i.i.d. Exp(λ). Then from class, we know that the
m.g.f. of Xi is MXi(t) = λ/(λ - t), i = 1, 2, . . . , n; and so the m.g.f. of Y = Σ_{i=1}^n Xi
is
MY(t) = [MXi(t)]^n = [λ/(λ - t)]^n, for t < λ.
Meanwhile, the p.d.f. of a Gamma(n, λ) (or Erlang) random variable Z is given by
g(z) = λ^n z^{n-1} e^{-λz}/Γ(n), for z > 0.
Its m.g.f. is therefore
MZ(t) = ∫_0^∞ e^{tz} λ^n z^{n-1} e^{-λz}/Γ(n) dz
= [λ^n/Γ(n)] ∫_0^∞ z^{n-1} e^{-(λ-t)z} dz
= {λ^n/[(λ - t)^n Γ(n)]} ∫_0^∞ u^{n-1} e^{-u} du   (where u = (λ - t)z, with t < λ)
= [λ/(λ - t)]^n   (by definition of Γ(n)).
Since MY(t) = MZ(t), the uniqueness of m.g.f.s gives our result.
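A simulation sketch: a sum of n i.i.d. Exp(λ) draws should match the Gamma(n, λ) mean n/λ and variance n/λ^2 (seed and parameters n = 4, λ = 0.5 are mine):

```python
import random

rng = random.Random(42)
n, lam, trials = 4, 0.5, 200_000

sums = [sum(rng.expovariate(lam) for _ in range(n)) for _ in range(trials)]
mean = sum(sums) / trials
var = sum((s - mean) ** 2 for s in sums) / trials

# Gamma(n, lam) has mean n/lam = 8 and variance n/lam^2 = 16.
assert abs(mean - n / lam) < 0.1
assert abs(var - n / lam ** 2) < 0.5
```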
10. Suppose that U ∼ Unif(0, 1). Find the p.d.f. of Y = -(1/λ) ℓn(U).
Solution: The c.d.f. of Y = -(1/λ) ℓn(U) is
Pr(Y ≤ y) = Pr(-(1/λ) ℓn(U) ≤ y)
= Pr(ℓn(U) ≥ -λy)
= Pr(U ≥ e^{-λy})
= 1 - e^{-λy}.
This immediately implies that Y ∼ Exp(λ); and so the p.d.f. is g(y) = λe^{-λy}, for
y > 0.
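This is exactly inverse-transform sampling for the exponential distribution; a sketch (seed and λ = 2 are mine; 1 - U is used so the argument of the log is never zero):

```python
import math
import random

rng = random.Random(1)
lam, trials = 2.0, 100_000

# Inverse transform: -ln(U)/lam with U ~ Unif(0, 1) yields Exp(lam) samples.
samples = [-math.log(1.0 - rng.random()) / lam for _ in range(trials)]

assert abs(sum(samples) / trials - 1 / lam) < 0.01       # mean 1/lam = 0.5
emp_cdf = sum(y <= 0.5 for y in samples) / trials
assert abs(emp_cdf - (1 - math.exp(-lam * 0.5))) < 0.01  # c.d.f. 1 - e^{-lam*y}
```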
11. Suppose X, Y have joint p.d.f. f(x, y) = cxy for 0 < x < y < 1, for some constant c.
Find Corr(X, Y).
Solution: First of all, note that
1 = ∫∫ f(x, y) dx dy = ∫_0^1 ∫_0^y cxy dx dy = c/8,
so that c = 8. The marginal p.d.f.s are
fX(x) = ∫ f(x, y) dy = ∫_x^1 8xy dy = 4(x - x^3), 0 < x < 1,
fY(y) = ∫ f(x, y) dx = ∫_0^y 8xy dx = 4y^3, 0 < y < 1.
Then
E[X] = ∫ x fX(x) dx = ∫_0^1 4(x^2 - x^4) dx = 8/15,
E[X^2] = ∫ x^2 fX(x) dx = ∫_0^1 4(x^3 - x^5) dx = 1/3,
Var(X) = E[X^2] - (E[X])^2 = 1/3 - 64/225 = 11/225,
E[Y] = ∫ y fY(y) dy = ∫_0^1 4y^4 dy = 4/5,
E[Y^2] = ∫ y^2 fY(y) dy = ∫_0^1 4y^5 dy = 2/3,
Var(Y) = E[Y^2] - (E[Y])^2 = 2/3 - 16/25 = 2/75.
Further,
E[XY] = ∫∫ xy f(x, y) dx dy = ∫_0^1 ∫_0^y 8x^2 y^2 dx dy = 4/9.
Finally,
Corr(X, Y) = Cov(X, Y)/√(Var(X)Var(Y))
= (E[XY] - E[X]E[Y])/√(Var(X)Var(Y))
= [4/9 - (8/15)(4/5)]/√[(11/225)(2/75)]
= 0.4924.
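A numeric sketch: approximate every moment of f(x, y) = 8xy on the triangle 0 < x < y < 1 with a midpoint Riemann sum and recompute the correlation (grid size is my choice):

```python
import math

N = 800          # grid resolution; midpoints keep points off the boundary
h = 1.0 / N
mass = ex = ey = exx = eyy = exy = 0.0
for i in range(N):
    y = (i + 0.5) * h
    for j in range(i):          # x-midpoint (j+0.5)h < y exactly when j < i
        x = (j + 0.5) * h
        w = 8 * x * y * h * h   # f(x, y) dx dy
        mass += w
        ex += x * w; ey += y * w
        exx += x * x * w; eyy += y * y * w; exy += x * y * w

# Normalize by the numerical mass to cancel boundary discretization bias.
ex /= mass; ey /= mass; exx /= mass; eyy /= mass; exy /= mass
corr = (exy - ex * ey) / math.sqrt((exx - ex ** 2) * (eyy - ey ** 2))
assert abs(corr - 0.4924) < 0.01
```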
12. Use Chebychev's inequality to prove the WLLN, i.e., if X1, X2, . . . , Xn are i.i.d.
with mean μ and finite variance σ^2, then for any ε > 0, we have
Pr(|X̄ - μ| > ε) → 0 as n → ∞.
Solution: (This was done in class.) By Chebychev, we have
Pr(|X̄ - μ| > ε) ≤ Var(X̄)/ε^2 = Var(Xi)/(nε^2) = σ^2/(nε^2) → 0 as n → ∞.
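A simulation sketch of the WLLN with Exp(1) variables (so μ = 1); the seed, ε, and sample sizes are my choices:

```python
import random

def tail_prob(n, eps=0.1, reps=2000, rng=random.Random(3)):
    """Estimate P(|sample mean of n Exp(1) draws - 1| > eps)."""
    count = 0
    for _ in range(reps):
        xbar = sum(rng.expovariate(1.0) for _ in range(n)) / n
        if abs(xbar - 1.0) > eps:
            count += 1
    return count / reps

p_small, p_large = tail_prob(10), tail_prob(1000)
assert p_large < p_small   # the tail probability shrinks as n grows
assert p_large < 0.01      # well under the Chebychev bound 1/(n*eps^2) = 0.1
```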
13. Suppose that X1, X2, . . . , X10 are i.i.d. Pois(1), and let Y = Σ_{i=1}^{10} Xi. Bound
Pr(Y ≥ 15) using Markov's inequality, and then approximate it using the CLT.
Solution: By Markov's inequality,
Pr(Y ≥ 15) ≤ E[Y]/15 = 10E[Xi]/15 = 2/3.
Meanwhile, by the CLT,
Pr(Y ≥ 15) = Pr[(Y - E[Y])/√Var(Y) ≥ (15 - E[Y])/√Var(Y)]
≈ Pr[Nor(0, 1) ≥ (15 - 10)/√10]
= 1 - Φ(1.581) = 0.0569.
Note that you might want to improve upon the above solution by employing a
continuity correction to take into account the fact that the Poisson is a discrete
distribution. This would result in the slightly different approximation
Pr(Y ≥ 15) = Pr(Y ≥ 14.5) ≈ Pr[Nor(0, 1) ≥ (14.5 - 10)/√10] = 1 - Φ(1.423) = 0.0774.
Of course, if you really want to check your answer, you can do so exactly, by
noting that Y ∼ Pois(10). Then
Pr(Y ≥ 15) = 1 - Σ_{y=0}^{14} e^{-10}(10)^y/y! = 0.08346.
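All three numbers (plain CLT, continuity-corrected CLT, and the exact Poisson tail) can be reproduced, using Φ expressed via the error function (a sketch):

```python
import math

def phi(z):
    """Standard normal c.d.f. via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu = 10.0
plain = 1 - phi((15 - mu) / math.sqrt(mu))        # CLT, no correction
corrected = 1 - phi((14.5 - mu) / math.sqrt(mu))  # with continuity correction
exact = 1 - sum(math.exp(-mu) * mu ** y / math.factorial(y) for y in range(15))

assert abs(plain - 0.0569) < 1e-3
assert abs(corrected - 0.0774) < 1e-3
assert abs(exact - 0.08346) < 1e-4
```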
14. Show that lim_{n→∞} e^{-n} Σ_{k=0}^n n^k/k! = 1/2.
Solution: Let X1, X2, . . . , Xn be i.i.d. Pois(1), so that Y = Σ_{i=1}^n Xi ∼ Pois(n)
with E[Y] = Var(Y) = n. Then by the CLT,
e^{-n} Σ_{k=0}^n n^k/k! = Pr(Y ≤ n)
≈ Pr[Nor(0, 1) ≤ (n - E[Y])/√Var(Y)]
= Pr(Nor(0, 1) ≤ 0) = 0.5.
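The convergence of e^{-n} Σ_{k=0}^n n^k/k! = P(Pois(n) ≤ n) toward 1/2 can be seen numerically; computing each term with a running product avoids factorial overflow (a sketch; convergence is slow, of order n^{-1/2}):

```python
import math

def poisson_cdf_at_n(n):
    """e^{-n} * sum_{k=0}^{n} n^k / k!, i.e., P(Pois(n) <= n)."""
    term = math.exp(-n)   # k = 0 term
    total = term
    for k in range(1, n + 1):
        term *= n / k     # n^k/k! built incrementally
        total += term
    return total

vals = {n: poisson_cdf_at_n(n) for n in (1, 10, 100, 500)}
# Values decrease toward 1/2: e.g. ~0.7358 at n = 1.
assert abs(vals[500] - 0.5) < abs(vals[10] - 0.5)
assert abs(vals[500] - 0.5) < 0.02
```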
15. Two vendors offer functionally identical products with mean lifetime 10 months.
The distribution of the lifetime of the product from the first vendor is Exp(λ),
while the distribution of the lifetime of the product from the second vendor is
Erlang_2(μ). If the objective is to maximize the probability that the lifetime of a
product is greater than 8 months, which of the two vendors should be chosen?
Solution: Let X1 and X2 be lifetimes of products from the first and second vendors,
respectively. Since X1 ∼ Exp(λ) and X2 ∼ Erlang_2(μ), we have E[X1] = 1/λ = 10
and E[X2] = 2/μ = 10. Thus, λ = 0.1 and μ = 0.2. This immediately implies that
P(X1 ≥ 8) = e^{-0.1·8} = 0.449,
P(X2 ≥ 8) = Σ_{i=0}^1 e^{-0.2·8} (0.2·8)^i/i! = 2.6e^{-1.6} = 0.525.
Since 0.525 > 0.449, choose the second vendor.
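A quick check of both survival probabilities (a sketch; function names are mine):

```python
import math

def exp_survival(lam, t):
    """P(X > t) for X ~ Exp(lam)."""
    return math.exp(-lam * t)

def erlang2_survival(mu, t):
    """P(X > t) for X ~ Erlang_2(mu): e^{-mu*t} * (1 + mu*t)."""
    return math.exp(-mu * t) * (1 + mu * t)

p1 = exp_survival(0.1, 8)      # vendor 1
p2 = erlang2_survival(0.2, 8)  # vendor 2
assert abs(p1 - 0.449) < 1e-3
assert abs(p2 - 0.525) < 1e-3
assert p2 > p1                 # choose the second vendor
```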
Σ_{i=2}^3 C(3, i) p^i (1 - p)^{3-i} = 3e^{-2t} - 2e^{-3t}, where p = e^{-t},
and
0·p^3 + 75·C(3, 2) p^2 (1 - p) + (75·2 + 1000)·C(3, 1) p (1 - p)^2 + (75·3 + 1000)(1 - p)^3 = 1206.6.
17. Suppose a machine has 3 components with i.i.d. Exp(0.1) lifetimes. Compute the
mean lifetime of the machine if it needs all 3 components to function.
Solution: Let X be the lifetime of the machine, and Xi be that of the ith component,
where Xi ∼ Exp(0.1), i = 1, 2, 3. Then
P(X > x) = P(min(X1, X2, X3) > x)
= P(X1 > x, X2 > x, X3 > x)
= Π_{i=1}^3 P(Xi > x)
= e^{-0.3x}.
This implies that X ∼ Exp(0.3), and so the mean lifetime of the machine is 1/0.3 ≈ 3.33.
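A simulation sketch of the machine's mean lifetime (fixed seed):

```python
import random

rng = random.Random(11)
trials = 100_000
# Machine fails when its first component fails: lifetime = min of three Exp(0.1).
lifetimes = [min(rng.expovariate(0.1) for _ in range(3)) for _ in range(trials)]

# min of three Exp(0.1) lifetimes is Exp(0.3), so the mean is 1/0.3 ~ 3.33.
assert abs(sum(lifetimes) / trials - 1 / 0.3) < 0.1
```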
If N ∼ Geom(p) is independent of the i.i.d. Exp(λ) random variables X1, X2, . . . , and
Y = Σ_{i=1}^N Xi, then for x > 0,
Pr(Y > x) = Σ_{k=1}^∞ (1 - p)^{k-1} p Σ_{i=0}^{k-1} e^{-λx} (λx)^i/i!
= Σ_{i=0}^∞ Σ_{k=i+1}^∞ e^{-λx} [(λx)^i/i!] (1 - p)^{k-1} p
= Σ_{i=0}^∞ e^{-λx} [(λx)^i/i!] p Σ_{k=i+1}^∞ (1 - p)^{k-1}
= Σ_{i=0}^∞ e^{-λx} [(λx)^i/i!] (1 - p)^i
= e^{-λx} Σ_{i=0}^∞ [(1 - p)λx]^i/i!
= e^{-λx} e^{(1-p)λx}
= e^{-pλx},
so that Y ∼ Exp(pλ).
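A simulation sketch of the fact that a Geom(p) sum of i.i.d. Exp(λ) random variables is Exp(pλ) (seed and parameters are mine):

```python
import random

rng = random.Random(5)
p, lam, trials = 0.25, 2.0, 100_000

def geom_sum():
    """Y = X_1 + ... + X_N with N ~ Geom(p) on {1, 2, ...} and X_i ~ Exp(lam)."""
    total = rng.expovariate(lam)
    while rng.random() >= p:          # continue with probability 1 - p
        total += rng.expovariate(lam)
    return total

samples = [geom_sum() for _ in range(trials)]
# Y ~ Exp(p * lam): mean 1/(p * lam) = 2.
assert abs(sum(samples) / trials - 1 / (p * lam)) < 0.05
```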
20. The lifetimes of two car batteries (brands A and B) are independent exponential
random variables with means 12 hours and 10 hours, respectively. What is the
probability that a brand B battery lasts longer than a brand A one?
Solution: Let A (resp., B) be the lifetime of battery A (resp., B), so that
A ∼ Exp(λ1) with λ1 = 1/12 and B ∼ Exp(λ2) with λ2 = 1/10. Then
P(B > A) = ∫_0^∞ P(B > a) λ1 e^{-λ1 a} da
= ∫_0^∞ e^{-λ2 a} λ1 e^{-λ1 a} da
= λ1/(λ1 + λ2)
= (1/12)/(1/12 + 1/10) = 5/11 = 0.455.
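Both the closed form λ1/(λ1 + λ2) and a direct simulation agree (a sketch, fixed seed):

```python
import random

lam_a, lam_b = 1 / 12, 1 / 10   # rates for mean lifetimes 12 and 10 hours
closed_form = lam_a / (lam_a + lam_b)
assert abs(closed_form - 5 / 11) < 1e-12

rng = random.Random(8)
trials = 100_000
wins = sum(rng.expovariate(lam_b) > rng.expovariate(lam_a) for _ in range(trials))
assert abs(wins / trials - closed_form) < 0.01
```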
21. (Bonus Question) A couple has two kids and at least one is a boy born on a
Tuesday. What's the probability that both are boys?
Solution: Let the events Bx [Gx ] = Boy [Girl] born on day x, x = 1, 2, . . . , 7
(x = 3 is Tuesday). The sample space for this experiment is
S = {(Gx , Gy ), (Gx , By ), (Bx , Gy ), (Bx , By ), x, y = 1, 2, . . . , 7}
(so |S| = 4 · 49 = 196).
Let C be the event that both are boys (with at least one born on a Tuesday)
= {(Bx, B3), x = 1, 2, . . . , 7} ∪ {(B3, By), y = 1, 2, . . . , 7}. Note that |C| = 13 (to
avoid double counting (B3, B3)).
Let D be the event that there is at least one boy born on a Tuesday =
{(Gx, B3), (B3, Gy), x, y = 1, 2, . . . , 7} ∪ C. So |D| = 27 (list them out if you
don't believe me). Then
Pr(C|D) = Pr(C ∩ D)/Pr(D) = Pr(C)/Pr(D) = (13/196)/(27/196) = 13/27.
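The counting can be verified by enumerating all 196 equally likely sample points (a sketch; helper names are mine):

```python
from fractions import Fraction
from itertools import product

# Each child: ('B' or 'G', day 1..7); day 3 is Tuesday.
kids = list(product("BG", range(1, 8)))
space = list(product(kids, kids))   # 196 equally likely (child1, child2) outcomes
assert len(space) == 196

def tuesday_boy(child):
    return child == ("B", 3)

D = [s for s in space if any(tuesday_boy(c) for c in s)]   # at least one Tuesday boy
C = [s for s in D if all(c[0] == "B" for c in s)]          # ... and both are boys

assert len(D) == 27 and len(C) == 13
assert Fraction(len(C), len(D)) == Fraction(13, 27)
```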