(b) the number of the women who never eat breakfast is at least as large as the number
of the men who never eat breakfast.
We want to compute P {M ≤ W } = P {M − W ≤ 0}. We may approximate M − W
by a normal distribution with µ = 50.4 − 47.2 = 3.2 and σ 2 = 37.7 + 36.1 = 73.8.
Hence
\[
P\{M \le W\} = P\{M - W \le 0\} = P\left\{\frac{M - W - 3.2}{\sqrt{73.8}} \le \frac{-3.2}{\sqrt{73.8}}\right\} \approx \Phi\!\left(\frac{-3.2}{\sqrt{73.8}}\right) \approx 1 - \Phi(.37) = 1 - .6443 = .3557.
\]
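As a quick numerical check, the approximation can be evaluated directly. This is a sketch, not part of the solution; the standard normal CDF Φ is expressed through the error function, and the small discrepancy from .3557 comes from rounding z to .37 in the table lookup.

```python
import math

# mu and var are taken from the solution above.
mu = 50.4 - 47.2          # = 3.2
var = 37.7 + 36.1         # = 73.8

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

p = phi(-mu / math.sqrt(var))
print(round(p, 4))  # close to the tabulated value .3557
```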
Problem 44
If X1 , X2 , X3 are independent random variables that are uniformly distributed over (0, 1),
compute the probability that the largest of the three is greater than the sum of the other
two.
Note that if, for example, X1 ≥ X2 + X3 , then X1 is automatically the largest of the
three. We want to compute
\[
P\{X_1 \ge X_2 + X_3\} + P\{X_2 \ge X_1 + X_3\} + P\{X_3 \ge X_1 + X_2\},
\]
which equals the desired probability because the three events are mutually exclusive.
By symmetry, these three terms are all equal, so it suffices to compute the first term
P {X1 ≥ X2 + X3 }. Recall from Example 3a on page 252 that
\[
f_{X_2 + X_3}(y) = \begin{cases} y & \text{if } 0 \le y \le 1, \\ 2 - y & \text{if } 1 < y \le 2, \\ 0 & \text{otherwise.} \end{cases}
\]
Hence
\[
P\{X_1 \ge X_2 + X_3\} = \int_0^1 \int_0^x f_{X_1}(x) f_{X_2 + X_3}(y) \, dy \, dx = \int_0^1 \int_0^x 1 \cdot y \, dy \, dx = \int_0^1 \frac{x^2}{2} \, dx = \frac{1}{6}.
\]
So the probability that the largest of the three is greater than the sum of the other
two is
\[
P\{X_1 \ge X_2 + X_3\} + P\{X_2 \ge X_1 + X_3\} + P\{X_3 \ge X_1 + X_2\} = \frac{1}{6} + \frac{1}{6} + \frac{1}{6} = \frac{1}{2}.
\]
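A short Monte Carlo sketch can corroborate the answer; the trial count and seed are arbitrary choices.

```python
import random

# Probability that the largest of three independent Uniform(0,1)
# variables exceeds the sum of the other two.
random.seed(0)
trials = 200_000
hits = 0
for _ in range(trials):
    x = sorted(random.random() for _ in range(3))
    if x[2] > x[0] + x[1]:
        hits += 1
print(hits / trials)  # should be near 1/2
```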
(c) Compute P {X1 > X2 |X2 > X3 }.
Let X denote the median of the independent and identically distributed random
variables X1 , . . . , X2n+1 . Consider equation 6.2 on page 272. By replacing n with 2n + 1
and by choosing j = n + 1, we get that the probability density function of X is
\[
f_X(x) = \frac{(2n+1)!}{n! \, n!} (F(x))^n (1 - F(x))^n f(x),
\]
where f is the common probability density function and F is the common cumulative
distribution function of the Xi ’s. Since X1 , . . . , X2n+1 are uniformly distributed on (0, 1),
this means that f (x) = 1 for 0 ≤ x ≤ 1 and F (x) = x for 0 ≤ x ≤ 1. Hence
\[
f_X(x) = \begin{cases} \frac{(2n+1)!}{n! \, n!} x^n (1 - x)^n & \text{for } 0 \le x \le 1, \\ 0 & \text{otherwise.} \end{cases}
\]
Now, consider the Beta distribution on page 218. Note that
\[
\begin{aligned}
B(n+1, n+1) &= \int_0^1 x^n (1 - x)^n \, dx \\
&= \frac{n}{n+1} \int_0^1 x^{n+1} (1 - x)^{n-1} \, dx \qquad \text{after integrating by parts} \\
&= \cdots \\
&= \frac{n!}{(2n) \cdots (n+1)} \int_0^1 x^{2n} \, dx \qquad \text{after integrating by parts repeatedly} \\
&= \frac{n!}{(2n+1)(2n) \cdots (n+1)} \\
&= \frac{n! \, n!}{(2n+1)!}.
\end{aligned}
\]
Hence
\[
f_X(x) = \begin{cases} \frac{(2n+1)!}{n! \, n!} x^n (1 - x)^n & \text{for } 0 \le x \le 1 \\ 0 & \text{otherwise} \end{cases}
= \begin{cases} \frac{1}{B(n+1, n+1)} x^n (1 - x)^n & \text{for } 0 \le x \le 1 \\ 0 & \text{otherwise.} \end{cases}
\]
So X has a Beta distribution with parameters (n + 1, n + 1).
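The Beta conclusion can be sanity-checked by simulation: the empirical variance of the sample median should be close to the Beta(n + 1, n + 1) variance 1/(4(2n + 3)). A sketch with the arbitrary choice n = 2:

```python
import random
import statistics

# The median of 2n+1 iid Uniform(0,1) draws should behave like a
# Beta(n+1, n+1) variable, whose variance is 1 / (4(2n + 3)).
random.seed(0)
n = 2                          # 2n + 1 = 5 draws per trial
trials = 100_000
medians = [sorted(random.random() for _ in range(2 * n + 1))[n]
           for _ in range(trials)]
emp_var = statistics.pvariance(medians)
beta_var = 1 / (4 * (2 * n + 3))
print(emp_var, beta_var)  # the two should be close
```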
Let X denote the value on the die, let Y be 1 if the coin lands heads and 0 if the coin
lands tails, and let g(X, Y ) denote the winnings. Then her expected winnings are
\[
\begin{aligned}
E[\text{winnings}] &= \sum_{x=1}^{6} \sum_{y=0}^{1} g(x, y) p(x, y)
= \sum_{x=1}^{6} \left( 2x \cdot p(x, 0) + \frac{1}{2} x \cdot p(x, 1) \right) \\
&= \sum_{x=1}^{6} \left( 2x \cdot \frac{1}{12} + \frac{1}{2} x \cdot \frac{1}{12} \right)
= \frac{5}{24} \sum_{x=1}^{6} x = \frac{5}{24} \cdot 21 = 4.375.
\end{aligned}
\]
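The arithmetic can be reproduced exactly with rational arithmetic. This sketch assumes, as the solution's g(x, 0) = 2x and g(x, 1) = x/2 suggest, that tails doubles the die roll and heads halves it, with each joint outcome having probability 1/12.

```python
from fractions import Fraction

# E[winnings] = sum over die values x of 2x/12 (tails) + (x/2)/12 (heads).
expected = sum(Fraction(1, 12) * (2 * x) + Fraction(1, 12) * Fraction(x, 2)
               for x in range(1, 7))
print(expected)  # 35/8, i.e. 4.375
```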
Problem 12
A group of n men and n women is lined up at random.
(a) Find the expected number of men who have a woman next to them.
Label the people in order 1 through 2n and let Xi = 1 if the i-th person is a man
standing next to a woman and Xi = 0 otherwise. We want to compute
\[
E\left[\sum_{i=1}^{2n} X_i\right] = \sum_{i=1}^{2n} E[X_i] = \sum_{i=1}^{2n} P\{X_i = 1\}.
\]
Note X1 = 1 only if the first person is male and the second is female. There are
2n(2n − 1) ways to choose the first two people. There are n ways to choose the first
person to be male and n ways to choose the second person to be female, and hence
n^2 ways we can have X_1 = 1. Hence
\[
P\{X_1 = 1\} = \frac{n^2}{2n(2n-1)} = \frac{n}{4n-2}.
\]
Similarly, P{X_{2n} = 1} = n/(4n − 2).
Now let’s find P {Xi = 1} for 1 < i < 2n. There are 2n(2n − 1)(2n − 2) ways to
choose the (i − 1)-th, i-th, and (i + 1)-th person. We have Xi = 1 if the three people
chosen are female male female, female male male, or male male female. Hence there
are
\[
(n)(n)(n-1) + (n)(n)(n-1) + (n)(n-1)(n) = 3n^2 (n-1)
\]
ways we can have Xi = 1. Hence
\[
P\{X_i = 1\} = \frac{3n^2 (n-1)}{2n(2n-1)(2n-2)} = \frac{3n}{8n-4}.
\]
Hence
\[
E\left[\sum_{i=1}^{2n} X_i\right] = \sum_{i=1}^{2n} P\{X_i = 1\} = 2 \cdot \frac{n}{4n-2} + (2n-2) \cdot \frac{3n}{8n-4} = \frac{3n^2 - n}{4n-2}.
\]
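A Monte Carlo sketch of part (a), with the arbitrary choice n = 3 (where the formula gives 24/10 = 2.4):

```python
import random

# Expected number of men with a woman next to them when n men and
# n women line up at random; compare with (3n^2 - n)/(4n - 2).
random.seed(0)
n = 3
trials = 100_000
total = 0
for _ in range(trials):
    people = ['M'] * n + ['W'] * n
    random.shuffle(people)
    for i, p in enumerate(people):
        if p == 'M' and (
            (i > 0 and people[i - 1] == 'W')
            or (i < 2 * n - 1 and people[i + 1] == 'W')
        ):
            total += 1
exact = (3 * n * n - n) / (4 * n - 2)
print(total / trials, exact)  # both near 2.4 for n = 3
```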
(b) Repeat part (a), but now assuming that the group is randomly seated at a round
table.
Label the people in order 1 through 2n and let Xi = 1 if the i-th person is a man
seated next to a woman and Xi = 0 otherwise. We want to compute
\[
E\left[\sum_{i=1}^{2n} X_i\right] = \sum_{i=1}^{2n} E[X_i] = \sum_{i=1}^{2n} P\{X_i = 1\}.
\]
At a round table every person has two neighbors, so for each i the computation is the same as for an interior position in part (a), giving P{X_i = 1} = 3n/(8n − 4). Hence
\[
E\left[\sum_{i=1}^{2n} X_i\right] = 2n \cdot \frac{3n}{8n-4} = \frac{3n^2}{4n-2}.
\]
Problem 19
A certain region is inhabited by r distinct types of a certain species of insect. Each insect
caught will, independently of the types of the previous catches, be of type i with probability
\[
P_i, \quad i = 1, \ldots, r, \qquad \sum_{i=1}^{r} P_i = 1.
\]
(a) Compute the mean number of insects that are caught before the first type 1 catch.
Let X denote the number of insects caught before the first type 1 catch. Then
P{X = x} = (1 − P1)^x P1 for x = 0, 1, 2, . . .. Hence
\[
\begin{aligned}
E[X] &= \sum_{x=0}^{\infty} x (1 - P_1)^x P_1 \\
&= P_1 \sum_{x=0}^{\infty} x (1 - P_1)^x \\
&= P_1 \cdot \frac{1 - P_1}{P_1^2} \qquad \text{using the formula } \sum_{n=1}^{\infty} n z^n = \frac{z}{(1-z)^2} \text{ for } |z| < 1 \\
&= \frac{1 - P_1}{P_1}.
\end{aligned}
\]
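A simulation sketch of part (a), with the arbitrary choice P1 = 0.25 (so the formula gives (1 − P1)/P1 = 3):

```python
import random

# Count the catches before the first type 1 catch, averaged over
# many independent runs.
random.seed(0)
p1 = 0.25
trials = 100_000
total = 0
for _ in range(trials):
    while random.random() >= p1:   # each catch that is not type 1
        total += 1
print(total / trials)  # should be near 3.0
```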
(b) Compute the mean number of types of insects that are caught before the first type 1
catch.
Let Xi denote the number of insects of type i caught before the first type 1 catch.
Let g(0) = 0 and let g(x) = 1 for integers x ≥ 1. We want to compute
\[
E\left[\sum_{i=2}^{r} g(X_i)\right] = \sum_{i=2}^{r} E[g(X_i)] = \sum_{i=2}^{r} P\{g(X_i) = 1\} = \sum_{i=2}^{r} P\{X_i \ge 1\}.
\]
Let X denote the number of insects caught before the first type 1 catch as in part
(a). Then
\[
\begin{aligned}
P\{X_i \ge 1\} &= \sum_{x=0}^{\infty} P\{X_i \ge 1, X = x\}
  \qquad \text{since the events } \{X_i \ge 1, X = x\} \text{ are mutually exclusive with union } \{X_i \ge 1\} \\
&= \sum_{x=0}^{\infty} \left( P\{X = x\} - P\{X_i = 0, X = x\} \right) \\
&= \sum_{x=0}^{\infty} \left( (1 - P_1)^x P_1 - (1 - P_1 - P_i)^x P_1 \right) \\
&= \frac{P_1}{P_1} - \frac{P_1}{P_1 + P_i}
  \qquad \text{using the formula for a geometric series, twice} \\
&= \frac{P_i}{P_1 + P_i}.
\end{aligned}
\]
Hence
\[
E\left[\sum_{i=2}^{r} g(X_i)\right] = \sum_{i=2}^{r} P\{X_i \ge 1\} = \sum_{i=2}^{r} \frac{P_i}{P_1 + P_i}.
\]
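A simulation sketch of part (b), with arbitrarily chosen type probabilities P = (0.5, 0.3, 0.2):

```python
import random

# Expected number of distinct non-type-1 types caught before the
# first type 1 catch; compare with sum of Pi / (P1 + Pi) over i >= 2.
random.seed(0)
P = [0.5, 0.3, 0.2]
trials = 100_000
total = 0
for _ in range(trials):
    seen = set()
    while True:
        t = random.choices(range(len(P)), weights=P)[0]
        if t == 0:            # type 1 caught, stop
            break
        seen.add(t)
    total += len(seen)
exact = sum(P[i] / (P[0] + P[i]) for i in range(1, len(P)))
print(total / trials, exact)  # the two should be close
```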
Problem 24
A bottle initially contains m large pills and n small pills. Each day, a patient randomly
chooses one of the pills. If a small pill is chosen, then that pill is eaten. If a large pill
is chosen, then the pill is broken in two; one part is returned to the bottle (and is now
considered a small pill) and the other part is then eaten.
(a) Let X denote the number of small pills in the bottle after the last large pill has been
chosen and its smaller half returned. Find E[X].
Label the small pills initially present 1 through n and label the small pills created
by splitting a large one n + 1 through n + m. Let Ii = 1 if the i-th pill remains after
the last large pill is chosen and let Ii = 0 otherwise. Then X = I1 + · · · + In+m, so
\[
E[X] = \sum_{i=1}^{n+m} E[I_i] = \sum_{i=1}^{n+m} P\{I_i = 1\}.
\]
We will calculate P {Ii = 1} by considering the two cases 1 ≤ i ≤ n and n + 1 ≤
i ≤ n + m separately.
If 1 ≤ i ≤ n, then the i-th small pill is initially present. Pretend we keep choosing
pills until all of them are gone. It suffices to consider the order in which the i-th pill
and the m large pills are chosen. There are m + 1 of these pills, so the probability
that the i-th pill is chosen last among them is P {Ii = 1} = 1/(m + 1).
If n + 1 ≤ i ≤ n + m, then the i-th small pill is formed by breaking a large pill
in two. It suffices to consider the order in which the large pills and the i-th small
pill are chosen. Label the large pills 1 through m in the order in which they are
initially chosen. Let Ji denote this label for the large pill corresponding to the i-th
pill, that is, let Ji denote when the large pill is broken forming the i-th small pill.
By conditioning on the value of Ji , we get
\[
P\{I_i = 1\} = \sum_{j=1}^{m} P(\{I_i = 1\} \mid \{J_i = j\}) P\{J_i = j\}.
\]
The probability that the large pill corresponding to the i-th small pill is labeled j
out of the m large pills is P {Ji = j} = 1/m. Once the j-th large pill is broken
to form the i-th small pill, m − j large pills and the i-th small pill remain, so the
probability that the i-th small pill is chosen last among these m − j + 1 pills is
\[
P(\{I_i = 1\} \mid \{J_i = j\}) = \frac{1}{m - j + 1}.
\]
Hence
\[
\begin{aligned}
P\{I_i = 1\} &= \sum_{j=1}^{m} P(\{I_i = 1\} \mid \{J_i = j\}) P\{J_i = j\} \\
&= \sum_{j=1}^{m} \frac{1}{m - j + 1} \cdot \frac{1}{m} \\
&= \sum_{k=1}^{m} \frac{1}{km},
\end{aligned}
\]
by letting k = m − j + 1. Therefore
\[
\begin{aligned}
E[X] &= \sum_{i=1}^{n+m} P\{I_i = 1\} \\
&= \sum_{i=1}^{n} P\{I_i = 1\} + \sum_{i=n+1}^{n+m} P\{I_i = 1\} \\
&= n \cdot \frac{1}{m+1} + m \cdot \sum_{k=1}^{m} \frac{1}{km} \\
&= \frac{n}{m+1} + \sum_{k=1}^{m} \frac{1}{k}.
\end{aligned}
\]
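The pill process is easy to simulate directly; a sketch with the arbitrary choices m = 4, n = 6, where the formula gives E[X] = n/(m + 1) + H_m:

```python
import random

# Simulate the bottle: a chosen large pill is eaten except for one
# half, which returns as a small pill; a chosen small pill is eaten.
# Record the number of small pills right after the last large pill
# is chosen.
random.seed(0)
m, n = 4, 6
trials = 50_000
total = 0
for _ in range(trials):
    large, small = m, n
    last_count = 0
    while large > 0:
        if random.random() < large / (large + small):  # a large pill
            large -= 1
            small += 1          # one half returned as a small pill
            if large == 0:
                last_count = small
        else:                   # a small pill is eaten
            small -= 1
    total += last_count
exact = n / (m + 1) + sum(1 / k for k in range(1, m + 1))
print(total / trials, exact)  # the two should be close
```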
(b) Let Y denote the day on which the last large pill is chosen. Find E[Y ].
There are a total of n + 2m days. On the Y -th day the last large pill is chosen and
for the remaining X days small pills are chosen. Here X is the number of small
pills in the bottle after the last large pill has been chosen, as in part (a). Thus
X + Y = n + 2m, so
\[
E[Y] = E[n + 2m - X] = n + 2m - E[X] = n + 2m - \frac{n}{m+1} - \sum_{k=1}^{m} \frac{1}{k}.
\]
Problem 26
If X1 , X2 , . . . , Xn are independent and identically distributed random variables having
uniform distributions over (0, 1), find
\[
E[\max(X_1, \ldots, X_n)] = \int_0^1 \cdots \int_0^1 \max(x_1, \ldots, x_n) \, dx_1 \cdots dx_n = \frac{n}{n+1}.
\]
Note that min(X1, . . . , Xn) = 1 − max(1 − X1, . . . , 1 − Xn). Hence
\[
\begin{aligned}
E[\min(X_1, \ldots, X_n)] &= 1 - E[\max(1 - X_1, \ldots, 1 - X_n)] \\
&= 1 - \frac{n}{n+1} \qquad \text{by part (a), since each } X_i \text{ has the same distribution as } 1 - X_i \\
&= \frac{1}{n+1}.
\end{aligned}
\]
Alternatively,
\[
E[\min(X_1, \ldots, X_n)] = \int_0^1 \cdots \int_0^1 \min(x_1, \ldots, x_n) \, dx_1 \cdots dx_n.
\]
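Both expectations are easy to check by simulation; a sketch with the arbitrary choice n = 4, where E[max] = 4/5 and E[min] = 1/5:

```python
import random

# Empirical means of max and min of n iid Uniform(0,1) draws,
# to compare with n/(n+1) and 1/(n+1).
random.seed(0)
n = 4
trials = 100_000
max_sum = min_sum = 0.0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    max_sum += max(xs)
    min_sum += min(xs)
print(max_sum / trials, min_sum / trials)  # near 4/5 and 1/5
```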
Recall that
\[
P\{X_i = k\} = \left(\frac{i}{N}\right)^{k-1} \frac{N - i}{N} \quad \text{for } k \ge 1,
\]
and E[Xi] = N/(N − i). We compute
\[
E[X_i^2] = \frac{N - i}{N} \sum_{k=1}^{\infty} k^2 \left(\frac{i}{N}\right)^{k-1} = \frac{N - i}{N} \cdot \frac{1 + i/N}{(1 - i/N)^3} = \frac{N(N + i)}{(N - i)^2},
\]
where the second equality follows since
\[
\sum_{k=1}^{\infty} k^2 x^{k-1} = \frac{1 + x}{(1 - x)^3} \quad \text{for } |x| < 1.
\]
Hence
\[
\operatorname{Var}(X_i) = E[X_i^2] - E[X_i]^2 = \frac{N(N + i)}{(N - i)^2} - \frac{N^2}{(N - i)^2} = \frac{iN}{(N - i)^2},
\]
and
\[
\operatorname{Var}(X) = \sum_{i=1}^{N-1} \operatorname{Var}(X_i) = \sum_{i=1}^{N-1} \frac{iN}{(N - i)^2}.
\]
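The problem statement for this section is not shown above, but the formulas match the coupon-collector setup, with Xi the number of draws needed to see a new type once i types have been collected. Under that reading, the variance formula can be checked by simulating the total number X of draws needed to collect all N types; a sketch with the arbitrary choice N = 5:

```python
import random

# Empirical variance of the coupon-collector total, to compare with
# sum_{i=1}^{N-1} iN/(N-i)^2.
random.seed(0)
N = 5
trials = 50_000
draws = []
for _ in range(trials):
    seen = set()
    count = 0
    while len(seen) < N:
        seen.add(random.randrange(N))
        count += 1
    draws.append(count)
mean = sum(draws) / trials
emp_var = sum((d - mean) ** 2 for d in draws) / trials
exact = sum(i * N / (N - i) ** 2 for i in range(1, N))
print(emp_var, exact)  # the two should be close
```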