
CHAPTER 4

Discrete Random Variables

4.2

a. p(0) + p(1) + p(2) + p(3) + p(4) = .09 + .30 + .37 + .20 + .04 = 1.00

b. P(y = 3 or 4) = p(3) + p(4) = .20 + .04 = .24

c. P(y < 2) = p(0) + p(1) = .09 + .30 = .39

4.4

a. The list of all possible pairs of beach hotspots is shown below:


Beach Hotspot Pair
MBF, CINY
MBF, SCA
MBF, MBNJ
MBF, OCNJ
MBF, SLNJ
CINY, SCA
CINY, MBNJ
CINY, OCNJ
CINY, SLNJ
SCA, MBNJ
SCA, OCNJ
SCA, SLNJ
MBNJ, OCNJ
MBNJ, SLNJ
OCNJ, SLNJ

b.

The probabilities of each of these outcomes should all be equal if a random sampling
technique is employed. Therefore, they all have probability 1/15.

c.

The value of Y is found by determining the total number of beach hotspots in the
sample with a planar nearshore bar condition.

Beach Hotspot Pair   P(sample)   Y
MBF, CINY            1/15        0
MBF, SCA             1/15        0
MBF, MBNJ            1/15        1
MBF, OCNJ            1/15        0
MBF, SLNJ            1/15        1
CINY, SCA            1/15        0
CINY, MBNJ           1/15        1
CINY, OCNJ           1/15        0
CINY, SLNJ           1/15        1
SCA, MBNJ            1/15        1
SCA, OCNJ            1/15        0
SCA, SLNJ            1/15        1
MBNJ, OCNJ           1/15        1
MBNJ, SLNJ           1/15        2
OCNJ, SLNJ           1/15        1
d. The probability distribution for Y is found by grouping similar values of Y together in the table.

Y = y    0      1      2
P(Y)     6/15   8/15   1/15

e. P(Y ≥ 1) = P(1) + P(2) = 8/15 + 1/15 = 9/15

4.6

a. p(1) = p(2) = p(3) = .5


where 1: relay 1 works, 2: relay 2 works, and 3: relay 3 works.

The simple events for this experiment, as well as the value of y for each, are:

Event       y    Event          y
1 2 3       3    1c 2 3         2
1 2 3c      2    1c 2 3c        1
1 2c 3      2    1c 2c 3        1
1 2c 3c     1    1c 2c 3c       0

Since p(1) = p(2) = p(3) = .5, each simple event has probability 1/8 of occurring.

y      0     1     2     3
p(y)   1/8   3/8   3/8   1/8

b. P(current flows from A to B) = P(y ≥ 1) = 7/8
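The relay enumeration above can be verified with a short script (a sketch; assuming, as the answer P(y ≥ 1) = 7/8 implies, that the relays are wired so current flows whenever at least one relay works):

```python
from itertools import product
from fractions import Fraction

# Each relay independently works (1) or fails (0) with probability .5,
# so all 2^3 = 8 relay states are equally likely.
dist = {}
for state in product([0, 1], repeat=3):
    y = sum(state)  # y = number of relays that work
    dist[y] = dist.get(y, Fraction(0)) + Fraction(1, 8)

p_flow = 1 - dist[0]  # current flows unless all three relays fail
```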

4.8

Let y = the number of the three machining conditions with steel material and .25 in. drill size that will detect the flaw. Since two of the eight machining conditions will detect the flaw, we list all possible combinations of machining conditions which detect the flaw:

(1, 2), (1, 3), (1, 4), (1, 5), (1, 6), (1, 7), (1, 8), (2, 3), (2, 4), (2, 5), (2, 6), (2, 7), (2, 8), (3, 4), (3, 5), (3, 6), (3, 7), (3, 8), (4, 5), (4, 6), (4, 7), (4, 8), (5, 6), (5, 7), (5, 8), (6, 7), (6, 8), (7, 8)

There are 28 such combinations. We must assume that each of the 28 combinations is equally likely.

P(y = 0) = p(0) = 10/28
P(y = 1) = p(1) = 15/28
P(y = 2) = p(2) = 3/28
P(y = 3) = p(3) = 0

y      0       1       2      3
p(y)   10/28   15/28   3/28   0

4.10

The probability distribution for Y = number of delphacid eggs on a blade of water hyacinth is shown here:

Y = y    1     2      3      4
P(Y)     0.4   0.54   0.02   0.04

μ = E(Y) = Σ y p(y) = 1(.4) + 2(.54) + 3(.02) + 4(.04) = 1.70

If repeated samples of water hyacinth blades were selected, the average number of delphacid eggs on the blades would be 1.7 eggs.
4.12

a. To find the probabilities associated with each value of y, we divide the frequency associated with each value by the total sample size, 743. The probabilities appear in the table:

First Digit   Frequency of Occurrence   Probability
1             109                       109/743 = .1467
2             75                        75/743 = .1009
3             77                        77/743 = .1036
4             99                        99/743 = .1332
5             72                        72/743 = .0969
6             117                       117/743 = .1575
7             89                        89/743 = .1198
8             62                        62/743 = .0834
9             43                        43/743 = .0579
Total         743                       1.0000

b. μ = E(y) = Σ y p(y) = 1(.1467) + 2(.1009) + 3(.1036) + 4(.1332) + 5(.0969) + 6(.1575) + 7(.1198) + 8(.0834) + 9(.0579) = 4.6485
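The table computation can be reproduced directly from the frequencies (a sketch; the exact mean differs from 4.6485 only because the table's probabilities are rounded):

```python
# First-digit frequencies from the exercise (total sample size 743).
freqs = {1: 109, 2: 75, 3: 77, 4: 99, 5: 72, 6: 117, 7: 89, 8: 62, 9: 43}

n = sum(freqs.values())                       # 743
probs = {d: f / n for d, f in freqs.items()}  # part a
mean = sum(d * p for d, p in probs.items())   # part b: E(y)
```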

4.14

a. For ARC a1: μ = E(y) = Σ y p(y) = 0(.05) + 1(.10) + 2(.25) + 3(.60) = 2.4
The mean capacity for ARC a1 is 2.4.

For ARC a2: μ = E(y) = Σ y p(y) = 0(.10) + 1(.30) + 2(.60) = 1.5
The mean capacity for ARC a2 is 1.5.

For ARC a3: μ = E(y) = Σ y p(y) = 0(.10) + 1(.90) = .90
The mean capacity for ARC a3 is .90.

For ARC a4: μ = E(y) = Σ y p(y) = 0(.10) + 1(.90) = .90
The mean capacity for ARC a4 is .90.

For ARC a5: μ = E(y) = Σ y p(y) = 0(.10) + 1(.90) = .90
The mean capacity for ARC a5 is .90.

For ARC a6: μ = E(y) = Σ y p(y) = 0(.05) + 1(.25) + 2(.70) = 1.65
The mean capacity for ARC a6 is 1.65.

b. For ARC a1:
σ² = E(y − μ)² = Σ (y − μ)² p(y)
= (3 − 2.4)²(.60) + (2 − 2.4)²(.25) + (1 − 2.4)²(.10) + (0 − 2.4)²(.05)
= .216 + .040 + .196 + .288 = .74
σ = √σ² = √.74 = .8602

For ARC a2:
σ² = (2 − 1.5)²(.60) + (1 − 1.5)²(.30) + (0 − 1.5)²(.10) = .15 + .075 + .225 = .45
σ = √.45 = .6708

For ARC a3:
σ² = (1 − .9)²(.90) + (0 − .9)²(.10) = .009 + .081 = .09
σ = √.09 = .3

For ARC a4:
σ² = (1 − .9)²(.90) + (0 − .9)²(.10) = .009 + .081 = .09
σ = √.09 = .3

For ARC a5:
σ² = (1 − .9)²(.90) + (0 − .9)²(.10) = .009 + .081 = .09
σ = √.09 = .3

For ARC a6:
σ² = (2 − 1.65)²(.70) + (1 − 1.65)²(.25) + (0 − 1.65)²(.05) = .08575 + .105625 + .136125 = .3275
σ = √.3275 = .5723
4.16

Since the cost of firing each pin is $200, we get the following probability distribution for the cost:

cost      $200   $400   $600
p(cost)   6/10   3/10   1/10

a. μ = E[cost] = Σ cost · p(cost) = $200(6/10) + $400(3/10) + $600(1/10) = $300

b. σ² = E(cost − μ)² = Σ (cost − μ)² p(cost)
= (200 − 300)²(6/10) + (400 − 300)²(3/10) + (600 − 300)²(1/10)
= 18,000

c. We would expect the inspection cost to fall within a range of μ ± 2σ.
σ = √18,000 = $134.164
μ ± 2σ ⇒ 300 ± 2(134.164) ⇒ 300 ± 268.328 ⇒ $31.672 to $568.328


4.18

From Exercise 4.4, the probability distribution for Y is:

Y = y    0      1      2
P(Y)     6/15   8/15   1/15

Using Theorem 4.4, σ² = E(Y²) − μ²

μ = E(Y) = Σ y p(y) = 0(6/15) + 1(8/15) + 2(1/15) = 10/15 = .6667

E(Y²) = Σ y² p(y) = 0²(6/15) + 1²(8/15) + 2²(1/15) = 12/15 = 0.80

σ² = E(Y²) − μ² = 0.80 − .6667² = 0.3556

σ = √0.3556 = 0.5963

We expect the number of beach hotspots in the sample with a planar nearshore bar condition to fall between μ ± 2σ ⇒ .6667 ± 2(.5963) ⇒ .6667 ± 1.1926 ⇒ (−0.5259, 1.8593).
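Theorem 4.4 can be checked numerically against the definition of variance for this distribution (a sketch using exact fractions):

```python
from fractions import Fraction

# Distribution of Y from Exercise 4.4.
p = {0: Fraction(6, 15), 1: Fraction(8, 15), 2: Fraction(1, 15)}

mu = sum(y * py for y, py in p.items())                  # E(Y) = 2/3
ey2 = sum(y**2 * py for y, py in p.items())              # E(Y^2) = 4/5
var_thm = ey2 - mu**2                                    # Theorem 4.4
var_def = sum((y - mu)**2 * py for y, py in p.items())   # definition of variance
```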
4.20

From Exercise 4.9, the probability distribution of y is:

y      1    2    3
p(y)   .6   .3   .1

c = 200 + 100y

Using Theorems 4.1, 4.2, and 4.3,

E(c) = E(200 + 100y) = E(200) + E(100y) = 200 + 100E(y)

where E(y) = μ = Σ y p(y) = 1(.6) + 2(.3) + 3(.1) = .6 + .6 + .3 = 1.5

Thus, E(c) = 200 + 100(1.5) = 350

V(c) = E(c − μc)² = E(200 + 100y − 350)²
= E(100y − 150)² = Σ (100y − 150)² p(y)
= (100(1) − 150)²(.6) + (100(2) − 150)²(.3) + (100(3) − 150)²(.1)
= 1500 + 750 + 2250 = 4500


4.22

Theorem 4.2 says: E(cy) = cE(y) if c is a constant.

From Definition 4.5,
E(cy) = Σ_{all y} cy p(y) = c Σ_{all y} y p(y) = cE(y)

4.24

a.

The experiment consists of n = 20 trials. Each trial results in an S (guppy survived after 5 days) or an F (guppy did not survive after 5 days). The probability of success, p, is .60 and q = 1 − .60 = .40. We assume the trials are independent. Therefore, y has a binomial distribution with n = 20 and p = .60. The probability distribution for y is:

p(y) = C(20, y)(.60)^y(.40)^(20−y)

b. P(Y = 7) = C(20, 7)(.60)^7(.40)^13 = (20!/(7!13!))(.60)^7(.40)^13 = 0.0146

c. P(Y ≥ 10) = 1 − P(Y ≤ 9) = 1 − 0.1275 = 0.8725 from Table 2 in Appendix B.

4.26

a.

The experiment consists of n = 10 trials. Each trial results in an S (bridge will have an inspection rating of 4 or below in 2020) or an F (bridge does not have an inspection rating of 4 or below in 2020). The probability of success, p, is .09 and q = 1 − .09 = .91. We assume the trials are independent. Therefore, y has a binomial distribution with n = 10 and p = .09. The probability distribution for y is:

p(y) = C(10, y)(.09)^y(.91)^(10−y)

P(Y ≥ 3) = 1 − P(Y ≤ 2) = 1 − [P(y = 0) + P(y = 1) + P(y = 2)]
= 1 − [C(10, 0)(.09)^0(.91)^10 + C(10, 1)(.09)^1(.91)^9 + C(10, 2)(.09)^2(.91)^8]
= 1 − 0.9460 = .0540
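Both binomial calculations (4.24b and the one above) can be checked with a small helper (a sketch; `binom_pmf` is our own name, not a library function):

```python
from math import comb

def binom_pmf(y, n, p):
    """P(Y = y) for a binomial random variable with parameters n and p."""
    return comb(n, y) * p**y * (1 - p)**(n - y)

p7 = binom_pmf(7, 20, 0.60)                                # Exercise 4.24b
p_ge3 = 1 - sum(binom_pmf(y, 10, 0.09) for y in range(3))  # Exercise 4.26a
```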

b. Since the probability of observing this outcome is small, we would question the validity of the engineer's forecast of 9%.

4.28

a. Let y = number of beech trees damaged by fungi in 20 trials. Then y is a binomial random variable with n = 20 and p = .25.

P(y < 10) = P(y = 0) + P(y = 1) + ⋯ + P(y = 9)
= C(20, 0)(.25)^0(.75)^20 + C(20, 1)(.25)^1(.75)^19 + C(20, 2)(.25)^2(.75)^18 + ⋯ + C(20, 9)(.25)^9(.75)^11
= .0032 + .0211 + .0669 + .1339 + .1897 + .2023 + .1686 + .1124 + .0609 + .0271
= .9861

b. P(y > 15) = P(y = 16) + P(y = 17) + ⋯ + P(y = 20)
= C(20, 16)(.25)^16(.75)^4 + C(20, 17)(.25)^17(.75)^3 + C(20, 18)(.25)^18(.75)^2 + ⋯ + C(20, 20)(.25)^20(.75)^0
= .000000356 + .000000027 + .000000001 + 0 + 0 = .000000384

c. To find the number of trees that we expect to be damaged by fungi, we find μ = np = 20(.25) = 5.

4.30

Let y = the number of foreign students in a random sample of 25 engineering students who recently earned their Ph.D. Then y is a binomial random variable with n = 25 and p = .70.

a. P(y = 10) = C(25, 10)(.70)^10(.30)^15 = (25!/(10!15!))(.70)^10(.30)^15 = .0013249

b. P(y ≤ 5) = .0000 from Table 2 in Appendix B.

c. μ = np = 25(.7) = 17.5
σ² = npq = 25(.7)(.3) = 5.25
σ = √5.25 = 2.29

d. We expect the number of foreign students in the 25 sampled engineering students to fall between μ − 2σ and μ + 2σ.
μ − 2σ = 17.5 − 2(2.29) = 17.5 − 4.58 = 12.92 ≈ 13
μ + 2σ = 17.5 + 2(2.29) = 17.5 + 4.58 = 22.08 ≈ 22
4.32

a. The experiment consists of n = 10 trials. Each trial results in an S (contain shipping order files in their computerized data base) or an F (do not contain shipping order files in their computerized data base). The probability of success, p, is .99 and q = 1 − .99 = .01. We assume the trials are independent. Therefore, y has a binomial distribution with n = 10 and p = .99.

b. P(Y = 7) = C(10, 7)(.99)^7(.01)^3 = (10!/(7!3!))(.99)^7(.01)^3 = 0.0001118

c. P(Y > 5) = 1 − P(Y ≤ 5) = 1 − .0000 = 1 from Table 2 in Appendix B.

d. μ = np = 10(.99) = 9.9
σ² = npq = 10(.99)(.01) = .099
σ = √.099 = 0.3146
We expect most of the observations to fall within μ ± 3σ ⇒ 9.9 ± 3(.3146) ⇒ 9.9 ± 0.9439 ⇒ (8.9561, 10.8439).

4.34

a. A sample of n = 4 particles is released. A "success" is when the particle is absorbed into the inner duct wall. P(S) = .84. The random variable y is a binomial random variable with n = 4, p = .84.

P(y = 4) = p(4) = C(4, 4)(.84)^4(.16)^0 = (4!/(0!4!))(.84)^4(.16)^0 = .4979
P(y = 3) = p(3) = C(4, 3)(.84)^3(.16)^1 = (4!/(1!3!))(.84)^3(.16)^1 = .3793

b. Letting a "success" be a particle reflected by the inner duct wall means y is a binomial random variable with n = 20, p = .16.

P(at least 10 are reflected) = P(y ≥ 10)
= p(10) + p(11) + p(12) + ⋯ + p(19) + p(20)
= C(20, 10)(.16)^10(.84)^10 + C(20, 11)(.16)^11(.84)^9 + ⋯ + C(20, 20)(.16)^20(.84)^0
= .0004267

P(exactly 10 are reflected) = P(y = 10) = C(20, 10)(.16)^10(.84)^10 = .0003553

4.36

(q + p)^n = C(n, 0)q^n + C(n, 1)q^(n−1)p + C(n, 2)q^(n−2)p² + ⋯ + C(n, n)p^n
= p(0) + p(1) + p(2) + ⋯ + p(n)

Since q + p = 1, (q + p)^n = 1, and thus Σ_{y=0}^{n} p(y) = 1.

4.38

E[y(y − 1)] = E(y² − y) = E(y²) − E(y) = E(y²) − μ
From Exercise 4.37, E[y(y − 1)] = npq + μ² − μ
Thus, npq + μ² − μ = E(y²) − μ
E(y²) = npq + μ²

4.40

This experiment consists of 100 identical trials. There are four possible outcomes on each trial with the probabilities indicated in the table below. Assuming the trials are independent, this is a multinomial experiment with n = 100, k = 4, p1 = .40, p2 = .54, p3 = .02, and p4 = .04.

Result       Proportion
One Egg      0.40
Two Eggs     0.54
Three Eggs   0.02
Four Eggs    0.04

P(50, 50, 0, 0) = (100!/(50!50!0!0!))(.40)^50(.54)^50(.02)^0(.04)^0 = 0.0000533

4.42

a.

This experiment consists of 100 identical trials. There are four possible outcomes on each trial with the probabilities indicated in the table below. Assuming the trials are independent, this is a multinomial experiment with n = 100, k = 4, p1 = .29, p2 = .32, p3 = .09, and p4 = .30.

Job Match                                       Proportion
Yes, my job is a close match                    0.29
No, it's engineering, but not what I studied    0.32
Job is not engineering related                  0.09
Currently unemployed                            0.30

P(40, 30, 10, 20) = (100!/(40!30!10!20!))(.29)^40(.32)^30(.09)^10(.30)^20 = 0.0000266
40!30!10!20!

b. The number of readers we expect to answer "Yes, my job is a close match" is μ1 = np1 = 100(.29) = 29.

c. The number of readers we expect to answer "No, it's engineering, but not what I studied" is μ2 = np2 = 100(.32) = 32.

4.44

a. This experiment consists of 200 identical trials. There are seven possible outcomes on each trial with the probabilities indicated in the table below. Assuming the trials are independent, this is a multinomial experiment with n = 200, k = 7, p1 = .22, p2 = .20, p3 = .17, p4 = .17, p5 = .12, p6 = .05, and p7 = .07.

Day of the week   Proportion
Monday            0.22
Tuesday           0.20
Wednesday         0.17
Thursday          0.17
Friday            0.12
Saturday          0.05
Sunday            0.07

P(50, 50, 30, 30, 20, 10, 10)
= (200!/(50!50!30!30!20!10!10!))(.22)^50(.20)^50(.17)^30(.17)^30(.12)^20(.05)^10(.07)^10
= 0.00000004
b. The experiment consists of n = 200 trials. Each trial results in an S (detected on a Monday) or an F (not detected on a Monday). The probability of success, p, is .22 and q = 1 − .22 = .78. We assume the trials are independent. Therefore, y has a binomial distribution with n = 200 and p = .22.

P(Y ≥ 50) = 1 − P(Y ≤ 49) = 1 − 0.82664 = 0.17336 using a computer program to find the cumulative binomial probability.

4.46

This experiment consists of 10 identical trials. There are 3 paths available on each trial with path probabilities of .25, .30, and .45. Assuming the trials are independent, this is a multinomial experiment with n = 10, k = 3, p1 = .25, p2 = .30, p3 = .45.

a. P(2, 4, 4) = (10!/(2!4!4!))(.25)^2(.30)^4(.45)^4 = .06539

b. E(y2) = np2 = 10(.30) = 3
σ2² = V(y2) = np2(1 − p2) = 10(.30)(.70) = 2.1
σ2 = √2.1 = 1.45
We expect y2, the number of times path two is used, to fall within two standard deviations of its mean.
μ ± 2σ ⇒ 3 ± 2(1.45) ⇒ 3 ± 2.90 ⇒ .10 to 5.90


4.48

2 2 2 1
2
2
a + a (b + c) + (b + c)
0
1
2

[ a + (b + c)]2 =

2
2
2
2
= a 2 + ab + ac +
0
1
1
2

2 2 2
2 2
b + bc + c
1
2
0

2! 2 2! 1 1 2! 1 1 2! 2 2
2! 2
a +
ab +
ac +
b +
bc +
c
2!0!
1!1!
1!1!
0!2!
1!1!
2!0!

2! 2 0 0
2! 1 1 0
2! 1 0 1
2! 0 2 0
abc +
abc +
ab c +
abc
2!0!0!
1!1!0!
1!0!1!
0!2!0!
+

2! 0 1 1
2! 0 0 0
a bc +
abc
0!1!1!
0!0!2!

Substituting a = p1 , b = p2 , c = p3 yields:
= P(2, 0, 0) + P(1, 1, 0) + P(1, 0, 1) + P(0, 2, 0) + P(0, 1, 1) + P(0, 0, 2) = 1
4.50

Let S = a charged shower particle is observed and F = a charged shower particle is not observed. We are given that the number of charged particles that must be observed in order to detect r charged shower particles follows a negative binomial distribution with p = .75. The probability that five charged particles must be observed in order to detect three charged shower particles is:

P(Y = 5) = C(5 − 1, 3 − 1)(.75)^3(.25)^(5−3) = C(4, 2)(.75)^3(.25)^2 = (4!/(2!2!))(.75)^3(.25)^2 = 0.1582
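The negative binomial probability can be confirmed directly (a sketch; `neg_binom_pmf` is our own helper, giving the probability that the r-th success occurs on trial y):

```python
from math import comb

def neg_binom_pmf(y, r, p):
    """P(Y = y): the r-th success occurs on trial y (success probability p)."""
    return comb(y - 1, r - 1) * p**r * (1 - p)**(y - r)

p5 = neg_binom_pmf(5, 3, 0.75)  # Exercise 4.50
```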

4.52

Let y = the number of shuttle flights until a critical item fails. Then y is a geometric random variable with p = 1/63.

a. μ = 1/p = 1/(1/63) = 63

b. σ² = q/p² = (62/63)/(1/63)² = 3906
σ = √3906 = 62.5

c. The interval μ ± 2σ will capture the number of missions before a critical item failure occurs with probability of approximately .95.
μ ± 2σ ⇒ 63 ± 2(62.5) ⇒ 63 ± 125 ⇒ (−62, 188), or (0, 188) since y cannot be negative.


4.54

a. Y has a geometric distribution with p = .48.

P(Y = 1) = (.48) = 0.48
P(Y = 2) = (.48)(.52) = 0.2496
P(Y = 3) = (.48)(.52)^2 = 0.1298
P(Y = 4) = (.48)(.52)^3 = 0.0675
P(Y = 5) = (.48)(.52)^4 = 0.0351
P(Y = 6) = (.48)(.52)^5 = 0.0182
P(Y = 7) = (.48)(.52)^6 = 0.0095

b. The formula is P(Y = y) = pq^(y−1), y = 1, 2, 3, …

c. μ = 1/p = 1/.48 = 2.083
σ² = q/p² = .52/.48² = 2.2569
σ = √2.2569 = 1.502

d. We know from Chebyshev's Theorem that at least 3/4 of the observations are within 2 standard deviations of the mean.
μ ± 2σ ⇒ 2.083 ± 2(1.502) ⇒ 2.083 ± 3.004 ⇒ (−.921, 5.087)
Since we know y cannot be negative, the interval should be (0, 5.087).
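The geometric calculations in parts a–c can be reproduced with a few lines (a sketch):

```python
p, q = 0.48, 0.52  # Exercise 4.54

def geom_pmf(y):
    return p * q**(y - 1)  # P(Y = y), y = 1, 2, 3, ...

mu = 1 / p       # mean of a geometric distribution
var = q / p**2   # variance of a geometric distribution
```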
4.56

Let S = particle is reflected and F = particle is absorbed. From Exercise 4.34, p = .16 and q = .84. Let y = number of trials until the second particle is reflected. Then y has a negative binomial distribution with r = 2.

P(y > 5) = 1 − P(y ≤ 5) = 1 − p(2) − p(3) − p(4) − p(5)
= 1 − C(1, 1)(.16)^2(.84)^0 − C(2, 1)(.16)^2(.84)^1 − C(3, 1)(.16)^2(.84)^2 − C(4, 1)(.16)^2(.84)^3
= 1 − .0256 − .0430 − .0542 − .0607 = .8165


4.58

a. Let x = number of facilities chosen that treat hazardous waste on-site in 10 trials. For this problem, N = 209, r = 8, and n = 10.

E(x) = μ = nr/N = 10(8)/209 = .383

b. P(x = 4) = [C(8, 4)C(201, 6)]/C(209, 10) = [(8!/(4!4!))(201!/(6!195!))]/(209!/(10!199!)) = .0002

4.60

a.
Let y = number of defective items in a sample of size 4. For this problem, y is a hypergeometric random variable with N = 10, n = 4, and r = 1. You will accept the lot if you observe no defectives.

P(y = 0) = [C(1, 0)C(9, 4)]/C(10, 4) = 1(126)/210 = .6

b. If r = 2,

P(y = 0) = [C(2, 0)C(8, 4)]/C(10, 4) = 1(70)/210 = .333

4.62

a. p(y) = [C(r, y)C(N − r, n − y)]/C(N, n)

p(2) = [C(7, 2)C(3, 2)]/C(10, 4) = [(7!/(2!5!))(3!/(2!1!))]/(10!/(4!6!)) = 63/210 = .30

b. P(y ≥ 1) = 1 − p(0) = 1 − [C(7, 0)C(3, 4)]/C(10, 4) = 1 − 0 = 1

4.64

Let y = number of firms operating in violation of regulations in 20 firms. Then y has a hypergeometric distribution with N = 20, r = 5, and n = 3.

a. P(y = 0) = [C(5, 0)C(15, 3)]/C(20, 3) = 1(455)/1140 = .399

b. P(y = 3) = [C(5, 3)C(15, 0)]/C(20, 3) = 10(1)/1140 = .009

c. P(y ≥ 1) = 1 − P(y = 0) = 1 − .399 = .601
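The hypergeometric probabilities in 4.60–4.64 all follow from one formula (a sketch; Python's math.comb returns 0 when the lower index exceeds the upper, which handles impossible terms such as C(3, 4)):

```python
from math import comb

def hyper_pmf(y, N, r, n):
    """P(Y = y): y successes in n draws from N items, r of which are successes."""
    return comb(r, y) * comb(N - r, n - y) / comb(N, n)

p0_460 = hyper_pmf(0, 10, 1, 4)  # Exercise 4.60a: 126/210
p0_464 = hyper_pmf(0, 20, 5, 3)  # Exercise 4.64a: 455/1140
```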

4.66

Show that the mean of a hypergeometric distribution is μ = nr/N.

First, y·C(r, y) = y·r!/[y!(r − y)!] = r·(r − 1)!/[(y − 1)!(r − y)!] = r·C(r − 1, y − 1)

We can write C(N − r, n − y) as C((N − 1) − (r − 1), (n − 1) − (y − 1)).

Also, C(N, n) = N!/[n!(N − n)!] = N(N − 1)!/[n(n − 1)!(N − n)!] = (N/n)·C(N − 1, n − 1)

Thus,
E(y) = Σ_{all y} y·C(r, y)·C(N − r, n − y)/C(N, n)
= Σ_{all y} r·C(r − 1, y − 1)·C((N − 1) − (r − 1), (n − 1) − (y − 1))/[(N/n)·C(N − 1, n − 1)]
= (rn/N) Σ_{all y} C(r − 1, y − 1)·C((N − 1) − (r − 1), (n − 1) − (y − 1))/C(N − 1, n − 1)

Let z = y − 1. With r − 1 successes among N − 1 elements and n − 1 trials, z has a hypergeometric distribution, so the sum above equals 1.

Thus, E(y) = rn/N.

4.68

a. P(Y ≤ 2) = P(0) + P(1) + P(2) = (1.15^0 e^(−1.15))/0! + (1.15^1 e^(−1.15))/1! + (1.15^2 e^(−1.15))/2! = 0.89015

b. σ² = μ = λ = 1.15

c. We would expect the number of trips to fall within the interval μ ± 3σ ⇒ 1.15 ± 3(√1.15) ⇒ 1.15 ± 3.217 ⇒ (−2.07, 4.37). The driver is not likely to exceed four trips.

4.70

a. P(Y = 0) = (4.5^0 e^(−4.5))/0! = 0.0111

b. P(Y = 1) = (4.5^1 e^(−4.5))/1! = 0.04999

c. E(y) = μ = λ = 4.5
σ² = λ = 4.5
σ = √4.5 = 2.121
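The Poisson probabilities in 4.68 and 4.70 can be reproduced directly from the pmf (a sketch):

```python
from math import exp, factorial

def pois_pmf(y, lam):
    return lam**y * exp(-lam) / factorial(y)

p0 = pois_pmf(0, 4.5)                             # Exercise 4.70a
cdf2 = sum(pois_pmf(y, 1.15) for y in range(3))   # Exercise 4.68a
```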
4.72

a. μ = λ = 4; σ = √4 = 2

b. P(y > 10) = 1 − P(y ≤ 10) = 1 − .997 = .003 from Table 3, Appendix B, with λ = 4. Since the probability is so small (.003), it would be very unlikely that the plant would yield a value that would exceed the EPA limit.

4.74

a. μ = λ = 1.57
σ² = λ = 1.57
σ = √1.57 = 1.253

b. P(y ≥ 3) = 1 − P(y ≤ 2) = 1 − [p(0) + p(1) + p(2)]
= 1 − [(1.57^0 e^(−1.57))/0! + (1.57^1 e^(−1.57))/1! + (1.57^2 e^(−1.57))/2!]
= 1 − [.2080 + .3266 + .2564] = 1 − .7910 = .2090


4.76

a. P(Y ≤ 20) = P(0) + P(1) + ⋯ + P(20)
= (18^0 e^(−18))/0! + (18^1 e^(−18))/1! + (18^2 e^(−18))/2! + ⋯ + (18^20 e^(−18))/20! = 0.7307

b. P(5 ≤ y ≤ 10) = p(5) + p(6) + ⋯ + p(10)
= (18^5 e^(−18))/5! + (18^6 e^(−18))/6! + ⋯ + (18^10 e^(−18))/10! = .03028

c. σ² = λ = 18
σ = √18 = 4.24
We would expect y to fall within μ ± 2σ ⇒ 18 ± 2(4.24) ⇒ 18 ± 8.48 ⇒ 9.52 to 26.48

d. The trend would indicate that the numbers of occurrences are dependent on one another. This casts doubt on the independence assumption of the Poisson.

4.78

a. Show for a Poisson random variable y that 0 ≤ p(y) ≤ 1. The probability function for y is:

p(y) = (λ^y e^(−λ))/y!

For a Poisson random variable, λ > 0 and y ≥ 0. Thus, p(y) ≥ 0.

The Taylor expansion for e^λ is 1 + λ/1! + λ²/2! + λ³/3! + λ⁴/4! + ⋯

Thus, any one term λ^y/y! of the Taylor expansion is less than e^λ, so a term of the expansion times e^(−λ) is less than 1. Thus, 0 ≤ p(y) ≤ 1.

b. Show for a Poisson random variable y that Σ_{y=0}^∞ p(y) = 1.

Σ_{y=0}^∞ p(y) = (λ^0 e^(−λ))/0! + (λ^1 e^(−λ))/1! + (λ^2 e^(−λ))/2! + (λ^3 e^(−λ))/3! + ⋯
= e^(−λ)(λ^0/0! + λ^1/1! + λ^2/2! + λ^3/3! + ⋯) = e^(−λ)(e^λ) = 1

As shown above, the terms inside the parentheses are the Taylor expansion for e^λ.


c. E[y(y − 1)] = Σ_{y=0}^∞ y(y − 1)p(y) = Σ_{y=0}^∞ y(y − 1)(λ^y e^(−λ))/y!
= Σ_{y=2}^∞ (λ^y e^(−λ))/(y − 2)! = λ² Σ_{y=2}^∞ (λ^(y−2) e^(−λ))/(y − 2)!

Let z = y − 2. The sum becomes λ² Σ_{z=0}^∞ (λ^z e^(−λ))/z! = λ², since Σ_{z=0}^∞ (λ^z e^(−λ))/z! = 1.

Thus, E[y(y − 1)] = λ² = E(y²) − E(y)
E(y²) = λ² + E(y) = λ² + λ
4.80

m(t) = E(e^(ty)) = Σ_{y=0}^∞ e^(ty)(λ^y e^(−λ))/y! = e^(−λ) Σ_{y=0}^∞ (λe^t)^y/y!

Since Σ_{y=0}^∞ (λe^t)^y/y! is the Taylor expansion of e^(λe^t),

m(t) = e^(−λ) e^(λe^t) = e^(λ(e^t − 1))

4.82

m(t) = pe^t/[1 − (1 − p)e^t] = pe^t/(1 − e^t + pe^t)

μ = dm(t)/dt at t = 0:

dm(t)/dt = [pe^t(1 − e^t + pe^t) − pe^t(−e^t + pe^t)]/(1 − e^t + pe^t)²
= [pe^t − pe^(2t) + p²e^(2t) + pe^(2t) − p²e^(2t)]/(1 − e^t + pe^t)²
= pe^t/(1 − e^t + pe^t)²

At t = 0: μ = p/[1 − 1 + p]² = p/p² = 1/p

μ2′ = d²m(t)/dt² at t = 0:

d²m(t)/dt² = [pe^t(1 − e^t + pe^t)² − pe^t·2(1 − e^t + pe^t)(−e^t + pe^t)]/(1 − e^t + pe^t)⁴
= [pe^t(1 − e^t + pe^t) − 2pe^t(−e^t + pe^t)]/(1 − e^t + pe^t)³

At t = 0: μ2′ = [p(p) − 2p(p − 1)]/p³ = [p + 2(1 − p)]/p² = (2 − p)/p²

σ² = μ2′ − μ² = (2 − p)/p² − 1/p² = (1 − p)/p² = q/p²
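The moment results μ = 1/p and σ² = q/p² can be spot-checked by differentiating the mgf numerically (a sketch using central finite differences at one value of p):

```python
from math import exp

p = 0.48          # any 0 < p < 1; .48 is the value from Exercise 4.54
q = 1 - p
m = lambda t: p * exp(t) / (1 - q * exp(t))  # geometric mgf, valid for q*e^t < 1

h = 1e-4
mu = (m(h) - m(-h)) / (2 * h)            # approximates m'(0)  = 1/p
m2 = (m(h) - 2 * m(0) + m(-h)) / h**2    # approximates m''(0) = (2 - p)/p^2
var = m2 - mu**2                         # approximates q/p^2
```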

4.84

Let y = the number of female fence lizards that will be resting from the sample of 20. Then y is a binomial random variable with n = 20 and p = .95.

a. P(y ≥ 15) = 1 − P(y ≤ 14) = 1 − .0003 = .9997 using Table 2 in Appendix B.

b. P(y < 10) = P(y ≤ 9) = .0000 using Table 2 in Appendix B.

c. μ = np = 200(.95) = 190
σ² = npq = 200(.95)(.05) = 9.5
σ = √9.5 = 3.08
We would expect to observe a number in the interval μ − 2σ to μ + 2σ.
μ − 2σ = 190 − 2(3.08) = 190 − 6.16 = 183.84 ≈ 184
μ + 2σ = 190 + 2(3.08) = 190 + 6.16 = 196.16 ≈ 196
Observing fewer than 190 would be expected based on the interval above.
4.86

a. If the number of respondents with symptoms does not depend on the daily amount of water consumed, each of the 4 categories would be equally likely. Each would have a probability of 1/4 or .25.

b. P(y1 = 6, y2 = 11, y3 = 13, y4 = 10) = (40!/(6!11!13!10!))(.25)^6(.25)^11(.25)^13(.25)^10 = .00104


4.88

Let y be the number of engineers you choose in the sample with experience. y follows a hypergeometric distribution with N = 5, r = 2, and n = 2.

a. P(y = 2) = p(2) = [C(2, 2)C(3, 0)]/C(5, 2) = 1/10 = .10

b. P(y ≥ 1) = p(1) + p(2)
p(1) = [C(2, 1)C(3, 1)]/C(5, 2) = 6/10 = .60
P(y ≥ 1) = .60 + .10 = .70

4.90

a. This is a multinomial experiment with n = 10 and p1 = .20, p2 = .15, p3 = .20, p4 = .30, p5 = .10, and p6 = .05.

p(1, 2, 2, 4, 1, 0) = (10!/(1!2!2!4!1!0!))(.20)^1(.15)^2(.20)^2(.30)^4(.10)^1(.05)^0 = .0055112

b. μi = npi = 100(.15) = 15
σ² = npi(1 − pi) = 100(.15)(.85) = 12.75
σ = √12.75 = 3.571
We expect the number of specimens from the Eocene era to fall within μ ± 2σ.
μ ± 2σ ⇒ 15 ± 2(3.571) ⇒ 15 ± 7.142 ⇒ 7.858 to 22.142

4.92

Let y = number of arrivals in a 1-minute interval. Then y has a Poisson distribution with λ = 1.

a. P(y ≥ 3) = 1 − P(y = 0) − P(y = 1) − P(y = 2)
= 1 − (1^0 e^(−1))/0! − (1^1 e^(−1))/1! − (1^2 e^(−1))/2!
= 1 − .3679 − .3679 − .1839 = .0803

b. P(y > 3) = 1 − P(y = 0) − P(y = 1) − P(y = 2) − P(y = 3)
= .0803 − (1^3 e^(−1))/3! = .0803 − .0613 = .019
Yes. Since the probability of observing more than 3 arrivals is so small (.019), one can assure the engineer that the number of arrivals will rarely exceed 3 per minute.


4.94

a. The sample space, values of x (in thousands of dollars), and associated probabilities are:

Sample Space   x       p(x)
−50, −50       −100    .6(.6) = .36
−50, −20       −70     .6(.1) = .06
−50, 30        −20     .6(.15) = .09
−50, 430       380     .6(.1) = .06
−50, 950       900     .6(.05) = .03
−20, −50       −70     .1(.6) = .06
−20, −20       −40     .1(.1) = .01
−20, 30        10      .1(.15) = .015
−20, 430       410     .1(.1) = .01
−20, 950       930     .1(.05) = .005
30, −50        −20     .15(.6) = .09
30, −20        10      .15(.1) = .015
30, 30         60      .15(.15) = .0225
30, 430        460     .15(.1) = .015
30, 950        980     .15(.05) = .0075
430, −50       380     .1(.6) = .06
430, −20       410     .1(.1) = .01
430, 30        460     .1(.15) = .015
430, 430       860     .1(.1) = .01
430, 950       1380    .1(.05) = .005
950, −50       900     .05(.6) = .03
950, −20       930     .05(.1) = .005
950, 30        980     .05(.15) = .0075
950, 430       1380    .05(.1) = .005
950, 950       1900    .05(.05) = .0025

The probability distribution of x is:

x             p(x)
−100,000      .36
−70,000       .12
−40,000       .01
−20,000       .18
10,000        .03
60,000        .0225
380,000       .12
410,000       .02
460,000       .03
860,000       .01
900,000       .06
930,000       .01
980,000       .015
1,380,000     .01
1,900,000     .0025


b. E(x) = Σ x p(x) = −100,000(.36) − 70,000(.12) − 40,000(.01) + ⋯ + 1,900,000(.0025) = 126,000

V(x) = E(x²) − μ² = Σ x² p(x) − μ²
= (−100,000)²(.36) + (−70,000)²(.12) + (−40,000)²(.01) + ⋯ + 1,900,000²(.0025) − 126,000²
= 122,642,000,000

c. The probability of doubling the $100,000 investment is:
P(x ≥ 200,000) = P(x = 380,000) + P(x = 410,000) + ⋯ + P(x = 1,900,000)
= .12 + .02 + .03 + .01 + .06 + .01 + .015 + .01 + .0025 = .2775

d. The probability of 2 dry holes is P(x = −100,000) = .36. From Exercise 4.93, P(y = −50,000) = .6. Thus, the probability of one dry hole is much greater than the probability of two dry holes.
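Parts a–c can be verified by enumerating the two wells directly (a sketch; per part d, a dry hole is taken as a payoff of −$50,000, with the other per-well payoffs and probabilities as in the table above):

```python
from itertools import product

# Per-well payoff (in $1000s) and probability, as read from the solution.
well = [(-50, 0.60), (-20, 0.10), (30, 0.15), (430, 0.10), (950, 0.05)]

dist = {}
for (x1, p1), (x2, p2) in product(well, repeat=2):
    dist[x1 + x2] = dist.get(x1 + x2, 0.0) + p1 * p2

mean = sum(x * px for x, px in dist.items())               # E(x), in $1000s
var = sum(x**2 * px for x, px in dist.items()) - mean**2   # V(x), in $1000s^2
p_double = sum(px for x, px in dist.items() if x >= 200)   # P(x >= $200,000)
```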

4.96

Let y be the number of the five blips that resulted in enemy aircraft. Then y has a binomial distribution with n = 5 and p = .60.

a. P(y = 5) = C(5, 5)(.60)^5(.40)^0 = .0778

b. P(y ≥ 3) = 1 − P(y ≤ 2) = 1 − .317 = .683 using Table 2 in Appendix B.

c. P(y = 0) = C(5, 0)(.60)^0(.40)^5 = .01024

4.98

Let y be the number of vehicles using the acceleration lane per minute. y has a Poisson distribution with λ = 1.1.

a. P(y > 2) = 1 − P(y ≤ 2) = 1 − [p(0) + p(1) + p(2)]
= 1 − [(1.1^0 e^(−1.1))/0! + (1.1^1 e^(−1.1))/1! + (1.1^2 e^(−1.1))/2!]
= 1 − [.3329 + .3662 + .2014] = 1 − .9005 = .0995

b. P(y = 3) = p(3) = (1.1^3 e^(−1.1))/3! = .0738

4.100

Let y be the number of years before a nuclear war occurs. The random variable y is a geometric random variable with p = .01.

p(y) = (.01)(.99)^(y−1), y = 1, 2, 3, …

a. The probability of a nuclear war occurring in the next 5 years is:
P(y ≤ 5) = p(1) + p(2) + p(3) + p(4) + p(5)
= (.01)(.99)^0 + (.01)(.99)^1 + (.01)(.99)^2 + (.01)(.99)^3 + (.01)(.99)^4
= .01 + .0099 + .009801 + .009703 + .009606 = .049

b. In the next 10 years:
P(y ≤ 10) = P(y ≤ 5) + p(6) + p(7) + p(8) + p(9) + p(10)
= .049 + .0095099 + .0094148 + .0093207 + .0092274 + .0091352 = .0956

c. In the next 15 years:
P(y ≤ 15) = P(y ≤ 10) + p(11) + p(12) + p(13) + p(14) + p(15)
= .0956 + .0090438 + .0089534 + .0088638 + .0087752 + .0086875 = .1399

d. In the next 20 years:
P(y ≤ 20) = P(y ≤ 15) + p(16) + p(17) + p(18) + p(19) + p(20)
= .1399 + .0086006 + .0085146 + .0084294 + .0083451 + .0082617 = .1821

e. The assumption is that the possibility of a nuclear war in any given year is independent of the possibility in all other years. This assumption probably is not valid.

4.102

a. p(x) = (λ^x e^(−λ))/x! = (5^x e^(−5))/x!

P(x > 1) = 1 − P(x ≤ 1) = 1 − [p(0) + p(1)]
= 1 − [(5^0 e^(−5))/0! + (5^1 e^(−5))/1!]
= 1 − [.0067 + .0337] = 1 − .0404 = .9596

b. p(x) = (2.5^x e^(−2.5))/x!

P(x > 1) = 1 − P(x ≤ 1) = 1 − [p(0) + p(1)]
= 1 − [(2.5^0 e^(−2.5))/0! + (2.5^1 e^(−2.5))/1!]
= 1 − [.0821 + .2052] = 1 − .2873 = .7127

c. Let a "success" be an industry exposing its workers to more than 1 ppm of the solvent. Then y, the number of successes in the 55 industries, is a binomial random variable with n = 55 and p = .12.
P(y = 0) = C(55, 0)(.12)^0(.88)^55 = (55!/(0!55!))(.12)^0(.88)^55 = .0008842
We can approximate this using a Poisson distribution since n is large, p is small, and λ = np = 55(.12) = 6.6 < 7.
P(x = 0) = (6.6^0 e^(−6.6))/0! = e^(−6.6) = .001360

d. We need to find λ such that P(x ≤ 1) is close to .88.
For λ = .5: P(x ≤ 1) = .9098
For λ = 1.0: P(x ≤ 1) = .7358
Thus, .5 < λ < 1.0.
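The bracketing above can be refined by bisection, since P(x ≤ 1) = e^(−λ)(1 + λ) decreases in λ (a sketch):

```python
from math import exp

def cdf_le1(lam):
    # P(x <= 1) for a Poisson(lam): p(0) + p(1) = e^(-lam)(1 + lam)
    return exp(-lam) * (1 + lam)

lo, hi, target = 0.5, 1.0, 0.88
for _ in range(60):
    mid = (lo + hi) / 2
    if cdf_le1(mid) > target:
        lo = mid   # cdf_le1 decreases in lam, so the root lies to the right
    else:
        hi = mid
lam = (lo + hi) / 2
```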

4.104

m(t) = (1/5)e^t + (2/5)e^(2t) + (2/5)e^(3t)

a. μ = μ1′ = dm(t)/dt at t = 0

dm(t)/dt = (1/5)e^t + (2/5)(2)e^(2t) + (2/5)(3)e^(3t)

At t = 0: μ = (1/5)e^0 + (4/5)e^0 + (6/5)e^0 = 11/5 = 2.2

b. σ² = μ2′ − (μ1′)²

μ2′ = d²m(t)/dt² at t = 0

d²m(t)/dt² = (1/5)e^t + (4/5)(2)e^(2t) + (6/5)(3)e^(3t)

At t = 0: μ2′ = (1/5)e^0 + (8/5)e^0 + (18/5)e^0 = 27/5 = 5.4

σ² = μ2′ − (μ1′)² = 5.4 − 2.2² = .56
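The mgf results can be cross-checked against the distribution the mgf encodes, namely p(1) = 1/5, p(2) = 2/5, p(3) = 2/5 (a sketch):

```python
# Distribution read off the mgf m(t) = (1/5)e^t + (2/5)e^(2t) + (2/5)e^(3t).
p = {1: 0.2, 2: 0.4, 3: 0.4}

mu = sum(y * py for y, py in p.items())       # E(y)
m2 = sum(y**2 * py for y, py in p.items())    # E(y^2)
var = m2 - mu**2
```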

4.106

a. P(t) = E(t^y) = Σ_{y=0}^∞ t^y (λ^y e^(−λ))/y! = e^(−λ) Σ_{y=0}^∞ (λt)^y/y! = e^(−λ) e^(λt) = e^(λ(t−1))

b. E(y) = μ = dP(t)/dt at t = 1 = λe^(λ(t−1)) at t = 1 = λe^(λ(1−1)) = λ

σ² = E(y²) − μ²
E[y(y − 1)] = E(y²) − E(y), so E(y²) = E[y(y − 1)] + E(y)
E[y(y − 1)] = d²P(t)/dt² at t = 1 = λ²e^(λ(t−1)) at t = 1 = λ²e^(λ(1−1)) = λ²
Thus, E(y²) = E[y(y − 1)] + E(y) = λ² + λ
Therefore, σ² = E(y²) − μ² = λ² + λ − λ² = λ
