
HOMEWORK 7: DUE ON NOV 28

Problem 1. Suppose X is a discrete random variable with probability mass function

P(X = −1) = 1/7,   P(X = 0) = 1/14,   P(X = 2) = 3/14,   P(X = 4) = 4/7.

Find the probability mass function of (X − 1)².

114 Solutions to Chapter 5

5.4. (a) In Example 5.5 we have seen that the moment generating function of a N(µ, σ²) random variable is e^{σ²t²/2 + µt}. Thus if X̃ ∼ N(0, 12) then M_X̃(t) = e^{6t²}, and M_X̃(t) = M_X(t) for |t| < 2. But then by Fact 5.14 the distribution of X is the same as the distribution of X̃.

(b) In Example 5.6 we computed the moment generating function of an Exp(λ) distribution: it is λ/(λ − t) for t < λ and ∞ otherwise. Thus M_Y(t) agrees with the moment generating function of an Exp(2) distribution on the interval (−1/2, 1/2), hence by Fact 5.14 we have Y ∼ Exp(2).

(c) We cannot identify the distribution of Z, as there are many random variables whose moment generating functions are infinite for t ≥ 5. For example, all Exp(λ) distributions with λ < 5 have this property.

(d) We cannot identify the distribution of W, as there are many random variables whose moment generating function equals 2 at t = 2. Here are two examples: if W₁ ∼ N(0, σ²) with σ² = (ln 2)/2, then

M_{W₁}(2) = e^{σ² · 2²/2} = e^{2σ²} = e^{ln 2} = 2.

If W₂ ∼ Poisson(λ) with λ = (ln 2)/(e² − 1), then

M_{W₂}(2) = e^{λ(e² − 1)} = e^{ln 2} = 2.

5.5. We can recognize M_X(t) = e^{3(e^t − 1)} as the moment generating function of a Poisson(3) random variable. Hence P(X = 4) = e^{−3} · 3⁴/4!.

5.6. The possible values of Y = (X − 1)² are 1, 4 and 9. The corresponding probabilities are

P((X − 1)² = 1) = P(X = 0 or X = 2) = P(X = 0) + P(X = 2) = 1/14 + 3/14 = 2/7,

P((X − 1)² = 4) = P(X = −1) = 1/7,

P((X − 1)² = 9) = P(X = 4) = 4/7.
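As a quick sanity check of 5.6 (an addition, not part of the original solutions), the pmf of Problem 1 can be pushed forward through y = (x − 1)² mechanically, using exact fractions:

```python
from fractions import Fraction as F

# pmf of X from Problem 1
pmf_X = {-1: F(1, 7), 0: F(1, 14), 2: F(3, 14), 4: F(4, 7)}

# push the pmf forward through y = (x - 1)^2,
# summing the probabilities of x-values that map to the same y
pmf_Y = {}
for x, p in pmf_X.items():
    y = (x - 1) ** 2
    pmf_Y[y] = pmf_Y.get(y, F(0)) + p

for y in sorted(pmf_Y):
    print(y, pmf_Y[y])
```

The result agrees with the values 2/7, 1/7 and 4/7 computed above.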
Problem 2. Suppose X is Exp(λ). Find the probability density function of Y = ln X.

5.7. The cumulative distribution function of X is F_X(x) = 1 − e^{−λx} for x ≥ 0 and 0 otherwise. Note that X > 0 with probability one, and ln(X) can take values in the whole of R. We have

F_Y(y) = P(Y ≤ y) = P(ln(X) ≤ y) = P(X ≤ e^y) = 1 − e^{−λe^y},

where we used e^y > 0. From this we get

f_Y(y) = F_Y′(y) = (1 − e^{−λe^y})′ = λe^y e^{−λe^y}

for all y ∈ R.
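As an added numerical sanity check (not part of the original solution), the derived density should integrate to 1 over R. Here λ = 2 is an arbitrary test value:

```python
import math

lam = 2.0  # arbitrary test value for the rate parameter

def f_Y(y):
    # density of Y = ln X for X ~ Exp(lam), as derived above
    return lam * math.exp(y) * math.exp(-lam * math.exp(y))

# trapezoidal rule on [-30, 10]; the mass outside this window is negligible
a, b, n = -30.0, 10.0, 400_000
h = (b - a) / n
total = h * (0.5 * (f_Y(a) + f_Y(b)) + sum(f_Y(a + i * h) for i in range(1, n)))
print(round(total, 6))  # ≈ 1.0
```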
Problem 3. Let X ∼ Unif[−1, 2]. Find the probability density function of Y = X².

5.8. We first compute the cumulative distribution function of Y. Since −1 ≤ X ≤ 2, we have 0 ≤ X² ≤ 4, thus F_Y(y) = 1 for y ≥ 4 and F_Y(y) = 0 for y < 0. For 0 ≤ y < 4 we have

F_Y(y) = P(Y ≤ y) = P(X² ≤ y) = P(−√y ≤ X ≤ √y) = F_X(√y) − F_X(−√y).

Differentiating this we get the probability density function:

f_Y(y) = F_Y′(y) = (1/(2√y)) f_X(√y) + (1/(2√y)) f_X(−√y).

The probability density of X is f_X(x) = 1/3 for −1 ≤ x ≤ 2 and zero otherwise. For 0 < y ≤ 1 both f_X(√y) and f_X(−√y) are equal to 1/3, and for 1 < y < 4 we have f_X(√y) = 1/3 and f_X(−√y) = 0. From this we get

f_Y(y) = 1/(3√y)   for 0 < y ≤ 1,
f_Y(y) = 1/(6√y)   for 1 < y < 4,
f_Y(y) = 0          otherwise.
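An added numerical check (not in the original text): the two pieces of the density should carry masses P(0 < Y ≤ 1) = 2/3 and P(1 < Y < 4) = 1/3, since X ∼ Unif[−1, 2]. The midpoint rule sidesteps the singularity at y = 0:

```python
import math

def f_Y(y):
    # piecewise density of Y = X^2 for X ~ Unif[-1, 2], as derived above
    if 0 < y <= 1:
        return 1 / (3 * math.sqrt(y))
    if 1 < y < 4:
        return 1 / (6 * math.sqrt(y))
    return 0.0

def midpoint(f, a, b, n=200_000):
    # composite midpoint rule: never evaluates f at the endpoints
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

p1 = midpoint(f_Y, 0.0, 1.0)
p2 = midpoint(f_Y, 1.0, 4.0)
print(p1, p2)  # close to 2/3 and 1/3
```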
Problem 4. Let X ∼ N(0, 1) and Y = e^X. Find the probability density function of Y. This random variable is called a log-normal random variable and is frequently used in mathematical modeling of asset prices.

5.9. (a) Using the probability mass function of the binomial distribution and the binomial theorem:

M_X(t) = Σ_{k=0}^{n} e^{tk} C(n, k) p^k (1 − p)^{n−k}
       = Σ_{k=0}^{n} C(n, k) (e^t p)^k (1 − p)^{n−k}
       = (e^t p + 1 − p)^n.

5.23. We can notice that M_Y(t) looks very similar to the moment generating function of a Poisson random variable. If X ∼ Poisson(2), then M_X(t) = e^{2(e^t − 1)} and M_Y(t) = M_X(2t). From Exercise 5.21 we see that Y has the same moment generating function as 2X, which means that they have the same distribution. Hence

P(Y = 4) = P(2X = 4) = P(X = 2) = e^{−2} · 2²/2! = 2e^{−2}.
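The closed form for the binomial moment generating function can be verified numerically (an added check; n, p and t below are arbitrary test values):

```python
import math

n, p, t = 7, 0.3, 0.4  # arbitrary test values

# left side: the sum over the binomial pmf, sum_k e^{tk} C(n,k) p^k (1-p)^{n-k}
lhs = sum(math.exp(t * k) * math.comb(n, k) * p**k * (1 - p) ** (n - k)
          for k in range(n + 1))

# right side: the closed form (e^t p + 1 - p)^n from the binomial theorem
rhs = (p * math.exp(t) + 1 - p) ** n

print(lhs, rhs)  # the two agree up to floating-point error
```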
(b) We have

E[X] = M′(0) = n p e^t (p e^t − p + 1)^{n−1} |_{t=0} = np,

E[X²] = M″(0) = [(n − 1) n p² e^{2t} (p e^t − p + 1)^{n−2} + n p e^t (p e^t − p + 1)^{n−1}] |_{t=0} = (n − 1)np² + np.

From these we get Var(X) = E[X²] − (E[X])² = (n − 1)np² + np − n²p² = np(1 − p).

5.10. Using the Binomial Theorem we get

M(t) = (1/5 + (4/5) e^t)^{30} = Σ_{k=0}^{30} C(30, k) (4/5)^k e^{tk} (1/5)^{30−k}.

Since this sum consists of terms of the form p_k e^{tk}, we see that X is discrete. The possible values can be identified with the exponents: these are 0, 1, 2, . . . , 30. The coefficients are the corresponding probabilities:

P(X = k) = C(30, k) (4/5)^k (1/5)^{30−k},   for k = 0, 1, . . . , 30.

We can recognize this as the probability mass function of a binomial distribution with n = 30 and p = 4/5.

5.24. (a) Since Y = e^X > 0, we have F_Y(t) = 0 for t ≤ 0: for such t,

F_Y(t) = P(Y ≤ t) = P(e^X ≤ t) = 0,

since e^x > 0 for all x ∈ R. Next, for any t > 0,

F_Y(t) = P(Y ≤ t) = P(e^X ≤ t) = P(X ≤ ln t) = Φ(ln t).

Differentiating this gives the probability density function for t > 0:

f_Y(t) = (1/t) Φ′(ln t) = (1/t) φ(ln t) = 1/(√(2π) t) · exp(−(ln t)²/2).

For t ≤ 0 the probability density function is 0.

(b) From the definition of Y we get E[Y^n] = E[(e^X)^n] = E[e^{nX}]. Note that E[e^{nX}] = M_X(n) is the moment generating function of X evaluated at n. We computed the moment generating function for X ∼ N(0, 1); it is given by M_X(t) = e^{t²/2}. Thus we have

E[Y^n] = e^{n²/2}.

Problem 5. Let X be an exponential random variable with rate parameter λ = 1/2.
(a) Use Markov's inequality to find an upper bound for P(X > 6).
(b) Use Chebyshev's inequality to find an upper bound for P(X > 6).
(c) Explicitly compute the exact probability P(X > 6) and compare with the upper bounds you derived above.

5.25. We start by expressing the cumulative distribution function F_Y(y) of Y in terms of F_X. Since Y = |X − 1| ≥ 0, we can concentrate on y ≥ 0. We have

F_Y(y) = P(Y ≤ y) = P(|X − 1| ≤ y) = P(−y ≤ X − 1 ≤ y) = P(1 − y ≤ X ≤ 1 + y) = F_X(1 + y) − F_X(1 − y).

(In the last step we used P(X = 1 − y) = 0.) Differentiating the final expression:

f_Y(y) = F_Y′(y) = d/dy (F_X(1 + y) − F_X(1 − y)) = f_X(1 + y) + f_X(1 − y).

We have f_X(x) = 1/5 if −2 ≤ x ≤ 3 and zero otherwise. Considering the various cases: f_X(1 + y) = 1/5 exactly when 0 ≤ y ≤ 2, and f_X(1 − y) = 1/5 exactly when 0 ≤ y ≤ 3, so

f_Y(y) = 2/5 for 0 ≤ y ≤ 2,   f_Y(y) = 1/5 for 2 < y ≤ 3,   f_Y(y) = 0 otherwise.
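As an added numerical cross-check of 5.24 (not in the original text), the log-normal density should reproduce E[Y^n] = e^{n²/2}. A crude midpoint-rule integration over a truncated domain, so only loose agreement is expected:

```python
import math

def f_Y(t):
    # log-normal density from 5.24(a): Y = e^X with X ~ N(0, 1)
    return math.exp(-math.log(t) ** 2 / 2) / (math.sqrt(2 * math.pi) * t)

def moment(n, a=1e-9, b=100.0, steps=500_000):
    # midpoint rule for the n-th moment of Y on the truncated domain [a, b]
    h = (b - a) / steps
    total = 0.0
    for i in range(steps):
        t = a + (i + 0.5) * h
        total += t**n * f_Y(t)
    return total * h

m1, m2 = moment(1), moment(2)
print(m1, math.exp(0.5))  # E[Y]   vs e^{1/2}
print(m2, math.exp(2.0))  # E[Y^2] vs e^{2}
```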
(c) The exact value of P(Y ≥ 16) can be computed, for example, by treating Y as the number of trials needed for the first success in a sequence of independent trials with success probability p. Then

P(Y ≥ 16) = P(first 15 trials all failed) = q^{15} = (5/6)^{15} ≈ 0.0649.

We can see that the estimates in (a) and (b) are valid, although they are not very close to the truth.
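A one-line check of this arithmetic (an addition; q = 5/6 as in the solution):

```python
q = 5 / 6                # failure probability of a single trial
p_tail = q ** 15         # P(first 15 trials all fail) = P(Y >= 16)
print(round(p_tail, 4))  # ≈ 0.0649
```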
9.2. (a) We have E[X] = 1/λ = 2 and X ≥ 0. By Markov's inequality,

P(X > 6) ≤ E[X]/6 = 1/3.

(b) We have E[X] = 1/λ = 2 and Var(X) = 1/λ² = 4. By Chebyshev's inequality,

P(X > 6) = P(X − E[X] > 4) ≤ P(|X − E[X]| > 4) ≤ Var(X)/4² = 4/16 = 1/4.

(c) The actual probability is P(X > 6) = e^{−3} ≈ 0.0498, which is much smaller than either bound.

9.3. Let X_i be the price change between day i − 1 and day i (with day 0 being today). Then C_n − C_0 = X_1 + X_2 + · · · + X_n. The expectation of X_i (for each i) is given by

E[X_i] = E[X_1] = 0.45 · 1 + 0.5 · (−2) + 0.05 · 10 = −0.05.

We can also
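The comparison in 9.2 can be reproduced numerically (an added check; λ = 1/2 as in Problem 5):

```python
import math

lam = 0.5           # rate of the exponential distribution
mean = 1 / lam      # E[X] = 2
var = 1 / lam**2    # Var(X) = 4

markov = mean / 6                  # Markov bound on P(X > 6)
chebyshev = var / (6 - mean) ** 2  # Chebyshev bound via P(|X - E[X]| > 4)
exact = math.exp(-lam * 6)         # exact tail: P(X > 6) = e^{-3}

print(markov, chebyshev, exact)
```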