
Functions of Random Variables

Cyr Emile MLAN, Ph.D.


mlan@stat.uconn.edu


Introduction
Text Reference: Introduction to Probability and Its
Applications, Chapter 7.
Reading Assignment: Sections 7.1-7.6, April 27

Often distributions are known for a random variable X1
or a sequence of random variables X1, X2, ..., Xn, but
what may be of interest is another random variable, U,
which can be expressed as some function of the
random variable X1, e.g., U = h(X1), or some
function of the random variables X1, X2, ..., Xn,
e.g., U = h(X1, X2, ..., Xn). For example:

Conversions and changes of scale (degrees
Fahrenheit to degrees Celsius).

The lifetime T of an electronic system that depends on
the sum X1 + X2 of two components.

Introduction
Let X1, X2, ..., Xn denote a random sample. A
statistic U = h(X1, X2, ..., Xn) is computed from the
sample.

Question:
How do we find the probability distribution of the
random variable U?

The solution is centered around three basic ideas:

The distribution function method
The transformation method
The moment generating function method

The Distribution Function Method

The distribution function method is the most direct
approach, and it is typically used when dealing with
continuous random variables Xi.
1. For each u, find the region U = u in the
   (x1, x2, ..., xn) space.
2. Then find the region A_u, defined in terms of the xi,
   for which U ≤ u.
3. For each u, find F_U(u) = P(U ≤ u) by integrating the
   probability density function f1(x1), or the joint
   probability density function f(x1, x2, ..., xn), over the
   region A_u.
4. The density of U is then obtained by differentiating
   the distribution function F_U(u), that is, f_U(u) = F_U'(u).
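As an illustrative sketch (not from the text), the four steps can be carried out numerically: integrate f_X over the region where h(x) ≤ u to get F_U(u). The helper name `cdf_of_transform` is hypothetical.

```python
import numpy as np

def cdf_of_transform(f_X, h, u, x_grid):
    """F_U(u) = P(h(X) <= u): integrate f_X over {x : h(x) <= u} (Riemann sum)."""
    dx = x_grid[1] - x_grid[0]
    mask = h(x_grid) <= u
    return float(np.sum(f_X(x_grid) * mask) * dx)

# Illustration with Example 7.1's setup: f_X(x) = 6x(1 - x) on (0, 1), U = 2X + 1.
f_X = lambda x: 6 * x * (1 - x)
h = lambda x: 2 * x + 1
x = np.linspace(0.0, 1.0, 200_001)

# F_U(2) = F_X(1/2) = 3(1/2)^2 - 2(1/2)^3 = 0.5
F_at_2 = cdf_of_transform(f_X, h, 2.0, x)
print(F_at_2)
```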

The Distribution Function Method

Example 7.1:
Suppose that a random variable X has probability density
function f(x) = 6x(1 − x), 0 < x < 1. What is the probability
density function of U = 2X + 1?

Solution:
F_X(x) = ∫_0^x 6t(1 − t) dt = 3x² − 2x³

F_U(u) = P(U ≤ u) = P(2X + 1 ≤ u) = P(X ≤ (u − 1)/2)
       = 3((u − 1)/2)² − 2((u − 1)/2)³
for 1 < u < 3.

f_U(u) = 3 · ((u − 1)/2) · ((3 − u)/2) = (3/4)(u − 1)(3 − u)
for 1 < u < 3.
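A quick Monte Carlo sanity check of Example 7.1 (an illustrative sketch, not from the text): the density 6x(1 − x) is the Beta(2, 2) density, so we can draw X directly, set U = 2X + 1, and compare the sample mean against the mean of the derived density (3/4)(u − 1)(3 − u) on (1, 3), which is 2.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.beta(2, 2, size=200_000)   # density 6x(1 - x) on (0, 1)
U = 2 * X + 1

# E[U] = 2 E[X] + 1 = 2(1/2) + 1 = 2, matching the derived density's mean
print(U.mean())
```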

The Distribution Function Method

Example 7.2:
Suppose that a continuous random variable X has probability
density function f(x). What is the probability density function of
U = aX + b, where a > 0 and b are constants?

Solution:
F_U(u) = P(U ≤ u) = P(aX + b ≤ u) = P(X ≤ (u − b)/a) = F_X((u − b)/a)

f_U(u) = F_U'(u) = (1/a) f_X((u − b)/a)

Remember to determine the two endpoints of the interval over which
f_U(u) integrates to 1.

In general, f_U(u) = (1/|a|) f_X((u − b)/a) for any a ≠ 0.
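The general rule of Example 7.2 translates directly into code. This is a sketch with hypothetical helper names, checked against a case where the answer is known: if X ~ N(0, 1) and U = 2X + 3, then U ~ N(3, 4).

```python
import math

def linear_transform_density(f_X, a, b):
    """Density of U = aX + b: f_U(u) = (1/|a|) f_X((u - b)/a), for any a != 0."""
    if a == 0:
        raise ValueError("a must be nonzero")
    return lambda u: f_X((u - b) / a) / abs(a)

# Check: X ~ N(0, 1), U = 2X + 3, so U ~ N(3, 4).
std_normal = lambda x: math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
f_U = linear_transform_density(std_normal, 2.0, 3.0)
# f_U(3) should equal the N(3, 4) density at its mean, 1/(2 sqrt(2 pi))
print(f_U(3.0))
```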

Continuous Random Variables

Example 7.3:
The four-day use of CPU time in hours, X, of an accounting
firm approximately follows the probability density function

f(x) = (3/64) x²(4 − x),  0 ≤ x ≤ 4
f(x) = 0,  elsewhere

Find the distribution of U = 1 − X/4.

Solution:
By Example 7.2 with a = −1/4 and b = 1, f_U(u) = 4 f_X(4(1 − u)). Thus,

f_U(u) = 12 u(1 − u)²,  0 ≤ u ≤ 1
f_U(u) = 0,  elsewhere

Hence, U ∼ Beta(α = 2, β = 3).

The Distribution Function Method

Example 7.4:
The velocity of a gas molecule with mass m is a random
variable X with probability density function f(x) = a x² e^(−bx²) for
x > 0, where a and b are constants. What is the probability density
function of the kinetic energy U = (m/2)X²?

Solution:
F_U(u) = P((m/2) X² ≤ u) = P(X ≤ √(2u/m)) = F_X(√(2u/m))

f_U(u) = F_U'(u) = (1/√(2mu)) f_X(√(2u/m))
       = (1/√(2mu)) · a (2u/m) e^(−2bu/m)
       = (2^(1/2) a / m^(3/2)) u^(3/2 − 1) e^(−2bu/m)

Hence, U ∼ Gam(α = 3/2, β = m/(2b)).

Joke

Patient: "Doctor, what I need is
something to stir me up.
Something to put me in fighting trim.
Did you put anything like that in this
prescription?"
Doctor: "No. You will find that in the bill."

The Distribution Function Method

Example 7.5:
Let X be a continuous random variable with density f_X.
Consider U = X². Find the density function f_U. In
particular, find f_U when f_X(x) = (x + 1)/2 for −1 ≤ x ≤ 1.

Solution:
For u ≤ 0, F_U(u) = P(U ≤ u) = P(X² ≤ u) = 0.
For u > 0, F_U(u) = P(U ≤ u) = P(X² ≤ u)
                  = P(−√u ≤ X ≤ √u) = F_X(√u) − F_X(−√u)

f_U(u) = F_U'(u) = (1/(2√u)) [f_X(√u) + f_X(−√u)]
for u > 0, and zero elsewhere.

For the given density,
f_U(u) = (1/(2√u)) [(√u + 1)/2 + (−√u + 1)/2] = 1/(2√u)
for 0 < u ≤ 1.

The Distribution Function Method

Example 7.6:
Suppose that X1 and X2 are two independent
exponential random variables, each with the same
parameter β. Let U = X1 + X2. Find the probability
density function of U.

Solution:
The random variable U takes positive real values.
We have:

F_U(u) = P(X1 + X2 ≤ u) = ∫_0^u ∫_0^(u−x2) (1/β) e^(−x1/β) (1/β) e^(−x2/β) dx1 dx2

f_U(u) = F_U'(u) = ∫_0^u (1/β²) e^(−x2/β) e^(−(u−x2)/β) dx2
       = (1/β²) u e^(−u/β) = (1/(Γ(2) β²)) u^(2−1) e^(−u/β)

Hence, U ∼ Gam(α = 2, β).
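The convolution integral of Example 7.6 can be checked numerically (an illustrative sketch, not from the text): a left Riemann sum of the convolution of the two exponential densities should match the Gamma(α = 2, β) density u e^(−u/β)/β² at a chosen point.

```python
import math

beta = 2.0   # common mean of the two exponentials
u = 3.0      # point at which to evaluate the density

# f_U(u) = integral_0^u (1/b) e^{-x/b} (1/b) e^{-(u-x)/b} dx, via left Riemann sum
n = 100_000
dx = u / n
conv = sum(
    (1 / beta) * math.exp(-x / beta) * (1 / beta) * math.exp(-(u - x) / beta)
    for x in (i * dx for i in range(n))
) * dx

# Gamma(alpha = 2, beta) density: u e^{-u/beta} / beta^2
gamma_density = u * math.exp(-u / beta) / beta**2
print(conv, gamma_density)
```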

The Distribution Function Method

Example 7.7:
Two friends plan to meet at the library during a given 1-hour
period. Their arrival times are independent and uniformly
distributed across the 1-hour period. Each agrees to wait 15
minutes, or until the end of the hour; if the other friend does not
appear during that time, she will leave. What is the probability
that the two friends will meet?

Solution:
Let X1 and X2 represent each person's arrival time in
(0, 1). Then, by independence, we have:

f_{X1,X2}(x1, x2) = f_{X1}(x1) f_{X2}(x2) = 1,  0 ≤ x1, x2 ≤ 1
f_{X1,X2}(x1, x2) = 0,  elsewhere

The Distribution Function Method

The event that the two friends meet depends on the
time U between their arrivals, i.e., U = |X1 − X2|. The
random variable U takes values in the interval (0, 1).
Let 0 ≤ u ≤ 1. We have:

F_U(u) = 1 − P(|X1 − X2| > u) = 1 − 2 ∫_u^1 ∫_0^(x1−u) dx2 dx1
       = 1 − 2 ∫_u^1 (x1 − u) dx1
       = 1 − (1 − u)²

Hence, P(U ≤ 1/4) = 1 − (3/4)² = 7/16.
There is less than a 50-50 chance that the two friends will
meet under this rule.
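The 7/16 answer in Example 7.7 is easy to confirm by simulation (an illustrative sketch, not from the text): draw both arrival times uniformly on (0, 1) and count how often they fall within 15 minutes of each other.

```python
import random

random.seed(7)
trials = 200_000
# The friends meet when |X1 - X2| <= 1/4
meet = sum(abs(random.random() - random.random()) <= 0.25 for _ in range(trials))
p_meet = meet / trials
print(p_meet)   # close to 7/16 = 0.4375
```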

The Distribution Function Method

Example 7.8:
Suppose that X1 and X2 are two independent binomial
random variables, each with the same probability of
success, p, but with n and m trials, respectively. Let
U = X1 + X2. Find the probability mass function of U.

Solution:
The random variable U takes integer values from 0
to n + m.
We have:

P(U = u) = Σ_{x1+x2=u} C(n, x1) p^x1 (1 − p)^(n−x1) · C(m, x2) p^x2 (1 − p)^(m−x2)
         = Σ_{x1+x2=u} C(n, x1) C(m, x2) p^(x1+x2) (1 − p)^(n+m−(x1+x2))
         = C(n + m, u) p^u (1 − p)^(n+m−u)

by Vandermonde's identity, where C(n, k) denotes the binomial coefficient.
Hence, U ∼ Binomial(n + m, p).
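The convolution in Example 7.8 can be verified exactly (an illustrative sketch, not from the text): convolving the Binomial(n, p) and Binomial(m, p) pmfs term by term reproduces the Binomial(n + m, p) pmf to floating-point precision.

```python
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p) ** (n - k)

n, m, p = 4, 6, 0.3
# Convolution: P(U = u) = sum over x1 of P(X1 = x1) P(X2 = u - x1)
conv = [
    sum(binom_pmf(x1, n, p) * binom_pmf(u - x1, m, p)
        for x1 in range(max(0, u - m), min(n, u) + 1))
    for u in range(n + m + 1)
]
direct = [binom_pmf(u, n + m, p) for u in range(n + m + 1)]
print(max(abs(a - b) for a, b in zip(conv, direct)))
```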

The Distribution Function Method

Example 7.9:
A quality control manager samples from a large lot of items, testing
each item until r defective items have been found. Find the
distribution of U, the number of items that are tested to obtain r
defective items.

Solution:
Let X denote the number of good items sampled prior to the r-th
defective one and U the number of trials required to get r defective
items. We have U = X + r. Assuming that the probability p of
obtaining a defective item is constant from trial to trial, we have
X ∼ NB(r, p).
For u = r, r + 1, ..., we have

P(U = u) = P(X + r = u) = P(X = u − r) = C(u − 1, r − 1) p^r q^(u−r)

Joke

I never even believed in divorce
until after the first month of my
marriage.

The Method of Transformations

Suppose U = h(X) and h is strictly increasing. Then

F_U(u) = P(U ≤ u) = P(h(X) ≤ u) = P(X ≤ h⁻¹(u)) = F_X(h⁻¹(u))

f_U(u) = F_U'(u) = f_X(h⁻¹(u)) · d h⁻¹(u)/du.

If h is strictly decreasing, then

F_U(u) = P(U ≤ u) = P(h(X) ≤ u) = P(X ≥ h⁻¹(u)) = 1 − F_X(h⁻¹(u))

f_U(u) = F_U'(u) = −f_X(h⁻¹(u)) · d h⁻¹(u)/du.

The Method of Transformations

Theorem 7.1:
If h(x) is either strictly increasing or strictly decreasing
for all x with f_X(x) > 0, then the random variable U = h(X)
has density

f_U(u) = f_X(h⁻¹(u)) · |d h⁻¹(u)/du|.
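Theorem 7.1 can be written as a small generic helper (a sketch with hypothetical names, not from the text): given f_X, the inverse of h, and that inverse's derivative, return f_U. It is checked here on the exponential square-root transform treated in Example 7.10.

```python
import math

def transform_density(f_X, h_inv, dh_inv):
    """Theorem 7.1: f_U(u) = f_X(h^{-1}(u)) * |d h^{-1}(u)/du| for monotone h."""
    return lambda u: f_X(h_inv(u)) * abs(dh_inv(u))

# X ~ Exponential with mean beta, U = sqrt(X); h^{-1}(u) = u^2, derivative 2u.
beta = 1.5
f_X = lambda x: math.exp(-x / beta) / beta
f_U = transform_density(f_X, lambda u: u * u, lambda u: 2 * u)
# Expected: f_U(u) = (2u/beta) e^{-u^2/beta} for u > 0
print(f_U(1.0))
```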

The Method of Transformations

Example 7.10:
Suppose that the random variable X has an exponential
distribution with mean β. Find the probability density of
the random variable U = √X.

Solution:
h(x) = √x, so h⁻¹(u) = u² and d h⁻¹(u)/du = 2u.

f_U(u) = f_X(h⁻¹(u)) · |d h⁻¹(u)/du| = (2u/β) e^(−u²/β),  u > 0

The Method of Conditioning

If U = h(X1, X2), how can we use the transformation method
to find f_U(u)?

Given X1 = x1, U = h(x1, X2) is a function of X2 only.
For the time being, think of it as h*(X2).
The conditional density of X2 given X1 = x1 is f2(x2|x1).

Use the transformation method on f2(x2|x1), with h* as
the transformation. This gives a conditional
density f_U(u|X1 = x1) for each x1.

Compute the expectation E[f_U(u|X1)] with respect to
the density of X1, that is,

f_U(u) = E[f_U(u|X1)] = ∫ f_U(u|x1) f1(x1) dx1.

The Method of Conditioning

Why does the proposed idea work?

P(U ≤ u) = P(h(X1, X2) ≤ u) = ∫ P(h(x1, X2) ≤ u | X1 = x1) f_{X1}(x1) dx1
         = ∫ [ ∫_{−∞}^u f_U(t|X1 = x1) dt ] f_{X1}(x1) dx1

Hence,

f_U(u) = d P(U ≤ u)/du = ∫ f_U(u|X1 = x1) f_{X1}(x1) dx1

The Method of Conditioning

Example 7.11: Example 7.6 revisited
Suppose that X1 and X2 are two independent
exponential random variables, each with the same
parameter β. Let U = X1 + X2. Find the probability
density function of U.

Solution:
h*(x2) = x1 + x2, so h*⁻¹(u) = u − x1 and d h*⁻¹(u)/du = 1.

f_U(u|X1 = x1) = f_{X2}(h*⁻¹(u)) · |d h*⁻¹(u)/du|
               = (1/β) e^(−(u−x1)/β)   for 0 < x1 < u

f_U(u) = ∫_0^u (1/β) e^(−(u−x1)/β) (1/β) e^(−x1/β) dx1 = (1/β²) u e^(−u/β)

Joke

I've been drinking a lot, and a friend of
mine suggested that I go to an alcohol
therapy session organized by a local
activist. When I got there, they had a
pamphlet near the door that listed the
ten signs that you may have a drinking
problem. One of the items on the
pamphlet said, "Does drinking affect
your family?" I'm thinking, "Hell, drinking
started my family."

The Method of Moments

The moment generating function is defined for a
random variable X as m(t) = E(e^{tX}).

Recall that there is a one-to-one correspondence
between moment generating functions and distribution
functions:
If two random variables U and V have moment
generating functions m_U(t) and m_V(t) satisfying
m_U(t) = m_V(t) for all values of t, then U and V have the
same probability distribution.

Let U = h(X1, X2, ..., Xn). In practice, we compute the
moment generating function of U and try to match it
with the moment generating function of one of the
well-known distributions we have discussed before.

The Method of Moments

Example 7.12:
Suppose that X ∼ Beta(α, 1) and U = −ln(X). Find the
distribution of U.

Solution:
m_U(t) = E(e^{tU}) = E(e^{−t ln(X)}) = E(X^{−t})
       = (1/B(α, 1)) ∫_0^1 x^{−t} x^{α−1} dx = B(α − t, 1)/B(α, 1)
       = α/(α − t) = 1/(1 − t/α)

Hence, U ∼ Exp(1/α).

The Method of Moments

The moment generating function method finds its best
applications with sums of n independent random variables
X1, X2, ..., Xn, that is, U = X1 + X2 + ... + Xn. In this case,

m_U(t) = m_{X1}(t) m_{X2}(t) ... m_{Xn}(t)

Example 7.13:
Suppose that X1, X2, ..., Xn are independent and
identically distributed as an exponential distribution with
mean β. Find the distribution of
U = X1 + X2 + ... + Xn.

Solution:
m_U(t) = (1/(1 − βt)) (1/(1 − βt)) ... (1/(1 − βt)) = 1/(1 − βt)^n

Hence, U ∼ Gam(α = n, β).
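A simulation check of Example 7.13 (an illustrative sketch, not from the text): a sum of n i.i.d. exponentials with mean β should match Gamma(α = n, β) in its first two moments, E[U] = nβ and Var(U) = nβ².

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta = 5, 2.0
# Each row is one realization of (X1, ..., Xn); sum across the row gives U
U = rng.exponential(scale=beta, size=(200_000, n)).sum(axis=1)
print(U.mean(), U.var())   # near n*beta = 10 and n*beta^2 = 20
```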

The Method of Moments

Example 7.14:
Suppose that X1, X2, ..., Xn are independent with
Xi ∼ Gam(αi, β), i = 1, 2, ..., n. Find the
distribution of U = X1 + X2 + ... + Xn.

Solution:
m_U(t) = (1/(1 − βt)^{α1}) (1/(1 − βt)^{α2}) ... (1/(1 − βt)^{αn})
       = 1/(1 − βt)^{α1 + α2 + ... + αn}

Hence, U ∼ Gam(α = α1 + α2 + ... + αn, β).

Example 7.15: Example 7.8 revisited

Solution:
m_U(t) = (p e^t + q)^n (p e^t + q)^m = (p e^t + q)^{n+m}

Hence, U ∼ Binomial(n + m, p).

The Method of Moments

Theorem 7.2:
Let X1, X2, ..., Xn be independent normally distributed random
variables with E(Xi) = μi and Var(Xi) = σi², for i = 1, 2, ..., n, and let
a1, a2, ..., an be constants. If

U = Σ_{i=1}^n ai Xi = a1 X1 + a2 X2 + ... + an Xn,

then U is a normally distributed random variable with mean

E(U) = Σ_{i=1}^n ai μi = a1 μ1 + a2 μ2 + ... + an μn

and variance

Var(U) = Σ_{i=1}^n ai² σi² = a1² σ1² + a2² σ2² + ... + an² σn².
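Theorem 7.2 is easy to illustrate by simulation (a sketch, not from the text): for U = a1 X1 + a2 X2 with independent normals, the sample mean and variance should be close to a1 μ1 + a2 μ2 and a1² σ1² + a2² σ2².

```python
import numpy as np

rng = np.random.default_rng(1)
a1, a2 = 2.0, -1.0
X1 = rng.normal(1.0, 3.0, size=300_000)   # mu1 = 1, sigma1 = 3
X2 = rng.normal(4.0, 2.0, size=300_000)   # mu2 = 4, sigma2 = 2
U = a1 * X1 + a2 * X2

# E[U] = 2*1 - 1*4 = -2;  Var(U) = 4*9 + 1*4 = 40
print(U.mean(), U.var())
```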

The Method of Moments

The proof is as follows.
The moment generating function of Xi is given by

M_{Xi}(t) = exp(μi t + σi² t²/2).

Hence, the moment generating function of U is

M_U(t) = M_{X1}(a1 t) M_{X2}(a2 t) ... M_{Xn}(an t)
       = exp(a1 t μ1 + (a1 t)² σ1²/2) exp(a2 t μ2 + (a2 t)² σ2²/2)
         ... exp(an t μn + (an t)² σn²/2)
       = exp( t Σ_{i=1}^n ai μi + (t²/2) Σ_{i=1}^n ai² σi² )

This is the moment generating function of a normal random variable
with the stated mean and variance.

The Method of Moments

Theorem 7.3:
Let X1, X2, ..., Xn be independent normally distributed random
variables with E(Xi) = μi and Var(Xi) = σi², for i = 1, 2, ..., n, and
define

Zi = (Xi − μi)/σi,  i = 1, 2, ..., n.

Then Σ_{i=1}^n Zi² has a χ² distribution with n degrees of freedom,
that is, a Gamma distribution with parameters α = n/2 and β = 2.
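A simulation check of Theorem 7.3 (an illustrative sketch, not from the text): the sum of n squared standard normals should have mean n and variance 2n, matching the χ²(n), i.e. Gamma(n/2, 2), distribution.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
Z = rng.normal(size=(200_000, n))   # standardized normals
S = (Z ** 2).sum(axis=1)            # chi-square(n) sums
print(S.mean(), S.var())   # near n = 4 and 2n = 8
```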

Joke

I try too hard to be politically
correct. Whenever I fill out an
application for a credit card,
under marital status, I write
"pre-owned."
