
3.8 Combining Random Variables


Suppose that X and Y are independent random variables.

Let W = X + Y. Then

    P_W(w) = Σ_{all x} P_X(x) P_Y(w − x)          if discrete
    f_W(w) = ∫_{−∞}^{∞} f_X(x) f_Y(w − x) dx      if continuous

Let W = Y/X. Then

    f_W(w) = ∫_{−∞}^{∞} |x| f_X(x) f_Y(wx) dx

Let W = XY. Then

    f_W(w) = ∫_{−∞}^{∞} (1/|x|) f_X(w/x) f_Y(x) dx

Note: the limits of integration and summation are defined
by the domain/range of x and y.
3.11 Conditional Densities

P_{Y|x}(y) = P(Y = y | X = x) = P_{X,Y}(x, y) / P_X(x)

P_{Y|x}(y) and P_{X|y}(x) both represent valid pdfs.

If X and Y are independent, note that

    P_{Y|x}(y) = P_Y(y)
    P_{X|y}(x) = P_X(x)
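
A minimal check of the conditional-pmf formula on a small joint table; the table values below are made up purely for illustration.

```python
# Sketch: P_{Y|x}(y) = P_{X,Y}(x, y) / P_X(x) on a small, made-up joint pmf.
from fractions import Fraction

joint = {(0, 0): Fraction(1, 10), (0, 1): Fraction(3, 10),
         (1, 0): Fraction(2, 10), (1, 1): Fraction(4, 10)}   # P_{X,Y}(x, y)

def p_x(x):
    # marginal pmf of X
    return sum(p for (xi, _), p in joint.items() if xi == x)

def p_y_given_x(y, x):
    return joint[(x, y)] / p_x(x)

print(p_y_given_x(1, 0))                      # (3/10) / (4/10) = 3/4
print(p_y_given_x(0, 0) + p_y_given_x(1, 0))  # 1 -> a valid pmf
```
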
3.12 Moment-Generating Functions

The moment-generating function for X is denoted by

    M_X(t) = E(e^(tX))

The rth moment of W about the origin, μ_r:

    μ_r = E(W^r)
    M_W^(r)(0) = E(W^r)

E(X^r) is the coefficient of t^r/r! in the Maclaurin series
expansion of M_X(t).

• Uniqueness Theorem – If the mgf of a variable is equal to
  the mgf of a known random variable for a given interval,
  then the random variables have equal pdfs.

• Let W be a random variable with mgf M_W(t). Let V = aW + b.
  Then

      M_V(t) = e^(bt) M_W(at)

• Let W_1, W_2, ..., W_n be independent random variables with
  mgf's M_W1(t), M_W2(t), ..., M_Wn(t) respectively. Let
  W = W_1 + W_2 + ... + W_n. Then

      M_W(t) = ∏_{i=1}^{n} M_Wi(t)
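
As an illustration of reading moments off an mgf, the sympy sketch below differentiates the binomial mgf M_X(t) = (1 − p + p e^t)^n at t = 0 and recovers E(X) = np; the use of sympy here is an illustrative choice, not something prescribed by the sheet.

```python
# Sketch: E(X) = M_X'(0) for the binomial mgf M_X(t) = (1 - p + p*exp(t))**n.
import sympy as sp

t, n, p = sp.symbols("t n p", positive=True)
M = (1 - p + p * sp.exp(t)) ** n          # binomial mgf

first_moment = sp.diff(M, t).subs(t, 0)   # M'(0) = E(X)
print(sp.simplify(first_moment))          # n*p
```
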
Special Distributions

Binomial Random Variable

pdf: P_X(x) = C(n, x) p^x (1 − p)^(n−x)        Abbreviation: X ~ bin(n, p)
μ = E(X) = np        Var(X) = np(1 − p)
M_X(t) = (1 − p + p e^t)^n

When to use: considering the number of successes in n trials.
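
A quick numeric sanity check of the binomial pmf, mean, and variance; the parameters n = 10 and p = 0.3 are arbitrary illustrative choices.

```python
# Sketch: binomial pmf, mean, and variance for arbitrary n = 10, p = 0.3.
from math import comb

n, p = 10, 0.3
pmf = [comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

mean = sum(x * px for x, px in enumerate(pmf))
var = sum(x**2 * px for x, px in enumerate(pmf)) - mean**2

print(round(mean, 6), n * p)            # 3.0  3.0  -> E(X) = np
print(round(var, 6), n * p * (1 - p))   # 2.1  2.1  -> Var(X) = np(1-p)
```
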

Poisson Distribution

pdf: P_X(k) = P(X = k) = e^(−λ) λ^k / k!        Abbreviation: X ~ poi(λ)
μ = E(X) = Var(X) = λ        M_X(t) = e^(−λ + λe^t)

When to use: considering the number of outcomes in a given interval.

Note: C(n, k) p^k (1 − p)^(n−k) ≈ e^(−np) (np)^k / k!
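
The sketch below checks the Poisson approximation to the binomial pmf noted above for one arbitrary large-n, small-p choice of n, p, and k (values not from the original sheet).

```python
# Sketch: binomial pmf vs. its Poisson approximation e^{-np} (np)^k / k!
# for an arbitrary large-n, small-p case.
from math import comb, exp, factorial

n, p, k = 500, 0.01, 3          # arbitrary illustrative values
binom_pk = comb(n, k) * p**k * (1 - p)**(n - k)
poisson_pk = exp(-n * p) * (n * p)**k / factorial(k)

print(round(binom_pk, 5), round(poisson_pk, 5))   # both ≈ 0.140
```
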

Exponential Distribution

f_Y(y) = λ e^(−λy), y > 0        Abbreviation: Y ~ exp(λ)
μ = E(Y) = 1/λ        Var(Y) = 1/λ²
M_Y(t) = λ/(λ − t), t < λ

When to use: considering the time between 2 consecutive events.
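
A small simulation check of the exponential mean and variance; the rate λ = 2 is arbitrary, and note that numpy's sampler is parameterized by the scale 1/λ rather than the rate.

```python
# Sketch: simulate Y ~ exp(λ) with λ = 2 and compare sample moments
# to E(Y) = 1/λ and Var(Y) = 1/λ².  numpy uses scale = 1/λ.
import numpy as np

lam = 2.0
rng = np.random.default_rng(0)
y = rng.exponential(scale=1 / lam, size=200_000)

print(round(y.mean(), 3), 1 / lam)      # ≈ 0.5   0.5
print(round(y.var(), 3), 1 / lam**2)    # ≈ 0.25  0.25
```
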

Normal Distribution

Standard Normal Distribution:

cdf: P(Z ≤ t) = Φ(t) = (1/√(2π)) ∫_{−∞}^{t} e^(−u²/2) du
pdf: f_Z(z) = (1/√(2π)) e^(−z²/2)
M_Z(t) = e^(t²/2)

• symmetric about the y-axis
• Φ(t) is the area under f_Z from −∞ to t
• Φ(∞) = 1 and Φ(0) = ½
• Φ(−t) = 1 − Φ(t)
• P(a ≤ Z ≤ b) = Φ(b) − Φ(a)
• E(Z) = 0 and σ² = 1

General Normal Distribution:

pdf: f_X(x) = (1/(√(2π) σ)) e^(−(1/2)((x−μ)/σ)²)        Abbreviation: X ~ N(μ, σ²)

• Standardization: If X ~ N(μ, σ²), then Z = (X − μ)/σ ~ N(0, 1).
• De Moivre-Laplace Theorem: Let X be a binomial random
  variable defined on n independent trials for which
  p = P(success). For any numbers a and b,

      lim_{n→∞} P(a ≤ (X − np)/√(np(1−p)) ≤ b) = (1/√(2π)) ∫_a^b e^(−t²/2) dt

  Use only when n > 9p/(1 − p) or n > 9(1 − p)/p.

• Correction for Continuity: Let X be a discrete random
  variable with pdf p(x). To approximate the probability that
  X is within a given interval, we need to make the following
  corrections for continuity:

      P(i ≤ X ≤ j) ≈ ∫_{i−1/2}^{j+1/2} f(x) dx
      P(X = k)     ≈ ∫_{k−1/2}^{k+1/2} f(x) dx
      P(X ≥ i)     ≈ ∫_{i−1/2}^{∞} f(x) dx
      P(X ≤ j)     ≈ ∫_{−∞}^{j+1/2} f(x) dx
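
The sketch below compares an exact binomial probability with its normal approximation, with and without the continuity correction; n = 40, p = 0.4, and the interval are arbitrary illustrative values, and scipy.stats is used only for the exact binomial and normal cdfs.

```python
# Sketch: normal approximation to P(12 <= X <= 20) for X ~ bin(40, 0.4),
# with and without the continuity correction.  Values are illustrative.
from scipy.stats import binom, norm

n, p = 40, 0.4
mu, sd = n * p, (n * p * (1 - p)) ** 0.5
i, j = 12, 20

exact = binom.cdf(j, n, p) - binom.cdf(i - 1, n, p)
plain = norm.cdf((j - mu) / sd) - norm.cdf((i - mu) / sd)
corrected = norm.cdf((j + 0.5 - mu) / sd) - norm.cdf((i - 0.5 - mu) / sd)

print(round(exact, 4), round(plain, 4), round(corrected, 4))
# the continuity-corrected value is typically the closer of the two approximations
```
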

Central Limit Theorem

Let X_1, X_2, ... be a sequence of iid random variables, each
with expectation μ and variance σ². Then

    Z_n = (X_1 + X_2 + ... + X_n − nμ) / (σ√n)

converges to the distribution of a standard normal
random variable.

Remark: Dividing the numerator and denominator by n,
we can deduce from the CLT that

    Z_n = (X̄ − μ) / (σ/√n)

converges to the distribution of a standard normal
random variable.
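
A minimal simulation of the CLT statement: standardized sums of iid uniform(0, 1) variables (an arbitrary choice of distribution, with μ = 1/2 and σ² = 1/12) should behave like a standard normal.

```python
# Sketch: CLT check — standardized sums of iid uniform(0, 1) draws
# compared against the standard normal cdf.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, reps = 50, 100_000
mu, sigma = 0.5, (1 / 12) ** 0.5

x = rng.uniform(0, 1, size=(reps, n))
z = (x.sum(axis=1) - n * mu) / (sigma * n**0.5)

# Compare an empirical probability with Φ(1.0).
print(round((z <= 1.0).mean(), 3), round(norm.cdf(1.0), 3))   # ≈ 0.841  0.841
```
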

• If Y ~ N(μ, σ²), then M_Y(t) = e^(μt + σ²t²/2)

• Let Y_1 ~ N(μ_1, σ_1²) and Y_2 ~ N(μ_2, σ_2²), and Y = Y_1 + Y_2.
  If Y_1 and Y_2 are independent, then Y ~ N(μ_1 + μ_2, σ_1² + σ_2²).

• If Y_1, ..., Y_n are iid N(μ, σ²), then Ȳ = (1/n) Σ_{i=1}^{n} Y_i ~ N(μ, σ²/n).

• For independent Y_i ~ N(μ_i, σ_i²),
  Y = a_1Y_1 + a_2Y_2 + ... + a_nY_n ~ N(Σ_{i=1}^{n} a_iμ_i, Σ_{i=1}^{n} a_i²σ_i²).

ex. X ~ N(1, 2) and Y ~ N(2, 4):

    2X + 3Y ~ N( 2(1) + 3(2), 2²(2) + 3²(4) ) = N(8, 44)
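
A quick simulation check of the 2X + 3Y example above; the sample size and seed are arbitrary, and numpy's normal sampler takes the standard deviation rather than the variance.

```python
# Sketch: simulate 2X + 3Y with X ~ N(1, 2), Y ~ N(2, 4) independent,
# and compare sample mean/variance to N(8, 44).
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(1, 2**0.5, size=200_000)   # second argument is the std dev
y = rng.normal(2, 4**0.5, size=200_000)
w = 2 * x + 3 * y

print(round(w.mean(), 2), round(w.var(), 2))   # ≈ 8.0  44.0
```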

Gamma Distribution

pdf: f_Y(y) = (λ^r / (r−1)!) y^(r−1) e^(−λy), y > 0        Abbreviation: Y ~ gamma(r, λ)

M_Y(t) = (λ/(λ − t))^r = (1/(1 − t/λ))^r
E(Y) = r/λ
Var(Y) = r/λ²

• The sum of r iid exp(λ) random variables is gamma(r, λ).
• The sum of independent gamma(r, λ) and gamma(s, λ) random variables
  is gamma(r + s, λ).

When to use: considering a set of exponential random
variables; looking for the probability of the rth occurrence.
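
A short simulation of the "sum of r iid exp(λ) is gamma(r, λ)" fact; r = 3 and λ = 2 are arbitrary, and scipy's gamma distribution is parameterized by shape a = r and scale = 1/λ.

```python
# Sketch: the sum of r iid exp(λ) draws should match gamma(r, λ).
# scipy parameterizes gamma by shape a = r and scale = 1/λ.
import numpy as np
from scipy.stats import gamma

r, lam = 3, 2.0
rng = np.random.default_rng(3)
sums = rng.exponential(scale=1 / lam, size=(200_000, r)).sum(axis=1)

print(round(sums.mean(), 3), r / lam)     # ≈ 1.5   1.5  -> E(Y) = r/λ
print(round(sums.var(), 3), r / lam**2)   # ≈ 0.75  0.75 -> Var(Y) = r/λ²
print(round((sums <= 2.0).mean(), 3),
      round(gamma.cdf(2.0, a=r, scale=1 / lam), 3))   # empirical vs. exact cdf at y = 2
```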