
Conditional Probability

Conditional probability: for events E and F,

P(E \mid F) = \frac{P(EF)}{P(F)}

Conditional probability mass function (pmf):

p_{X|Y}(x \mid y) = P\{X = x \mid Y = y\} = \frac{P\{X = x, Y = y\}}{P\{Y = y\}} = \frac{p(x, y)}{p_Y(y)},

defined for y such that p_Y(y) > 0.

Conditional expectation of X given Y = y:

E[X \mid Y = y] = \sum_x x \, p_{X|Y}(x \mid y)

If X and Y are independent, then E[X | Y = y] = E[X].
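To make the definitions concrete, the short Python sketch below computes p_{X|Y}(. | y) and E[X | Y = y] from a joint pmf stored as a dictionary; it uses the joint pmf of Example 1 below, and the helper names are only illustrative.

```python
# Sketch: conditional pmf and conditional expectation from a joint pmf table.
# The joint pmf is the one from Example 1; the function names are illustrative.
joint = {(1, 1): 0.5, (1, 2): 0.1, (2, 1): 0.1, (2, 2): 0.3}

def conditional_pmf(joint, y):
    """Return p_{X|Y}(. | y) as a dict {x: probability}."""
    p_y = sum(p for (_, yy), p in joint.items() if yy == y)   # marginal p_Y(y)
    return {x: p / p_y for (x, yy), p in joint.items() if yy == y}

def conditional_expectation(joint, y):
    """Return E[X | Y = y] = sum_x x * p_{X|Y}(x | y)."""
    return sum(x * p for x, p in conditional_pmf(joint, y).items())

print(conditional_pmf(joint, 1))          # {1: 0.833..., 2: 0.166...}
print(conditional_expectation(joint, 1))  # 7/6 = 1.1666...
```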

Examples
1. Suppose the joint pmf of X and Y is given by p(1, 1) = 0.5, p(1, 2) = 0.1, p(2, 1) = 0.1, p(2, 2) = 0.3. Find the pmf of X given Y = 1.

Solution:

p_{X|Y=1}(1) = p(1, 1)/p_Y(1) = 0.5/0.6 = 5/6
p_{X|Y=1}(2) = p(2, 1)/p_Y(1) = 0.1/0.6 = 1/6

2. If X and Y are independent Poisson RVs with respective means \lambda_1 and \lambda_2, find the conditional pmf of X given X + Y = n and the conditional expected value of X given X + Y = n.

Solution: Let Z = X + Y. We want to find p_{X|Z=n}(k). For k = 0, 1, 2, ..., n,

p_{X|Z=n}(k) = \frac{P(X = k, Z = n)}{P(Z = n)} = \frac{P(X = k, X + Y = n)}{P(Z = n)} = \frac{P(X = k, Y = n - k)}{P(Z = n)} = \frac{P(X = k)\, P(Y = n - k)}{P(Z = n)}

We know that Z is Poisson with mean \lambda_1 + \lambda_2. Hence

p_{X|Z=n}(k) = \frac{P(X = k)\, P(Y = n - k)}{P(Z = n)}
             = \frac{e^{-\lambda_1} \lambda_1^{k} / k! \;\cdot\; e^{-\lambda_2} \lambda_2^{n-k} / (n - k)!}{e^{-(\lambda_1 + \lambda_2)} (\lambda_1 + \lambda_2)^{n} / n!}
             = \binom{n}{k} \left( \frac{\lambda_1}{\lambda_1 + \lambda_2} \right)^{k} \left( \frac{\lambda_2}{\lambda_1 + \lambda_2} \right)^{n-k}

Hence the conditional distribution of X given X + Y = n is a binomial distribution with parameters n and \lambda_1 / (\lambda_1 + \lambda_2), and

E(X \mid X + Y = n) = \frac{n \lambda_1}{\lambda_1 + \lambda_2}
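As a quick numerical check of Example 2 (an addition to the notes, with arbitrary parameter values \lambda_1 = 2, \lambda_2 = 3, n = 4), the sketch below estimates the conditional pmf of X given X + Y = n by simulation and compares it with the Binomial(n, \lambda_1/(\lambda_1 + \lambda_2)) pmf.

```python
# Simulation check: X ~ Poisson(lam1), Y ~ Poisson(lam2) independent;
# conditionally on X + Y = n, X should be Binomial(n, lam1/(lam1+lam2)).
import numpy as np
from math import comb

rng = np.random.default_rng(0)
lam1, lam2, n = 2.0, 3.0, 4                  # arbitrary example parameters
X = rng.poisson(lam1, size=1_000_000)
Y = rng.poisson(lam2, size=1_000_000)
X_given = X[X + Y == n]                      # samples of X conditioned on X + Y = n

p = lam1 / (lam1 + lam2)
for k in range(n + 1):
    empirical = np.mean(X_given == k)
    binomial = comb(n, k) * p**k * (1 - p)**(n - k)
    print(k, round(float(empirical), 4), round(binomial, 4))
print("E[X | X+Y=n] ~", X_given.mean(), "vs n*p =", n * p)
```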
3. Consider n + m independent trials, each of which results in a success with probability p. Compute the expected number of successes in the first n trials given that there are k successes in all.

Solution: Let Y be the number of successes in the n + m trials and let X be the number of successes in the first n trials. Define

X_i = 1 if the i-th trial is a success, and X_i = 0 otherwise,

so that X = \sum_{i=1}^{n} X_i. Then

E(X \mid Y = k) = E\!\left( \sum_{i=1}^{n} X_i \;\Big|\; Y = k \right) = \sum_{i=1}^{n} E(X_i \mid Y = k)

Since the trials are independent and identically distributed, the conditional distribution of X_i given Y = k is the same for every i. Hence

E(X_i \mid Y = k) = P(X_i = 1 \mid Y = k) = P(X_1 = 1 \mid Y = k)

and

P(X_1 = 1 \mid Y = k) = \frac{P(X_1 = 1, Y = k)}{P(Y = k)} = \frac{\binom{n+m-1}{k-1} p^{k} (1 - p)^{n+m-k}}{\binom{n+m}{k} p^{k} (1 - p)^{n+m-k}} = \frac{k}{n + m}

Hence

E(X \mid Y = k) = \frac{nk}{n + m}
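A similar sanity check for Example 3 (again an addition, with arbitrary values n = 5, m = 7, p = 0.3, k = 4): simulate n + m Bernoulli(p) trials, keep only the runs with exactly k successes in total, and compare the average number of successes in the first n trials with nk/(n + m).

```python
# Simulation check for E[X | Y = k] = n*k/(n+m).
import numpy as np

rng = np.random.default_rng(1)
n, m, p, k = 5, 7, 0.3, 4                    # arbitrary example parameters
trials = rng.random((500_000, n + m)) < p    # each row: n+m Bernoulli(p) trials
totals = trials.sum(axis=1)                  # Y: total number of successes
first_n = trials[:, :n].sum(axis=1)          # X: successes in the first n trials

print("simulated E[X | Y = k]:", first_n[totals == k].mean())
print("formula  n*k/(n + m):  ", n * k / (n + m))
```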

Conditional Density
Conditional probability density function:

f_{X|Y}(x \mid y) = \frac{f(x, y)}{f_Y(y)},

defined for y such that f_Y(y) > 0. For a set R,

P(X \in R \mid Y = y) = \int_R f_{X|Y}(x \mid y)\, dx

Conditional expectation of X given Y = y

E[X \mid Y = y] = \int x \, f_{X|Y}(x \mid y)\, dx

For a function g, the conditional expectation of g(X) given Y = y is

E[g(X) \mid Y = y] = \int g(x)\, f_{X|Y}(x \mid y)\, dx
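To make these formulas concrete, here is a small numerical sketch (my own illustration, not from the notes) using the joint density f(x, y) = x + y on the unit square: it approximates f_Y(y), f_{X|Y}(x | y), and E[X | Y = y] with a trapezoid rule on a grid.

```python
# Numerical illustration with the (assumed) joint density f(x, y) = x + y on [0,1]^2.
import numpy as np

def trapezoid(vals, x):
    """Simple trapezoid-rule integral of sampled values over the grid x."""
    return float(np.sum(0.5 * (vals[1:] + vals[:-1]) * np.diff(x)))

def f(x, y):
    return x + y                       # joint density on the unit square

x = np.linspace(0.0, 1.0, 2001)        # integration grid in x
y0 = 0.4                               # condition on Y = y0 (arbitrary value)

f_Y = trapezoid(f(x, y0), x)           # marginal f_Y(y0); exact value is y0 + 1/2
f_cond = f(x, y0) / f_Y                # conditional density f_{X|Y}(x | y0)
E_cond = trapezoid(x * f_cond, x)      # E[X | Y = y0]

print("f_Y(0.4)       ~", f_Y)         # exact: 0.9
print("E[X | Y = 0.4] ~", E_cond)      # exact: (1/3 + 0.2) / 0.9 = 0.5926...
```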

Computing Expectation by Conditioning


Discrete:

E[X] = \sum_y E[X \mid Y = y]\, p_Y(y) = \sum_y \sum_x x\, p_{X|Y}(x \mid y)\, p_Y(y)

Continuous:

E[X] = \int E[X \mid Y = y]\, f_Y(y)\, dy = \int\!\!\int x\, f_{X|Y}(x \mid y)\, f_Y(y)\, dx\, dy

Chain expansion: E[X] = E_Y\left[ E_{X|Y}(X \mid Y) \right].
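A short check of the discrete identity (using the joint pmf of Example 1 as a convenient test case): computing E[X] directly from the joint pmf and by conditioning on Y gives the same value.

```python
# Check E[X] = sum_y E[X | Y = y] p_Y(y) with the joint pmf of Example 1.
joint = {(1, 1): 0.5, (1, 2): 0.1, (2, 1): 0.1, (2, 2): 0.3}
ys = {y for _, y in joint}

# Direct expectation: E[X] = sum over (x, y) of x * p(x, y)
E_direct = sum(x * p for (x, _), p in joint.items())

# By conditioning: sum over y of E[X | Y = y] * p_Y(y)
E_cond = 0.0
for y in ys:
    p_y = sum(p for (_, yy), p in joint.items() if yy == y)
    E_x_given_y = sum(x * p / p_y for (x, yy), p in joint.items() if yy == y)
    E_cond += E_x_given_y * p_y

print(E_direct, E_cond)   # both equal 1.4
```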

Expectation of the sum of a random number of random variables: if X = \sum_{i=1}^{N} X_i, where N is a random variable independent of the X_i's and the X_i's have common mean \mu, then

E[X] = \mu\, E[N]

Example: Suppose that the expected number of accidents per week at an industrial plant is four. Suppose also that the numbers of workers injured in each accident are independent random variables with a common mean of 2. Assume also that the number of workers injured in each accident is independent of the number of accidents that occur. What is the expected number of injuries during a week? By the formula, it is E[N]\,\mu = 4 \times 2 = 8.

The variance of a random number of random variables: if Z = \sum_{i=1}^{N} X_i with E(X_i) = \mu and Var(X_i) = \sigma^2, then

Var(Z) = \sigma^2\, E[N] + \mu^2\, Var(N)
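The following simulation sketch (my own addition; the Poisson choices are arbitrary, only their means and variances matter) checks both compound-sum formulas for the accident example: N ~ Poisson(4) accidents per week and Poisson(2) injuries per accident, independent of N.

```python
# Simulation check of E[Z] = mu*E[N] and Var(Z) = sigma^2*E[N] + mu^2*Var(N).
# Assumed distributions: N ~ Poisson(4), injuries per accident ~ Poisson(2).
import numpy as np

rng = np.random.default_rng(2)
mu, sigma2 = 2.0, 2.0                  # mean and variance of injuries per accident
EN, varN = 4.0, 4.0                    # mean and variance of N ~ Poisson(4)

weeks = 100_000
N = rng.poisson(EN, size=weeks)
Z = np.array([rng.poisson(mu, size=n).sum() for n in N])   # total injuries per week

print("E[Z]   ~", Z.mean(), "  formula:", mu * EN)                      # 8
print("Var(Z) ~", Z.var(),  "  formula:", sigma2 * EN + mu**2 * varN)   # 24
```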

Computing Probability by Conditioning


Total probability formula: suppose F_1, F_2, ..., F_n are mutually exclusive and \bigcup_{i=1}^{n} F_i = S. Then

P(E) = \sum_{i=1}^{n} P(F_i)\, P(E \mid F_i)

Conditioning on a random variable Y:

p_X(x) = \sum_{y_i} p_{X|Y}(x \mid y_i)\, p_Y(y_i)

f_X(x) = \int f_{X|Y}(x \mid y)\, f_Y(y)\, dy
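As a small worked illustration of the total probability formula (hypothetical numbers, not from the notes): a part comes from one of three suppliers F_1, F_2, F_3 with the probabilities below, and E is the event that the part is defective.

```python
# Total probability sketch with hypothetical numbers.
P_F = [0.5, 0.3, 0.2]                 # P(F_i): mutually exclusive, sum to 1
P_E_given_F = [0.01, 0.02, 0.05]      # P(E | F_i): hypothetical defect rates

P_E = sum(pf * pe for pf, pe in zip(P_F, P_E_given_F))
print(P_E)   # 0.5*0.01 + 0.3*0.02 + 0.2*0.05 = 0.021
```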
