
Consider a group of people. Assume that their birthdays are independent and that there are 365 days in a year. That is, their birthdays can be viewed as independent discrete uniform random variables with parameters 1 and 365. Let A be the event that there are at least two people in the group with the same birthday. Let PN(A) be the probability that the event A occurs given that the number of people in the group is N. Write a recursive formula to derive PN(A). Obtain the values of P2(A), P5(A) and P10(A). Find the smallest values of N such that PN(A) ≥ 0.5 and PN(A) ≥ 0.999, respectively.

Guide and answers

Notice the following recursive relationship for the probability of not having two people with the same birthday among N people:

1 − PN(A) = (1 − (N − 1)/365)(1 − PN−1(A)), for N = 2, 3, 4, . . . , 366,

with P1(A) = 0, and PN(A) = 1 for N ≥ 366 (by the pigeonhole principle, among 366 or more people at least two birthdays must coincide).


P2(A) = 0.00274, P5(A) = 0.027 and P10(A) = 0.117

For N = 23, P23(A) = 0.5073.

For N = 70, P70(A) = 0.99916.

Are you surprised?
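These values are easy to verify numerically. Below is a minimal Python sketch of the recursion (the function name `p_shared` is mine):

```python
def p_shared(n, days=365):
    """P_N(A) via 1 - P_N(A) = (1 - (N-1)/days)(1 - P_{N-1}(A)), P_1(A) = 0."""
    q = 1.0  # probability that all n birthdays are distinct
    for k in range(2, n + 1):
        q *= 1.0 - (k - 1) / days
    return 1.0 - q

print(round(p_shared(2), 5), round(p_shared(5), 3), round(p_shared(10), 3))
# → 0.00274 0.027 0.117

# Smallest N with P_N(A) >= 0.5 and >= 0.999
n_half = next(n for n in range(1, 367) if p_shared(n) >= 0.5)
n_999 = next(n for n in range(1, 367) if p_shared(n) >= 0.999)
print(n_half, n_999)  # → 23 70
```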

1.9 Continuous Random Variables and their Distributions

Continuous random variables are related to cases in which the set of possible outcomes is uncountable. A continuous random variable X is a function that assigns a real number to each outcome of an experiment, and is characterized by the existence of a function f(·), defined for all x ∈ R, which has the property that for any set A ⊂ R,

P(X ∈ A) = ∫_A f(x) dx. (36)

Such a function is the probability density function (or simply the density) of X. Since the continuous random variable X must take a value in R with probability 1, f must satisfy

∫_{−∞}^{+∞} f(x) dx = 1. (37)

If we consider Eq. (36), letting A = [a, b], we obtain

P(a ≤ X ≤ b) = ∫_a^b f(x) dx. (38)

An interesting point to notice is that the probability of a continuous random variable taking a particular value is equal to zero. If we set a = b in Eq. (38), we obtain

P(X = a) = ∫_a^a f(x) dx = 0.

As a result, for a continuous random variable X, the cumulative distribution function FX(x) is

equal to both P(X ≤ x) and to P(X < x). Similarly, the complementary distribution function

is equal to both P(X ≥ x) and to P(X > x).

By Eq. (38), we obtain

FX(x) = P(X ≤ x) = ∫_{−∞}^x f(s) ds. (40)

Hence, the probability density function is the derivative of the distribution function.
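This relationship is easy to check numerically. The sketch below uses an exponential density as an example (the choice of distribution is mine, not from the text): F(x) = 1 − e^{−x}, f(x) = e^{−x}, and a central difference approximates the derivative of F.

```python
import math

# Example: exponential distribution with rate 1 (my choice for illustration)
def cdf(x):
    return 1.0 - math.exp(-x)  # F(x) = P(X <= x)

def pdf(x):
    return math.exp(-x)        # f(x) = F'(x)

x, h = 0.7, 1e-6
# Central-difference approximation of F'(x)
numeric_derivative = (cdf(x + h) - cdf(x - h)) / (2 * h)
print(abs(numeric_derivative - pdf(x)) < 1e-5)  # → True
```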

An important concept which gives rise to a continuous version of the Law of Total Probability is the continuous equivalence of Eq. (15), namely, the joint distribution of continuous random variables. Let X and Y be two continuous random variables. The joint density of X and Y, denoted fX,Y(x, y), is a nonnegative function that satisfies

P({X, Y} ∈ A) = ∫∫_{{X,Y}∈A} fX,Y(x, y) dx dy (41)

for any set A ⊂ R².

The continuous equivalence of the first equality in (23) is:

fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx. (42)
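A quick numerical check of Eq. (42), using a joint density of my own choosing (not from the text): fX,Y(x, y) = x + y on the unit square, for which the marginal is fY(y) = 1/2 + y.

```python
# Hypothetical joint density f_{X,Y}(x, y) = x + y on [0,1]^2 (my example)
def joint(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

def marginal_y(y, n=10000):
    # Midpoint-rule approximation of f_Y(y) = ∫ f_{X,Y}(x, y) dx over [0, 1]
    h = 1.0 / n
    return sum(joint((i + 0.5) * h, y) for i in range(n)) * h

print(round(marginal_y(0.3), 3))  # → 0.8, matching 1/2 + 0.3
```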

Another important concept is the conditional density of one continuous random variable on another. Let X and Y be two continuous random variables with joint density fX,Y(x, y). For any y such that the density of Y takes a positive value at Y = y (i.e., such that fY(y) > 0), the conditional density of X given Y is defined as

fX|Y(x | y) = fX,Y(x, y) / fY(y). (43)

For every given fixed y, it is a legitimate density because

∫_{−∞}^{∞} fX|Y(x | y) dx = (∫_{−∞}^{∞} fX,Y(x, y) dx) / fY(y) = fY(y) / fY(y) = 1. (44)
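The same check can be run numerically for an example joint density (my choice, not from the text): with fX,Y(x, y) = x + y on the unit square, fY(y) = 1/2 + y, and the conditional density integrates to 1 for any fixed y.

```python
# f_{X|Y}(x | y) = (x + y) / (1/2 + y) for the hypothetical joint density
# f_{X,Y}(x, y) = x + y on [0,1]^2 (example of my choosing)
def conditional(x, y):
    return (x + y) / (0.5 + y)

y, n = 0.3, 10000
h = 1.0 / n
# Midpoint-rule approximation of ∫ f_{X|Y}(x | y) dx over [0, 1]
total = sum(conditional((i + 0.5) * h, y) for i in range(n)) * h
print(round(total, 6))  # → 1.0
```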

Notice the equivalence between the conditional probability (1) and the conditional density (43).

By (43),

fX,Y(x, y) = fY(y) fX|Y(x | y), (45)

so

fX(x) = ∫_{−∞}^{∞} fX,Y(x, y) dy = ∫_{−∞}^{∞} fY(y) fX|Y(x | y) dy. (46)

Recall again that fX|Y(x | y) is defined only for y values such that fY(y) > 0.

Let us define the event A as the event {X ∈ A} for A ⊂ R. Thus,

P(A) = P(X ∈ A) = ∫_A fX(x) dx = ∫_A ∫_{−∞}^{∞} fY(y) fX|Y(x | y) dy dx. (47)

Hence,

P(A) = ∫_{−∞}^{∞} fY(y) ∫_{x∈A} fX|Y(x | y) dx dy, (48)

and therefore

P(A) = ∫_{−∞}^{∞} fY(y) P(A | Y = y) dy, (49)

which is the continuous equivalence of the Law of Total Probability (7).
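Eq. (49) can be illustrated numerically. In the sketch below (all distributional choices are mine, not from the text), Y is uniform on [1, 2] so fY(y) = 1, and given Y = y, X is exponential with rate y; for the event A = {X > 1} we have P(A | Y = y) = e^{−y}.

```python
import math

def trapezoid(f, a, b, n=800):
    # Composite trapezoidal rule for the integral of f over [a, b]
    h = (b - a) / n
    return (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))) * h

# Left side: P(X > 1) by integrating f_Y(y) f_{X|Y}(x | y) = y e^{-yx}
# over y in [1, 2] and x in [1, 30] (the tail beyond 30 is negligible)
lhs = trapezoid(lambda y: trapezoid(lambda x: y * math.exp(-y * x), 1.0, 30.0),
                1.0, 2.0)

# Right side of Eq. (49): ∫ f_Y(y) P(A | Y = y) dy = ∫_1^2 e^{-y} dy
rhs = trapezoid(lambda y: math.exp(-y), 1.0, 2.0)

print(round(rhs, 4))          # both sides ≈ e^{-1} - e^{-2} ≈ 0.2325
print(abs(lhs - rhs) < 1e-3)  # → True
```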

Homework 1.13

I always use two trains to go to work. After traveling on my first train and some walking in

the interchange station, I arrive at the platform (in the interchange station) at a time which

is uniformly distributed between 9.01 and 9.04 AM. My second train arrives at the platform

(in interchange station) exactly at times 9.02 and 9.04 AM. Derive the density of my waiting

time at the interchange station. Ignore any queueing effects, and assume that I never incur any

queueing delay.

Guide:

Let D be a random variable representing the delay in minutes. Plot D as a function of the

arrival time.

Then, derive the complementary distribution of D to obtain:


P(D > t) =
  1 − 2t/3,   0 ≤ t ≤ 1,
  (2 − t)/3,  1 < t < 2,
  0,          otherwise.

Finally, obtain the density from this complementary distribution function.
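A Monte Carlo sketch (my own check, not part of the guide) can confirm this complementary distribution: simulate the uniform arrival time and the resulting wait for the 9.02 or 9.04 train.

```python
import random

random.seed(0)
n = 200_000
waits = []
for _ in range(n):
    t = random.uniform(0.0, 3.0)       # arrival, in minutes after 9.01
    # catch the 9.02 train if t < 1, otherwise the 9.04 train
    waits.append(1.0 - t if t < 1.0 else 3.0 - t)

def p_wait_gt(t):
    return sum(w > t for w in waits) / n

print(p_wait_gt(0.5), p_wait_gt(1.5))  # close to 2/3 and 1/6
```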

We will now discuss the concept of convolution as applied to continuous random variables. Consider independent random variables U and V that have densities fU(u) and fV(v), respectively, and their sum, which is another random variable X = U + V. Let us now derive the density fX(x) of X:

fX(x) = ∫_{−∞}^{∞} fU,V(u, x − u) du = ∫_{−∞}^{∞} fU(u) fV(x − u) du, (50)

where the second equality follows from the independence of U and V.

The latter is the convolution of the densities fU (u) and fV (v).
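A numerical sketch of Eq. (50) (the example densities and function names are mine): convolving two Uniform(0, 1) densities yields the triangular density on [0, 2], which peaks at 1 when x = 1.

```python
def f_uniform(t):
    # density of Uniform(0, 1)
    return 1.0 if 0.0 <= t <= 1.0 else 0.0

def conv_density(f_u, f_v, x, n=2000, lo=-1.0, hi=2.0):
    # Riemann-sum approximation of f_X(x) = ∫ f_U(u) f_V(x - u) du
    h = (hi - lo) / n
    return sum(f_u(lo + i * h) * f_v(x - (lo + i * h)) for i in range(n)) * h

print(round(conv_density(f_uniform, f_uniform, 1.0), 2))  # ≈ 1.0
print(round(conv_density(f_uniform, f_uniform, 0.5), 2))  # ≈ 0.5
```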

As in the discrete case, the convolution fY(y) of the k densities fXi(xi), i = 1, 2, 3, . . . , k, of random variables Xi, i = 1, 2, 3, . . . , k, respectively, is given by

fY(y) = ∫···∫_{x2, ..., xk : x2+···+xk ≤ y} fX1(y − Σ_{i=2}^k xi) ∏_{i=2}^k fXi(xi) dx2 ··· dxk. (51)

And again, in the special case where all the random variables Xi, i = 1, 2, 3, . . . , k, are IID, the density fY is the k-fold convolution of fX1.
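For IID variables the k-fold convolution can be built by repeatedly convolving the density with itself on a grid. The sketch below (the example is mine, not from the text) takes three IID Uniform(0, 1) variables; the exact density of their sum at y = 1.5 is 0.75.

```python
h = 0.002
m = int(1 / h) + 1
f1 = [1.0] * m                     # Uniform(0, 1) density on the grid [0, 1]

def convolve(fa, fb):
    # Discrete approximation of (fa * fb)(x) = ∫ fa(u) fb(x - u) du
    out = [0.0] * (len(fa) + len(fb) - 1)
    for i, a in enumerate(fa):
        for j, b in enumerate(fb):
            out[i + j] += a * b * h
    return out

f2 = convolve(f1, f1)              # density of the sum of two uniforms
f3 = convolve(f2, f1)              # density of the sum of three uniforms
print(round(f3[round(1.5 / h)], 2))  # ≈ 0.75
```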
