Assume that their birthdays are independent and that there are 365 days in a year. That is, their birthdays can be viewed as independent discrete uniform random variables with parameters 1 and 365. Let A be the event that there are at least two people in a group with the same birthday. Let P_N(A) be the probability that the event A occurs given that the number of people in the group is N. Write a recursive formula to derive P_N(A). Obtain the values of P_2(A), P_5(A) and P_10(A). Find the smallest value of N such that P_N(A) ≥ 0.5.
Notice the following recursive relationship for the probability of not having two people with the same birthday:

1 − P_N(A) = (1 − P_{N−1}(A)) (1 − (N−1)/365),

with the initial condition P_1(A) = 0.
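The recursion can be checked numerically. The following sketch (the helper name p_match is ours) evaluates P_N(A) and searches for the smallest N satisfying P_N(A) ≥ 0.5:

```python
# Numerical evaluation of the birthday recursion (a sketch; names are ours).
def p_match(n, days=365):
    """P_N(A): probability that at least two of n people share a birthday."""
    q = 1.0  # probability of no shared birthday so far
    for k in range(2, n + 1):
        # recursion: 1 - P_k = (1 - P_{k-1}) * (1 - (k-1)/days)
        q *= 1.0 - (k - 1) / days
    return 1.0 - q

print(round(p_match(2), 4))   # 1/365 ≈ 0.0027
print(round(p_match(5), 4))   # ≈ 0.0271
print(round(p_match(10), 4))  # ≈ 0.1169

n = 2
while p_match(n) < 0.5:
    n += 1
print(n)  # smallest N with P_N(A) >= 0.5; the well-known answer is 23
```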
Continuous random variables are related to cases whereby the set of possible outcomes is uncountable. A continuous random variable is a function that assigns a real number to the outcome of an experiment, and is characterized by the existence of a function f(·) defined for all x ∈ R, which has the property that for any set A ⊂ R,

P(X ∈ A) = ∫_A f(x) dx.    (36)
Such a function is the probability density function (or simply the density) of X. Since the continuous random variable X must take a value in R with probability 1, f must satisfy

∫_{−∞}^{+∞} f(x) dx = 1.    (37)

In particular, for any a ≤ b,

P(a ≤ X ≤ b) = ∫_a^b f(x) dx.    (38)
An interesting point to notice is that the probability of a continuous random variable taking any particular single value is zero, since

P(X = a) = ∫_a^a f(x) dx = 0.
As a result, for a continuous random variable X, the cumulative distribution function F_X(x) is equal to both P(X ≤ x) and to P(X < x). Similarly, the complementary distribution function is equal to both P(X ≥ x) and to P(X > x). By (36), the cumulative distribution function is given by

F_X(x) = ∫_{−∞}^x f(s) ds.    (40)

Hence, the probability density function is the derivative of the distribution function.
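The relations (37), (38) and (40) can be illustrated numerically. The sketch below (our own example, using the exponential density f(x) = e^{−x}, x ≥ 0) checks that the density integrates to 1, that (38) agrees with F_X(b) − F_X(a), and that a numerical derivative of F_X recovers f:

```python
import math

# Exponential density f(x) = exp(-x) for x >= 0, with CDF F(x) = 1 - exp(-x).
def f(x):
    return math.exp(-x)

def F(x):
    return 1.0 - math.exp(-x)

def integrate(g, a, b, n=100000):
    """Midpoint-rule numerical integration of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

total = integrate(f, 0.0, 50.0)                  # Eq. (37): should be ~1
p_ab = integrate(f, 1.0, 2.0)                    # Eq. (38): P(1 <= X <= 2)
deriv = (F(1.0 + 1e-6) - F(1.0 - 1e-6)) / 2e-6   # numerical F'(1), Eq. (40)

print(abs(total - 1.0) < 1e-6)
print(abs(p_ab - (F(2.0) - F(1.0))) < 1e-6)
print(abs(deriv - f(1.0)) < 1e-5)
```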
An important concept which gives rise to a continuous version of the Law of Total Probability is the continuous equivalence of Eq. (15), namely, the joint distribution of continuous random variables. Let X and Y be two continuous random variables. The joint density of X and Y is a function f_{X,Y}(x, y) such that for any set A ⊂ R²,

P({X, Y} ∈ A) = ∫∫_{{X,Y}∈A} f_{X,Y}(x, y) dx dy.

The marginal density of Y is obtained by integrating the joint density over all values of X, namely,

f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx.    (42)
Another important concept is the conditional density of one continuous random variable on another. Let X and Y be two continuous random variables with joint density f_{X,Y}(x, y). For any y, such that the density of Y takes a positive value at Y = y (i.e. such that f_Y(y) > 0), the conditional density of X given Y = y is defined as

f_{X|Y}(x | y) = f_{X,Y}(x, y) / f_Y(y).    (43)

Notice that, by (42), for every such y,

∫_{−∞}^{∞} f_{X|Y}(x | y) dx = (∫_{−∞}^{∞} f_{X,Y}(x, y) dx) / f_Y(y) = f_Y(y) / f_Y(y) = 1.    (44)
Notice the equivalence between the conditional probability (1) and the conditional density (43). By (43),

f_{X,Y}(x, y) = f_Y(y) f_{X|Y}(x | y),

so

f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy = ∫_{−∞}^{∞} f_Y(y) f_{X|Y}(x | y) dy.

Recall again that f_{X,Y}(x, y) is defined only for y values such that f_Y(y) > 0.
Now let A be an event defined by the random variable X, that is, A = {X ∈ A} for some set A ⊂ R. Then

P(A) = ∫_{x∈A} f_X(x) dx = ∫_{x∈A} ∫_{−∞}^{∞} f_Y(y) f_{X|Y}(x | y) dy dx.

Hence, changing the order of integration,

P(A) = ∫_{−∞}^{∞} f_Y(y) ∫_{x∈A} f_{X|Y}(x | y) dx dy,

and therefore, observing that P(A | Y = y) = ∫_{x∈A} f_{X|Y}(x | y) dx,

P(A) = ∫_{−∞}^{∞} f_Y(y) P(A | Y = y) dy.

This is the continuous version of the Law of Total Probability.
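The normalization (44) and the continuous Law of Total Probability can be verified numerically. The sketch below uses the illustrative joint density f_{X,Y}(x, y) = x + y on the unit square (our own choice of example) and the event A = {X ≤ 0.5}:

```python
# Numerical check of the conditional density (43)-(44) and the continuous law
# of total probability, using the illustrative joint density f(x, y) = x + y
# on the unit square (our own example; any valid joint density would do).
def f_joint(x, y):
    return x + y if 0.0 <= x <= 1.0 and 0.0 <= y <= 1.0 else 0.0

N = 500
h = 1.0 / N
xs = [(i + 0.5) * h for i in range(N)]   # midpoint grid on (0, 1)

def f_Y(y):
    # marginal density of Y: integrate the joint density over x, Eq. (42)
    return sum(f_joint(x, y) for x in xs) * h

y0 = 0.3
norm = sum(f_joint(x, y0) for x in xs) * h / f_Y(y0)   # ∫ f_{X|Y}(x|y0) dx

def p_A_given(y):
    # P(X <= 0.5 | Y = y) = integral of f_{X|Y}(x|y) over x <= 0.5
    return sum(f_joint(x, y) for x in xs if x <= 0.5) * h / f_Y(y)

# Law of total probability: P(X <= 0.5) = ∫ f_Y(y) P(X <= 0.5 | Y = y) dy
p_A = sum(f_Y(y) * p_A_given(y) for y in xs) * h

print(abs(norm - 1.0) < 1e-9)    # conditional density integrates to 1
print(abs(p_A - 0.375) < 1e-9)   # direct integration gives exactly 0.375
```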
Homework 1.13
I always use two trains to go to work. After traveling on my first train and some walking in
the interchange station, I arrive at the platform (in the interchange station) at a time which
is uniformly distributed between 9.01 and 9.04 AM. My second train arrives at the platform
(in the interchange station) exactly at times 9.02 and 9.04 AM. Derive the density of my waiting
time at the interchange station. Ignore any queueing effects, and assume that I never incur any
queueing delay.
Guide:
Let D be a random variable representing the delay in minutes. Plot D as a function of the arrival time. The resulting complementary distribution function of D is

P(D > t) = 1 − 2t/3 for 0 ≤ t ≤ 1,
P(D > t) = (2 − t)/3 for 1 < t < 2,
P(D > t) = 0 otherwise,

from which the density follows by differentiation.
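A short simulation can check the piecewise form above; in the sketch below, time is measured in minutes after 9.01 AM, so the trains pass at t = 1 and t = 3:

```python
import random

random.seed(1)

# Monte Carlo check of the waiting-time distribution (a sketch).
# Arrival is uniform on (0, 3) minutes after 9.01 AM; trains pass at t = 1 and t = 3.
def waiting_time():
    a = random.uniform(0.0, 3.0)
    return (1.0 - a) if a < 1.0 else (3.0 - a)

n = 200_000
waits = [waiting_time() for _ in range(n)]

# P(D > 1) should be (2 - 1)/3 = 1/3, and P(D > 0.5) should be 1 - 2*0.5/3 = 2/3.
p_gt_1 = sum(w > 1.0 for w in waits) / n
p_gt_half = sum(w > 0.5 for w in waits) / n
print(abs(p_gt_1 - 1/3) < 0.01)
print(abs(p_gt_half - 2/3) < 0.01)
```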
We will now discuss the concept of convolution as applied to continuous random variables. Consider independent random variables U and V that have densities f_U(u) and f_V(v), respectively, and their sum, which is another random variable X = U + V. Let us now derive the density f_X(x) of X. Loosely speaking,

f_X(x) = P(U + V = x) = ∫_{−∞}^{∞} f(U = u, V = x − u) du,

and hence, by independence,

f_X(x) = ∫_{−∞}^{∞} f_U(u) f_V(x − u) du.
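The convolution formula can be illustrated with U and V both uniform on (0, 1) (our own example), for which the convolution integral yields the triangular density f_X(x) = x on [0, 1] and 2 − x on (1, 2). A simulation sketch:

```python
import random

random.seed(2)

# Monte Carlo illustration of the convolution of two uniform (0, 1) densities,
# whose sum has the triangular density f_X(x) = x on [0,1], 2 - x on (1,2).
n = 200_000
xs = [random.random() + random.random() for _ in range(n)]

# From the triangular density: P(X <= 1) = 0.5 and P(X <= 0.5) = 0.5**2 / 2 = 0.125.
p_le_1 = sum(x <= 1.0 for x in xs) / n
p_le_half = sum(x <= 0.5 for x in xs) / n
print(abs(p_le_1 - 0.5) < 0.01)
print(abs(p_le_half - 0.125) < 0.01)
```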
More generally, the density of the sum Y = X_1 + X_2 + · · · + X_k of k independent random variables with densities f_{X_i}(x_i), i = 1, 2, 3, . . . , k, respectively, is given by

f_Y(y) = ∫∫ ⋯ ∫_{x_2, ..., x_k : x_2 + ··· + x_k ≤ y} f_{X_1}(y − ∑_{i=2}^{k} x_i) ∏_{i=2}^{k} f_{X_i}(x_i) dx_2 ⋯ dx_k.    (51)
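As a numerical illustration of (51), consider k = 3 IID exponential random variables with rate λ = 1 (our own example); their sum is known to have the Erlang (gamma) distribution, which a simulation sketch can confirm:

```python
import math
import random

random.seed(3)

# Monte Carlo illustration of the k-fold sum: the sum of k = 3 IID
# exponential(1) random variables has the Erlang CDF
# F(y) = 1 - exp(-y) * (1 + y + y**2 / 2).
k, lam, n = 3, 1.0, 200_000
sums = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(n)]

def erlang_cdf(y):
    return 1.0 - math.exp(-lam * y) * sum(
        (lam * y) ** j / math.factorial(j) for j in range(k)
    )

y0 = 3.0
p_mc = sum(s <= y0 for s in sums) / n
print(abs(p_mc - erlang_cdf(y0)) < 0.01)
```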
And again, in the special case where all the random variables X_i, i = 1, 2, 3, . . . , k, are IID,