
Expectation of a random variable

One of the key ways to describe a random variable is using its expectation (or expected value). Essentially, the expected value is the
result of taking many realizations of the random variable, and then
averaging the results.
B&T has a nice explanation. Suppose you are playing a game where
you are spinning a wheel; there are four possible outcomes, k =
1, 2, 3, 4, with payoffs of
$100 if k = 1, $50 if k = 2, $25 if k = 3, $0 if k = 4.
For every spin, the probability that the wheel has outcome k is pk .
Now, how much money can you expect to make per spin?
If we spun the wheel N times, the total amount of money you make is 100 n_1 + 50 n_2 + 25 n_3 + 0 n_4, where n_k is the number of times the wheel landed on k. We could then compute the average amount earned on each spin as

    M = \frac{100 n_1 + 50 n_2 + 25 n_3 + 0 n_4}{N}.

As N gets large, we expect that

    \frac{n_1}{N} \approx P(k = 1), \quad \frac{n_2}{N} \approx P(k = 2), \quad \frac{n_3}{N} \approx P(k = 3), \quad \frac{n_4}{N} \approx P(k = 4)

(after all, what do probabilities mean if not this?). This motivates the definition of the expected payout as

    E[X] = 100 p_1 + 50 p_2 + 25 p_3 + 0 p_4.
Definition: The expectation of a random variable X with pmf p_X(x) is

    E[X] = \sum_x x \, p_X(x).
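To make the definition concrete, here is a minimal Python sketch. The pmf below is a hypothetical choice for the wheel game above (the notes do not specify the actual values of p_k), and the simulation illustrates the opening idea that a long-run average of realizations approaches E[X]:

```python
import random

# Hypothetical pmf for the wheel game: payoff -> probability.
# The actual p_k are not given in the notes; these are made up.
pmf = {100: 0.1, 50: 0.2, 25: 0.3, 0: 0.4}

def expectation(pmf):
    """E[X] = sum over x of x * p_X(x)."""
    return sum(x * p for x, p in pmf.items())

print(expectation(pmf))  # ~27.5 for the pmf above

# A long-run average of simulated spins approaches E[X].
random.seed(0)
values, probs = zip(*pmf.items())
spins = random.choices(values, weights=probs, k=100_000)
print(sum(spins) / len(spins))  # close to the expectation
```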

12
ECE 3077 Notes by J. Romberg

Important Point:
The expectation of X is not random; it is a completely deterministic function of the pmf of X.

Example: Suppose that X has the pmf

    p_X(x) = \begin{cases} x/10 & \text{for } x = 1, 2, 3, 4 \\ 0 & \text{otherwise.} \end{cases}

Calculate E[X].
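One way to check your answer is to carry out the sum directly; here is a small Python sketch using exact fractions (the arithmetic is E[X] = (1 + 4 + 9 + 16)/10 = 3):

```python
from fractions import Fraction

# p_X(x) = x/10 for x = 1, 2, 3, 4; exact arithmetic via Fraction
pmf = {x: Fraction(x, 10) for x in range(1, 5)}
assert sum(pmf.values()) == 1  # sanity check: a proper pmf sums to 1

E_X = sum(x * p for x, p in pmf.items())
print(E_X)  # 3
```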


Example: Suppose that X is a Poisson random variable with parameter \lambda, so

    p_X(k) = e^{-\lambda} \frac{\lambda^k}{k!}, \qquad k = 0, 1, 2, \ldots.

Then

    E[X] = \sum_{k=0}^{\infty} k \, e^{-\lambda} \frac{\lambda^k}{k!}
         = e^{-\lambda} \sum_{k=1}^{\infty} k \, \frac{\lambda^k}{k!}
         = e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^k}{(k-1)!}
         = \lambda e^{-\lambda} \sum_{k=1}^{\infty} \frac{\lambda^{k-1}}{(k-1)!}
         = \lambda e^{-\lambda} \sum_{k'=0}^{\infty} \frac{\lambda^{k'}}{k'!}
         = \lambda e^{-\lambda} e^{\lambda}
         = \lambda,

where the second-to-last step substitutes k' = k - 1.
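As a numerical sanity check of this derivation, a truncated version of the sum should come out to approximately \lambda; the value \lambda = 2.5 below is an arbitrary illustrative choice:

```python
import math

lam = 2.5  # arbitrary illustrative value of lambda

# Truncate the infinite sum E[X] = sum_k k * e^{-lam} * lam^k / k!;
# the tail beyond k = 100 is negligible for this lam.
E_X = sum(k * math.exp(-lam) * lam ** k / math.factorial(k) for k in range(100))
print(E_X)  # approximately lam = 2.5
```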


Example (B&T, p. 91): Consider a quiz game where a person is given two questions and must decide which one to answer first:

- Question 1 will be answered correctly with probability 0.8, and the person will then receive a prize of $100.
- Question 2 will be answered correctly with probability 0.5, and carries a prize of $200.
- If the first question is answered correctly, the person is allowed to attempt the second question.
- If the first question is answered incorrectly, the quiz terminates.

Which question should be answered first to maximize the expected winnings?
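One way to check your answer is to compute both orderings directly. In this Python sketch (the helper name expected_winnings is mine, not from the notes), the prize of the question attempted first is won with its own probability, and the second prize is won only if both answers are correct:

```python
def expected_winnings(p_first, prize_first, p_second, prize_second):
    """Expected winnings when attempting the given question first.

    The first prize is collected with probability p_first; only after
    a correct first answer is the second question attempted.
    """
    return p_first * prize_first + p_first * p_second * prize_second

q1_first = expected_winnings(0.8, 100, 0.5, 200)  # attempt Question 1 first
q2_first = expected_winnings(0.5, 200, 0.8, 100)  # attempt Question 2 first
print(q1_first, q2_first)  # ~160 vs ~140: Question 1 first is better
```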


It is possible that the pmf of a random variable is well-defined, but the expectation is unbounded. For example, say X has pmf

    p_X(k) = \frac{6}{\pi^2} \cdot \frac{1}{k^2}, \qquad k = 1, 2, \ldots.

This is a proper pmf, since it is a fact that \sum_{k \geq 1} 1/k^2 = \pi^2/6. But

    E[X] = \frac{6}{\pi^2} \sum_{k=1}^{\infty} k \cdot \frac{1}{k^2} = \frac{6}{\pi^2} \sum_{k=1}^{\infty} \frac{1}{k} = \infty.
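The divergence can also be seen numerically: a quick Python sketch of the truncated sums shows them growing without bound, roughly like (6/\pi^2) \log N, instead of settling toward a limit:

```python
import math

c = 6 / math.pi ** 2  # normalizing constant of the pmf
partial = 0.0
for N in (10, 1_000, 100_000):
    # Truncated expectation: c * sum of 1/k up to N.
    partial = c * sum(1 / k for k in range(1, N + 1))
    print(N, partial)  # keeps growing, roughly like c * log(N)
```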

Expectations of functions of a random variable


It is straightforward to define the expectation of a function g(X) of a random variable X:

    E[g(X)] = \sum_x g(x) \, p_X(x).

Example: Suppose (as above) that X is a discrete random variable with pmf

    p_X(x) = \begin{cases} 1/9, & \text{when } x \text{ is an integer with } -4 \leq x \leq 4 \\ 0, & \text{otherwise.} \end{cases}

1. Let Y = g(X), where g(x) = |x|. Compute E[Y].

2. In this case, does E[g(X)] = g(E[X])? (That is, does E[|X|] = | E[X]|?)
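One way to check your answers with exact fractions (by symmetry E[X] = 0, while E[|X|] = 2(1 + 2 + 3 + 4)/9 = 20/9, which settles part 2 as well):

```python
from fractions import Fraction

# p_X(x) = 1/9 on the integers -4, ..., 4
pmf = {x: Fraction(1, 9) for x in range(-4, 5)}

E_X = sum(x * p for x, p in pmf.items())          # 0, by symmetry
E_absX = sum(abs(x) * p for x, p in pmf.items())  # 2(1+2+3+4)/9

print(E_X)                 # 0
print(E_absX)              # 20/9
print(E_absX == abs(E_X))  # False
```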


It is important to note that in general:

    E[g(X)] \neq g(E[X]).

I cannot stress this enough. Setting these two things equal is one of the most common mistakes that probability students make.

Example (B&T, p. 87): If the weather is good, which happens


with probability 0.6, Alice walks the 2 miles to class at a speed of
V = 5 miles per hour; otherwise, she rides her motorcycle with a
speed of V = 30 miles per hour. What is the expected value of the
time T it takes her to get to class?
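This example is a natural place to see E[g(X)] \neq g(E[X]) in action: the travel time is T = g(V) = 2/V hours, and E[2/V] is not the same as 2/E[V]. A small Python check (the pmf just encodes the two weather cases from the problem):

```python
# Travel time is T = g(V) = 2 / V hours, with V = 5 mph (walk, prob 0.6)
# or V = 30 mph (motorcycle, prob 0.4).
pmf_V = {5: 0.6, 30: 0.4}

E_T = sum((2 / v) * p for v, p in pmf_V.items())  # E[g(V)]
E_V = sum(v * p for v, p in pmf_V.items())        # E[V] = 15 mph

print(E_T)      # about 0.267 hours, i.e. 16 minutes
print(2 / E_V)  # about 0.133 hours: this is g(E[V]), NOT E[g(V)]
```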


Variance

After the mean, the most important quantity associated with a random variable is its variance:

    var(X) = E[(X - E[X])^2].

Notice that since (X - E[X])^2 \geq 0, the variance is always nonnegative:

    var(X) \geq 0.

Related to the variance is the standard deviation:

    \sigma_X = \sqrt{var(X)}.

The variance and the standard deviation are measures of the dispersion of X around its mean. We will use both, but \sigma_X is often easier to interpret, since it has the same units as X. (For example, if X is in feet, then var(X) is in feet^2, while \sigma_X is also in feet.)
Example: Let X be a Bernoulli random variable, with

    p_X(k) = \begin{cases} 1 - p, & k = 0 \\ p, & k = 1. \end{cases}

Then

    E[X] = p, \qquad var(X) = E[(X - p)^2] = p(1 - p).
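A quick numerical check of both formulas, for an arbitrary illustrative value of p:

```python
p = 0.3  # arbitrary illustrative value
pmf = {0: 1 - p, 1: p}

E_X = sum(k * q for k, q in pmf.items())
var_X = sum((k - E_X) ** 2 * q for k, q in pmf.items())

print(E_X)    # 0.3, i.e. p
print(var_X)  # p(1 - p) = 0.21, up to float rounding
```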


Properties of mean and variance


Below, X is a random variable, and a, b \in \mathbb{R} are constants.

1. E[X + b] = E[X] + b.

2. E[aX] = a E[X].

3. We can collect the two results above into one statement:

       E[aX + b] = a E[X] + b.

   So if g(X) has the form g(x) = ax + b, then we actually do have E[g(X)] = g(E[X]); but again, this is not true for general g(x).

4. var(X) = E[X^2] - (E[X])^2.

   It is easy to see why this is true:

       var(X) = E[(X - E[X])^2] = E[X^2] - 2 E[X E[X]] + E[(E[X])^2]
              = E[X^2] - (E[X])^2,

   where we have used the fact that since E[X] is not random at all, E[X E[X]] = (E[X])^2, E[(E[X])^2] = (E[X])^2, etc.

5. var(X + b) = var(X).
   (You can prove that at home.)

6. var(aX) = a^2 var(X).
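The properties above can be verified numerically for any particular pmf. Here is a small Python sketch; the three-point pmf and the constants a, b are made up purely for illustration:

```python
# An arbitrary three-point pmf, made up for illustration.
pmf = {-1: 0.2, 0: 0.3, 2: 0.5}
a, b = 3.0, 7.0

def E(f):
    """E[f(X)] = sum over x of f(x) * p_X(x)."""
    return sum(f(x) * p for x, p in pmf.items())

mean = E(lambda x: x)
var = E(lambda x: (x - mean) ** 2)

# Property 3: E[aX + b] = a E[X] + b
assert abs(E(lambda x: a * x + b) - (a * mean + b)) < 1e-12
# Property 4: var(X) = E[X^2] - (E[X])^2
assert abs(var - (E(lambda x: x * x) - mean ** 2)) < 1e-12
# Property 5: var(X + b) = var(X); shifting does not change the spread
shifted_mean = E(lambda x: x + b)
assert abs(E(lambda x: (x + b - shifted_mean) ** 2) - var) < 1e-12
# Property 6: var(aX) = a^2 var(X)
scaled_mean = E(lambda x: a * x)
assert abs(E(lambda x: (a * x - scaled_mean) ** 2) - a ** 2 * var) < 1e-12
print("all properties check out numerically")
```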

