
Mathematical Expectation of a Random Variable

Idea: mean value of a random variable
Definition: weighted mean of the values of the random variable

E(X) = Σ_{x∈X} x·P(X=x)

Condition for existence:

Σ_{x∈X} |x|·P(X=x) < ∞

Note: infinite sets can yield paradoxes.
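The weighted-mean definition above can be checked directly. A minimal sketch (not from the slides), using a fair six-sided die as the discrete random variable:

```python
def expectation(pmf):
    """E(X) = sum over x of x * P(X = x), for a pmf given as {value: probability}."""
    return sum(x * p for x, p in pmf.items())

fair_die = {x: 1 / 6 for x in range(1, 7)}
print(expectation(fair_die))  # weighted mean of 1..6, i.e. 3.5
```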

Idea: mean value of a random variable
Sum of random variables

Definition: weighted mean of the values of the random variable

E(X) = Σ_{x∈X} x·P(X=x)

Example: sum of points of n dice
http://mathworld.wolfram.com/Dice.html

[Figure from another slide: an urn with 3 White and 7 Black balls and an observation, contrasting p* = P(white observation | composition of the urn) with P(composition of the urn | white observation).]
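The dice example can be checked by brute-force enumeration. A sketch (not from the slides); by linearity of expectation the answer for n dice is n·3.5, which the enumeration confirms for small n:

```python
from itertools import product

def expected_sum(n_dice):
    """Expected sum of points of n fair dice, by enumerating all 6**n outcomes."""
    outcomes = list(product(range(1, 7), repeat=n_dice))
    return sum(sum(o) for o in outcomes) / len(outcomes)

print(expected_sum(2))  # enumeration gives 2 * 3.5 = 7.0
```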

Idea: mean value of a random variable
Mistakes of intuition

Intuition corresponds to the ratio favorable / (favorable + unfavorable) → 1/2.
Convergence holds for the ratio; the absolute difference between favorable and unfavorable counts can keep getting bigger!

Note that it is the probabilistic interpretation of the common-sense concept of mean:

Mean(X) = (1/N)·Σ_{x∈X} x

[Figure: two samples of the height of a population with a real mean of 180 cm, shown as histograms over roughly 120-240 cm.]
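The point about intuition can be illustrated with a small simulation (a sketch, assuming fair coin flips with a fixed seed): the heads ratio converges to 1/2, while the absolute difference between heads and tails typically keeps growing.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def flip_stats(n):
    """Return (heads ratio, |heads - tails|) for n simulated fair flips."""
    heads = sum(random.random() < 0.5 for _ in range(n))
    return heads / n, abs(heads - (n - heads))

for n in (100, 10_000, 1_000_000):
    ratio, diff = flip_stats(n)
    print(n, round(ratio, 4), diff)
```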

Idea: mean value of a random variable
Idea: center of mass / equilibrium

Note that it is the probabilistic interpretation of the common-sense concept of mean.
Note that if all possible samples are equally probable, the two notions are equivalent:

P(X=x) = 1/N  ⇒  E(X) = Σ_{x∈X} x·P(X=x) = (1/N)·Σ_{x∈X} x = Mean(X)

If the probability P(X=x) is interpreted as mass, and the random variable X as distance, the mathematical expectation is the center of mass of the object.

[Figure: the same height histograms, with the expectation marked as the balance point of the distribution.]
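The equivalence under equal probabilities can be verified in a few lines. A minimal sketch with a hypothetical height sample (values chosen so the mean is 180, as in the figure):

```python
heights = [160, 170, 175, 180, 185, 190, 200]  # hypothetical sample, mean 180

n = len(heights)
pmf = {x: 1 / n for x in heights}              # uniform weights P(X=x) = 1/N

expectation = sum(x * p for x, p in pmf.items())
sample_mean = sum(heights) / n

print(expectation, sample_mean)  # the two notions coincide
```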

Example: Roulette

What is the payoff given by the casino?
The roulette selects at random a number between 1 and 36 (plus 0).
Players can bet on 18, 12, 9, 6, 4, 3, 2 or 1 numbers.

A bet over k numbers has a probability of success of k/36 of getting a payment x from the casino, or of losing the stake a. The expected value is

E(X) = Σ_{x∈{Win,Loss}} x·P(X=x) = (k/36)·x + (1 − k/36)·(−a)

A fair game would imply E(X) = 0, i.e.

x = (36/k − 1)·a

What is the payoff given by the casino?
The roulette actually has 37 slots (1 to 36, plus the 0 slot).
In a real game the casino pays the odds that would be fair for 36 slots: 35:1 on a single number (k = 1), 17:1 on two numbers (k = 2), 11:1 on three numbers (k = 3), and so on. With 37 slots,

E(X) = (k/37)·(36/k − 1)·a + (1 − k/37)·(−a) = −a/37

so E(X) < 0 for every bet.

We need to model the variability.
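The claim that every bet loses a/37 on average can be checked numerically. A sketch (not from the slides) with stake a = 1:

```python
def roulette_ev(k, a=1.0):
    """Expected value of a bet over k numbers at 'fair for 36 slots' odds."""
    payoff = (36 / k - 1) * a   # payment the casino offers on a win
    p_win = k / 37              # true win probability with the 0 slot
    return p_win * payoff + (1 - p_win) * (-a)

for k in (1, 2, 3, 4, 6, 9, 12, 18):
    print(k, roulette_ev(k))    # always -a/37, regardless of k
```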

Example: Parking ticket

Which is the expected value of not paying the parking ticket?

Important issues:
Parking ticket: 3 euros
Parking fine: 50 euros
Prob. of getting caught: 0.05

E(X) = Σ_{x∈{Win,Loss}} x·P(X=x) = 3·0.95 − 50·0.05 = 0.35

Subjective value: 3 euros vs. 150 euros

Variability of the savings
Possible runs of fines
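The slide's computation as a minimal sketch: the expected saving per occasion of not paying the ticket.

```python
TICKET, FINE, P_CAUGHT = 3.0, 50.0, 0.05

# Save the ticket price with prob. 0.95, pay the fine with prob. 0.05.
ev = TICKET * (1 - P_CAUGHT) - FINE * P_CAUGHT
print(ev)  # 3*0.95 - 50*0.05, about 0.35 euros saved on average
```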

Expected value of a geometric random variable

Random variable X = {number of trials until a success}

P(X = i) = p^(i−1)·(1 − p)   for i = 1, 2, 3, …

E(X) = Σ_{n=1}^{∞} n·p^(n−1)·(1 − p) = (1 − p)·(1 + 2p + 3p² + 4p³ + …)

How do we sum this series?

Expected value of a geometric random variable

How do we sum this series?

E(X) = Σ_{n=1}^{∞} n·p^(n−1)·(1 − p) = (1 − p)·(1 + 2p + 3p² + 4p³ + …)

p·E(X) = Σ_{n=1}^{∞} n·p^n·(1 − p) = (1 − p)·(p + 2p² + 3p³ + 4p⁴ + …)

Subtracting term by term:

E(X) − p·E(X) = (1 − p)·(1 + p + p² + p³ + …) = (1 − p)·Σ_{k=0}^{∞} p^k = (1 − p)·1/(1 − p) = 1   (geometric series)

Hence

E(X) = 1/(1 − p)
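The result E(X) = 1/(1 − p) can be sanity-checked by summing the series numerically. A sketch (truncating the infinite sum, which is safe since the tail decays geometrically):

```python
def geometric_expectation(p, terms=10_000):
    """Truncated sum of i * P(X=i) with P(X=i) = p**(i-1) * (1-p)."""
    return sum(i * p ** (i - 1) * (1 - p) for i in range(1, terms + 1))

p = 8 / 9
print(geometric_expectation(p), 1 / (1 - p))  # both close to 9
```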

Expected value of a geometric random variable

How do we sum this series? Another way.

E(X) = Σ_{n=1}^{∞} n·p^(n−1)·(1 − p) = (1 − p)·(1 + 2p + 3p² + 4p³ + …)

Start from the geometric series

S(p) = Σ_{k=1}^{∞} p^k = p/(1 − p)

and differentiate:

d/dp S(p) = Σ_{k=1}^{∞} k·p^(k−1) = d/dp [p/(1 − p)] = 1/(1 − p)²

using the quotient rule

d/dx [a(x)/b(x)] = (a′(x)·b(x) − a(x)·b′(x)) / b(x)²

Hence

E(X) = (1 − p)·1/(1 − p)² = 1/(1 − p)

P(X = i) = p^(i−1)·(1 − p)   for i = 1, 2, 3, …

[Figure: the probabilities P(X = i) for i = 1, …, 61 (values up to about 0.1), for a case with E(X) = 1/(1 − p) = 9.]
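The differentiation step can be checked with a finite difference. A sketch verifying that d/dp [p/(1 − p)] = 1/(1 − p)² at a sample point:

```python
def S(p):
    """Geometric series sum: p + p**2 + p**3 + ... = p / (1 - p)."""
    return p / (1 - p)

def numerical_derivative(f, p, h=1e-6):
    """Central finite-difference approximation of f'(p)."""
    return (f(p + h) - f(p - h)) / (2 * h)

p = 0.3
print(numerical_derivative(S, p), 1 / (1 - p) ** 2)  # should agree closely
```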

Relation of expectation and other statistical measures

Skew distribution vs. symmetric distribution

[Figure: two densities with Mode, Median and Mean marked; in the skew distribution the three separate, in the symmetric one they coincide.]

See Full House: The Spread of Excellence from Plato to Darwin by Stephen Jay Gould.

Property of linearity
The expectation of a linear combination of random variables is the linear combination of expectations:

E(X + Y) = E(X) + E(Y)

E(X + Y) = Σ_ω (X(ω) + Y(ω))·P(ω) = Σ_ω X(ω)·P(ω) + Σ_ω Y(ω)·P(ω) = E(X) + E(Y)
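The separation of mode, median and mean under skew can be seen on a small made-up right-skewed sample (a sketch, data chosen for illustration only):

```python
from statistics import mean, median, mode

skewed = [1, 1, 1, 2, 2, 3, 4, 6, 9, 15]  # hypothetical right-skewed data

# For right-skewed data the long tail pulls the mean above the median,
# which in turn sits above the mode.
print(mode(skewed), median(skewed), mean(skewed))
```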

Finding the maximum

Suppose that n children of differing heights are placed in line at random. You select the first child from the line and walk with her/him along the line until you encounter a child who is taller, or until you have reached the end of the line. If you do encounter a taller child, you repeat the process.
What is the expected value of the number of children selected from the line?

We define the variable X as the number of children selected from the line.
We will define the indicator variable:

X_i = 1 if the i-th child is selected from the line, 0 otherwise

Now the number of selected children will be:

X = X_1 + X_2 + … + X_n

Taken from Tijms, Understanding Probability.

Finding the maximum

The probability that the i-th child is the tallest among the first i children is 1/i. Therefore:

E(X_i) = 0·(1 − 1/i) + 1·(1/i) = 1/i   for i = 1, 2, …, n

E(X) = E(X_1 + X_2 + … + X_n) = 1 + 1/2 + 1/3 + … + 1/n ≈ ln(n) + 0.57722 + 1/(2n)

Expected number of distinct birthdays

What is the expected number of distinct birthdays within a randomly formed group of 100 persons?

We define the random variable

X_i = 1 if some birthday falls on day i, 0 otherwise

The number of distinct birthdays is X = X_1 + X_2 + … + X_365

Taken from Tijms, Understanding Probability.
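The harmonic-number answer and its logarithmic approximation can be compared numerically. A sketch (using the slide's truncated value of the Euler-Mascheroni constant):

```python
import math

EULER_GAMMA = 0.57722  # Euler-Mascheroni constant, truncated as on the slide

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n, the expected number of selected children."""
    return sum(1 / i for i in range(1, n + 1))

for n in (10, 100, 1000):
    approx = math.log(n) + EULER_GAMMA + 1 / (2 * n)
    print(n, harmonic(n), approx)
```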

Expected number of distinct birthdays

What is the expected number of distinct birthdays within a randomly formed group of 100 persons?

For each day we have:

P(X_i = 0) = (364/365)^100
P(X_i = 1) = 1 − P(X_i = 0)

The expected number of distinct birthdays is

E(X) = E(X_1 + X_2 + … + X_365) = 365·(1 − (364/365)^100) ≈ 87.6

For an arbitrary number of persons n:

E(X) = 365·(1 − (364/365)^n)

[Figure: expected number of distinct birthdays (0-350) against number of persons (0-350), approaching 365 from below.]

Taken from Tijms, Understanding Probability.
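The closed-form answer above takes one line to evaluate. A minimal sketch:

```python
def expected_distinct_birthdays(n, days=365):
    """E(X) = days * (1 - ((days-1)/days)**n) distinct birthdays among n persons."""
    return days * (1 - ((days - 1) / days) ** n)

print(expected_distinct_birthdays(100))  # about 87.6, as on the slide
```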

Other Properties

If X is a non-negative random variable, then E(X) ≥ 0, since every term of

E(X) = Σ_{x∈X} x·P(X=x)

is non-negative.

What do we mean by X ≥ Y? That X(ω) ≥ Y(ω) for every outcome ω. Then E(X) ≥ E(Y).

Also, if the distribution of X dominates that of Y, i.e. P(X ≥ t) ≥ P(Y ≥ t) for all t, then E(X) ≥ E(Y).

[Figure: two panels over outcomes 1-61, one comparing the values X(·) and Y(·), the other comparing the distributions P(X) and P(Y).]
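The monotonicity property can be checked on a tiny finite sample space. A sketch (outcomes, weights and values are all made up for illustration), with Y(ω) ≤ X(ω) everywhere:

```python
outcomes = ["a", "b", "c"]
P = {"a": 0.2, "b": 0.5, "c": 0.3}      # probabilities summing to 1
X = {"a": 3, "b": 5, "c": 2}
Y = {"a": 1, "b": 5, "c": 0}            # Y(w) <= X(w) for every outcome w

def E(V):
    """Expectation of a random variable V over the finite sample space."""
    return sum(V[w] * P[w] for w in outcomes)

print(E(X), E(Y))  # E(X) >= E(Y), as the property predicts
```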

Caveats of intuition

Does the expectation always exist?

The sample mean always exists:

Mean(X) = (1/N)·Σ_{x∈X} x

The expectation

E(X) = Σ_{x∈X} x·P(X=x)

perhaps gives an infinite value!

For a light-tailed variable such as the Poisson, P(X = x) = λ^x·e^(−λ)/x!, the sum converges. For heavy-tailed variables it may not.

Example: a Cauchy-like random variable,

P(X = x) ∝ 1/(1 + x²)   (−∞ < x < +∞)

for which E(X) is not finite. Power-law distributions P(X = x) ∝ 1/x show the same behaviour:

http://physicsweb.org/articles/world/14/7/9/1#pw1407091
The physics of the Web, July 2001
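The failure can be seen numerically: a sketch showing that truncated sums of x·P(X=x) for the Cauchy-like variable keep growing as more terms are added (the normalization constant is dropped, since it does not affect divergence).

```python
def truncated_sum(N):
    """Partial sum of x * 1/(1 + x**2) for x = 1..N (normalization omitted)."""
    return sum(x / (1 + x * x) for x in range(1, N + 1))

for N in (10**2, 10**4, 10**6):
    print(N, truncated_sum(N))  # grows without bound, roughly like ln(N)
```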

Caveats of intuition

Why is it infinite? A divergent series:

E(X) = Σ_{x=1}^{∞} x·P(X=x) = Σ_{x=1}^{∞} x/(1 + x²)   (normalization constant omitted)

Note that for high values of x, where x² >> 1:

x/(1 + x²) ≈ 1/x

so E(X) behaves like the harmonic series:

Σ_{x=1}^{∞} 1/x = 1 + 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + 1/7 + 1/8 + 1/9 + …

which diverges, since

Σ_{x=1}^{∞} 1/x > 1 + 1/2 + (1/4 + 1/4) + (1/8 + 1/8 + 1/8 + 1/8) + … = 1 + 1/2 + 1/2 + 1/2 + … = ∞

[Figure: partial sums of the harmonic series up to 10000 terms, growing slowly from about 5 to about 10.]

Expected waiting times

Geometric
Pareto
Gaussian

Expected value of a Binomial
