
Math 2274 Lecture 19 - Linear Combinations of Several Random Variables

Recall: Theorem 1: $E(aX + bY) = aE(X) + bE(Y)$

Theorem 2: $\operatorname{Var}(aX + bY) = a^2\operatorname{Var}(X) + b^2\operatorname{Var}(Y) + 2ab\operatorname{Cov}(X,Y)$

Proof: See pg. 114-115, Lecture notes.

Corollary: If X and Y are independent, then $\operatorname{Var}(aX + bY) = a^2\operatorname{Var}(X) + b^2\operatorname{Var}(Y)$.
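A quick numeric illustration of Theorem 2 (the values here are made up for illustration, not from the notes): suppose $\operatorname{Var}(X) = 4$, $\operatorname{Var}(Y) = 9$, $\operatorname{Cov}(X,Y) = 1$, $a = 2$ and $b = 3$. Then

$\operatorname{Var}(2X + 3Y) = 2^2(4) + 3^2(9) + 2(2)(3)(1) = 16 + 81 + 12 = 109$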

From Theorems 1 and 2, it follows that:

Theorem 3: Let $X_1, X_2, \ldots, X_n$ be a collection of $n$ random variables with finite means and variances, $\mu_1, \mu_2, \ldots, \mu_n$ and $\sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2$ respectively. Let

$X = a_1X_1 + a_2X_2 + \cdots + a_nX_n = \sum_{i=1}^{n} a_iX_i$

Then:

$E(X) = E(a_1X_1 + a_2X_2 + \cdots + a_nX_n) = \sum_{i=1}^{n} a_iE(X_i) = \sum_{i=1}^{n} a_i\mu_i$


$\operatorname{Var}(X) = \operatorname{Var}(a_1X_1 + a_2X_2 + \cdots + a_nX_n)$
$= a_1^2\operatorname{Var}(X_1) + a_2^2\operatorname{Var}(X_2) + \cdots + a_n^2\operatorname{Var}(X_n) + 2a_1a_2\operatorname{Cov}(X_1,X_2) + 2a_1a_3\operatorname{Cov}(X_1,X_3) + \cdots + 2a_{n-1}a_n\operatorname{Cov}(X_{n-1},X_n)$
$= \sum_{i=1}^{n} a_i^2\operatorname{Var}(X_i) + 2\sum_{i<j} a_ia_j\operatorname{Cov}(X_i,X_j)$
$= \sum_{i=1}^{n} a_i^2\sigma_i^2 + 2\sum_{i<j} a_ia_j\sigma_{ij}$

where $\sigma_{ij} = \operatorname{Cov}(X_i, X_j)$.

Proof: See pg 117, Lecture notes.
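In matrix form, the variance formula of Theorem 3 is $\operatorname{Var}(X) = a^{\mathsf{T}}\Sigma a$, where $\Sigma$ is the covariance matrix of the $X_i$. The following sketch (the coefficients, means, and covariance matrix are made-up illustrative values, not from the notes) checks the term-by-term formula against the matrix form and against a Monte Carlo estimate:

```python
import numpy as np

# Illustrative (made-up) inputs: coefficients a_i, means mu_i,
# and a covariance matrix Sigma for (X1, X2, X3).
a = np.array([2.0, -1.0, 0.5])
mu = np.array([1.0, 3.0, -2.0])
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 9.0, -2.0],
                  [0.5, -2.0, 16.0]])

# E(X) = sum_i a_i * mu_i (Theorem 3)
mean_X = a @ mu

# Var(X) = sum_i a_i^2 Var(X_i) + 2 sum_{i<j} a_i a_j Cov(X_i, X_j)
var_X = sum(a[i]**2 * Sigma[i, i] for i in range(3)) \
      + 2 * sum(a[i] * a[j] * Sigma[i, j]
                for i in range(3) for j in range(i + 1, 3))

# Equivalent matrix form: Var(X) = a' Sigma a
assert np.isclose(var_X, a @ Sigma @ a)

# Monte Carlo check (multivariate normal draws with this mean and covariance)
rng = np.random.default_rng(0)
draws = rng.multivariate_normal(mu, Sigma, size=200_000) @ a
print(mean_X, var_X)              # exact values from the theorem
print(draws.mean(), draws.var())  # sample estimates should be close
```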

Theorem 4: Let $X_1, X_2, \ldots, X_n$ be a collection of $n$ independent random variables with finite means and variances, $\mu_1, \mu_2, \ldots, \mu_n$ and $\sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2$ respectively. Let

$X = a_1X_1 + a_2X_2 + \cdots + a_nX_n = \sum_{i=1}^{n} a_iX_i$

Then:

$E(X) = E(a_1X_1 + a_2X_2 + \cdots + a_nX_n) = \sum_{i=1}^{n} a_i\mu_i$


$\operatorname{Var}(X) = \operatorname{Var}(a_1X_1 + a_2X_2 + \cdots + a_nX_n)$
$= a_1^2\operatorname{Var}(X_1) + a_2^2\operatorname{Var}(X_2) + \cdots + a_n^2\operatorname{Var}(X_n)$
$= \sum_{i=1}^{n} a_i^2\operatorname{Var}(X_i) = \sum_{i=1}^{n} a_i^2\sigma_i^2$

Proof: Same as for Theorem 3, except that for Var(X) the independence of the $X_i$ gives $\operatorname{Cov}(X_i, X_j) = 0$ for all $i \neq j$, so every covariance term vanishes.
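A standard application of Theorem 4 (added here as an illustration; it is not worked in these notes): the sample mean of $n$ i.i.d. rvs with common mean $\mu$ and variance $\sigma^2$, obtained by taking each $a_i = 1/n$:

$\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i, \qquad E(\bar{X}) = \sum_{i=1}^{n}\frac{1}{n}\mu = \mu, \qquad \operatorname{Var}(\bar{X}) = \sum_{i=1}^{n}\frac{1}{n^2}\sigma^2 = \frac{\sigma^2}{n}$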
1) Linear Combinations of Binomial Random Variables

Let X1, X2, ..., Xn be a sequence of n independent Bernoulli rvs.


Each $X_i$ has the same probability of success $p$, where $0 < p < 1$.
Let $X = \sum_{i=1}^{n} X_i$. In other words, X is the sum of n i.i.d. (independent and identically distributed) Bernoulli(p) rvs.
Then X is the number of successes in n independent Bernoulli trials with the same probability of success p.
Hence X ~ Binomial(n, p).
The converse is also true: if X ~ Binomial(n, p), then the distribution of X is the same as that of the sum of n i.i.d. Bernoulli(p) rvs.

Theorem 5: Let X ~ Binomial (n,p). Then

i) E(X) = np. ii) Var(X) = np(1-p)

Proof: See pg 118, Lecture notes.
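A quick simulation check of Theorem 5 (a sketch; the values of n and p below are arbitrary illustrative choices): build X as a sum of n i.i.d. Bernoulli(p) rvs and compare the sample mean and variance with np and np(1 - p).

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 20, 0.3  # arbitrary illustrative values

# Each row is one realisation of (X1, ..., Xn) with Xi ~ Bernoulli(p);
# summing across a row gives one draw of X = X1 + ... + Xn.
bernoulli = rng.random((100_000, n)) < p
X = bernoulli.sum(axis=1)

print(X.mean(), n * p)             # E(X) = np
print(X.var(), n * p * (1 - p))    # Var(X) = np(1 - p)
```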

2) Linear Combinations of Normal Random Variables

Theorem 6: Let $X_1, X_2, \ldots, X_n$ be a sequence of $n$ independent normal rvs., i.e. $X_i \sim N(\mu_i, \sigma_i^2)$ for $i = 1, \ldots, n$. Let $X = \sum_{i=1}^{n} a_iX_i$. Then $X \sim N(\mu, \sigma^2)$, where

i) $\mu = E(X) = \sum_{i=1}^{n} a_i\mu_i$ ii) $\sigma^2 = \operatorname{Var}(X) = \sum_{i=1}^{n} a_i^2\sigma_i^2$

Proof: Similar to proof of Theorem 4. Try as an exercise!

Example: The rvs. X1, X2 and X3 are independent, where X1 ~ N(40, 9), X2 ~ N(50, 16) and X3 ~ N(60, 9).

Find the following:

i) $P(2X_1 + X_3 > 3X_2)$ ii) $P(X_1 > X_2)$ iii) $P(X_1 + 2X_2 - X_3 > 90)$

General Method:

- Ensure that the rvs. are independent.
- Let T be the linear combination of rvs. in question, e.g. in ii) above $T = X_1 - X_2$.
- Since T is a linear combination of independent rvs. of the given distribution (e.g. the normal distribution), first find E(T) and Var(T) using the relevant theorem shown previously.
- State the distribution of T, e.g. $T \sim N(\mu_T, \sigma_T^2)$.
- Hence, find the relevant probability using the distribution of T, e.g. in ii) above find P(T > 0).

Note: $P(X_1 > X_2) = P(X_1 - X_2 > 0) = P(T > 0)$. A worked sketch of this method is given below.
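A minimal sketch of the method applied to the example above, assuming SciPy is available (the helper p_greater is a name chosen here, not from the notes). For ii), $T = X_1 - X_2$ has mean $40 - 50 = -10$ and variance $9 + 16 = 25$, so $P(T > 0) = 1 - \Phi(10/5) = 1 - \Phi(2) \approx 0.0228$; the other parts follow the same pattern.

```python
import numpy as np
from scipy.stats import norm

# Means and variances of X1, X2, X3 from the example.
mu = np.array([40.0, 50.0, 60.0])
var = np.array([9.0, 16.0, 9.0])

def p_greater(coeffs, threshold=0.0):
    """P(sum_i coeffs[i]*X_i > threshold) for independent normal X_i.

    By Theorem 6, T = sum a_i X_i ~ N(sum a_i mu_i, sum a_i^2 sigma_i^2).
    """
    coeffs = np.asarray(coeffs, dtype=float)
    mean_T = coeffs @ mu
    var_T = coeffs**2 @ var
    return norm.sf(threshold, loc=mean_T, scale=np.sqrt(var_T))

# i)  P(2X1 + X3 > 3X2)  <=>  P(2X1 - 3X2 + X3 > 0)
print(p_greater([2, -3, 1]))         # ~0.2335

# ii) P(X1 > X2)  <=>  P(X1 - X2 > 0)
print(p_greater([1, -1, 0]))         # ~0.0228

# iii) P(X1 + 2X2 - X3 > 90)
print(p_greater([1, 2, -1], 90))     # ~0.1347
```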
