Corollary: If X and Y are independent, then Var(aX + bY) = a²Var(X) + b²Var(Y).
This extends to n random variables as follows:
Theorem 3: Let X1, X2, ..., Xn be a collection of n random variables with finite means and
variances, μ1, μ2, ..., μn and σ1², σ2², ..., σn² respectively. Let X = a1X1 + a2X2 + ... + anXn.
Then:
E(X) = a1μ1 + a2μ2 + ... + anμn
Var(X) = a1²σ1² + a2²σ2² + ... + an²σn² + 2 ΣΣ aiajCov(Xi, Xj), where the double sum runs over all pairs i < j.
Theorem 4: Let X1, X2, ..., Xn be a collection of n independent random variables with finite
means and variances, μ1, μ2, ..., μn and σ1², σ2², ..., σn² respectively. Let X = a1X1 + a2X2 + ... + anXn.
Then:
E(X) = a1μ1 + a2μ2 + ... + anμn
Var(X) = a1²σ1² + a2²σ2² + ... + an²σn²
Proof: Same as before; however, for Var(X), since Xi and Xj are independent for i ≠ j, Cov(Xi, Xj) = 0, so all the covariance terms vanish.
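As a quick numeric sanity check of Theorem 4, the sketch below compares the formula Var(X) = a1²σ1² + ... + an²σn² against a Monte Carlo estimate for independent normal Xi. The coefficients, means, and variances are made-up illustrative values, not from these notes:

```python
import random

# Hypothetical coefficients a_i, means mu_i, and variances sigma_i^2
# (illustrative values only, not taken from the notes).
a   = [2.0, -1.0, 0.5]
mu  = [1.0,  3.0, -2.0]
var = [4.0,  1.0,  9.0]

# Theorem 4: E(X) = sum a_i*mu_i and, for independent X_i,
# Var(X) = sum a_i^2 * sigma_i^2 (covariance terms vanish).
mean_formula = sum(ai * mi for ai, mi in zip(a, mu))   # 2*1 - 1*3 + 0.5*(-2) = -2.0
var_formula = sum(ai**2 * vi for ai, vi in zip(a, var))  # 4*4 + 1*1 + 0.25*9 = 19.25

# Monte Carlo check: simulate X = sum a_i X_i with independent normals.
random.seed(0)
N = 200_000
samples = [sum(ai * random.gauss(mi, vi**0.5) for ai, mi, vi in zip(a, mu, var))
           for _ in range(N)]
mean_mc = sum(samples) / N
var_mc = sum((x - mean_mc)**2 for x in samples) / (N - 1)

print(mean_formula, var_formula)  # exact: -2.0, 19.25
print(mean_mc, var_mc)            # simulated values should land close to these
```

The simulated mean and variance agree with the formulas up to Monte Carlo noise, which is the point of the theorem: only the aiμi and ai²σi² terms contribute when the variables are independent.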
1) Linear Combinations of Normal Random Variables
Theorem 6: Let X1, X2, ..., Xn be a sequence of n independent normal rvs., i.e. Xi ~ N(μi, σi²),
for i = 1, ..., n. Let X = a1X1 + a2X2 + ... + anXn. Then X ~ N(μ, σ²), where
μ = a1μ1 + a2μ2 + ... + anμn and σ² = a1²σ1² + a2²σ2² + ... + an²σn².
Example: The rvs. X1, X2 and X3 are independent, where X1 ~ N(40, 9), X2 ~ N(50, 16) and X3 ~ N(60, 9). Find:
i) P(2X1 + X3 > 3X2) ii) P(X1 > X2) iii) P(X1 + 2X2 - X3 > 90)
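A sketch of the computation using Theorem 6 (this script is illustrative, not part of the original notes; it uses the standard normal CDF built from math.erf). Each event is rewritten as a linear combination being above a threshold, e.g. for (ii), X1 - X2 ~ N(40 - 50, 9 + 16) = N(-10, 25), so P(X1 > X2) = P(Z > 2):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# i) 2X1 + X3 - 3X2 ~ N(2*40 + 60 - 3*50, 4*9 + 9 + 9*16) = N(-10, 189)
p_i = 1.0 - phi((0 - (-10)) / sqrt(189))    # P(Z > 0.727) ≈ 0.2335

# ii) X1 - X2 ~ N(40 - 50, 9 + 16) = N(-10, 25)
p_ii = 1.0 - phi((0 - (-10)) / sqrt(25))    # P(Z > 2) ≈ 0.0228

# iii) X1 + 2X2 - X3 ~ N(40 + 100 - 60, 9 + 4*16 + 9) = N(80, 82)
p_iii = 1.0 - phi((90 - 80) / sqrt(82))     # P(Z > 1.104) ≈ 0.1347

print(p_i, p_ii, p_iii)
```

Note how every coefficient is squared when it multiplies a variance (e.g. the -3 on X2 contributes 9·16), while it enters the mean with its sign intact.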
General Method: