The kth moment of a random variable X is given by E[X^k]. The kth central moment of a random variable X is given by E[(X - E[X])^k]. The moment generating function of X is given by:
M_X(t) = E[e^{tX}]. (9)
Expanding e^{tX} = sum_{k=0}^inf (tX)^k / k! and taking expectations term by term yields:
M_X(t) = sum_{k=0}^inf (t^k / k!) E[X^k]. (11)
We can then find the kth moment of X by taking the kth derivative of the moment generating function and setting t = 0:
E[X^k] = (d^k/dt^k) M_X(t) |_{t=0}. (13)
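As a quick numerical sanity check (a sketch, not part of the original notes), the derivative-at-zero property can be verified for a fair six-sided die by approximating the first two derivatives of its MGF with finite differences:

```python
import math

# pmf of a fair six-sided die: P(X = x) = 1/6 for x = 1..6
pmf = {x: 1 / 6 for x in range(1, 7)}

def mgf(t):
    """M(t) = E[e^{tX}] = sum over the support of e^{tx} f(x)."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

# Approximate M'(0) and M''(0) with central differences.
h = 1e-4
first_deriv = (mgf(h) - mgf(-h)) / (2 * h)             # ~ E[X]
second_deriv = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2  # ~ E[X^2]

exact_mean = sum(x * p for x, p in pmf.items())         # 3.5
exact_second = sum(x**2 * p for x, p in pmf.items())    # 91/6

print(first_deriv, exact_mean)
print(second_deriv, exact_second)
```

The finite-difference estimates agree with the directly computed moments E[X] = 3.5 and E[X^2] = 91/6 to several decimal places.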
For the Laplace transform X*(s) = E[e^{-sX}], the moments can be found using:
E[X^k] = (-1)^k (d^k/ds^k) X*(s) |_{s=0}. (14)
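The Laplace-transform formula can also be sketched numerically. The block below assumes an exponential random variable, whose Laplace transform lam/(lam + s) is a standard result (the rate lam = 2 is an illustrative choice), and recovers its first two moments by finite differences:

```python
lam = 2.0  # illustrative rate parameter

def laplace(s):
    """Laplace transform E[e^{-sX}] of an Exp(lam) random variable."""
    return lam / (lam + s)

h = 1e-4
# E[X]   = (-1)^1 * L'(0), first derivative by central difference
m1 = -(laplace(h) - laplace(-h)) / (2 * h)
# E[X^2] = (-1)^2 * L''(0), second derivative by central difference
m2 = (laplace(h) - 2 * laplace(0) + laplace(-h)) / h**2

print(m1)  # close to 1/lam = 0.5
print(m2)  # close to 2/lam**2 = 0.5
```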
Example:
Let X be exponentially distributed with rate λ, so that f(x) = λ e^{-λx} for x >= 0. Its Laplace transform is:
X*(s) = integral_0^inf e^{-sx} λ e^{-λx} dx = λ integral_0^inf e^{-(λ+s)x} dx = λ/(λ + s). (15)
Applying (14):
E[X] = -(d/ds) X*(s) |_{s=0} = λ/(λ + s)^2 |_{s=0} = 1/λ, (19)
E[X^2] = (d^2/ds^2) X*(s) |_{s=0} = 2λ/(λ + s)^3 |_{s=0} = 2/λ^2, (20)
and in general:
E[X^k] = k!/λ^k. (23)
Hence:
Var(X) = E[X^2] - (E[X])^2 = 2/λ^2 - 1/λ^2 = 1/λ^2. (25)
A property of transforms, known as the convolution theorem, is stated as follows: Let X_1, X_2, ..., X_n be mutually independent random variables, and let Y = X_1 + X_2 + ... + X_n. If M_{X_i}(t) exists for all i, then M_Y(t) exists, and:
M_Y(t) = product_{i=1}^{n} M_{X_i}(t). (26)
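The convolution theorem can be sketched by Monte Carlo (the rates lam1, lam2 and the value of t below are illustrative choices, not from the original text): for independent exponentials, a sample estimate of E[e^{tY}] should agree with the product of the individual MGFs lam/(lam - t):

```python
import math
import random

random.seed(42)

lam1, lam2 = 1.0, 3.0  # illustrative rate parameters
t = 0.3                # any t below min(lam1, lam2) works
n = 200_000

# Monte Carlo estimate of M_Y(t) = E[e^{t(X1 + X2)}] for independent exponentials.
acc = 0.0
for _ in range(n):
    x1 = random.expovariate(lam1)
    x2 = random.expovariate(lam2)
    acc += math.exp(t * (x1 + x2))
mc_mgf = acc / n

# Product of the individual MGFs: lam/(lam - t) for an Exp(lam) variable.
product_mgf = (lam1 / (lam1 - t)) * (lam2 / (lam2 - t))

print(mc_mgf, product_mgf)  # the two agree up to sampling noise
```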
Example: Let X1 and X2 be independent exponentially distributed random variables with parameters λ1 and λ2 (λ1 ≠ λ2), respectively. Let Y = X1 + X2. Find the distribution of Y.
Writing λ1 and λ2 for the two rates, the convolution theorem gives:
M_Y(t) = M_{X1}(t) · M_{X2}(t) = (λ1/(λ1 - t)) · (λ2/(λ2 - t)). (28)
A partial-fraction expansion turns this into a sum of recognizable exponential transforms:
M_Y(t) = A · (λ1/(λ1 - t)) + B · (λ2/(λ2 - t)), where: A = λ2/(λ2 - λ1), B = λ1/(λ1 - λ2). (31)
Inverting term by term gives the density of Y:
f_Y(y) = (λ1 λ2/(λ2 - λ1)) (e^{-λ1 y} - e^{-λ2 y}), y >= 0. (32)
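As a sanity check (a sketch; the rates are illustrative choices), the standard density of a sum of independent exponentials with distinct rates, (lam1 lam2/(lam2 - lam1))(e^{-lam1 y} - e^{-lam2 y}), should integrate to 1 and have mean 1/lam1 + 1/lam2:

```python
import math

lam1, lam2 = 1.0, 3.0  # illustrative, distinct rates

def f_Y(y):
    """Density of the sum of independent Exp(lam1) and Exp(lam2), lam1 != lam2."""
    c = lam1 * lam2 / (lam2 - lam1)
    return c * (math.exp(-lam1 * y) - math.exp(-lam2 * y))

# Midpoint-rule integration on [0, 40]; the tail beyond 40 is negligible here.
dy = 1e-3
ys = [(i + 0.5) * dy for i in range(int(40 / dy))]
total = sum(f_Y(y) for y in ys) * dy
mean = sum(y * f_Y(y) for y in ys) * dy

print(total)  # close to 1.0
print(mean)   # close to 1/lam1 + 1/lam2 = 1.3333...
```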
The expected values E(X), E(X^2), E(X^3), ..., and E(X^r) are called moments. As you have already experienced in some cases, the mean: μ = E(X), and the variance: σ^2 = Var(X) = E(X^2) - μ^2, which are functions of moments, are sometimes difficult to find. Special functions, called moment-generating functions, can sometimes make finding the mean and variance of a random variable simpler. In this lesson, we'll first learn what a moment-generating function is, and then we'll learn how to use moment generating functions (abbreviated "m.g.f."):
- to find moments and functions of moments, such as μ and σ^2
- to identify which probability mass function a random variable X follows
Objectives
- To learn the definition of a moment-generating function.
- To find the moment-generating function of a binomial random variable.
- To learn how to use a moment-generating function to find the mean and variance of a random variable.
- To learn how to use a moment-generating function to identify which probability mass function a random variable X follows.
- To understand the steps involved in each of the proofs in the lesson.
- To be able to apply the methods learned in the lesson to new problems.
What is an MGF?
Definition. Let X be a discrete random variable with probability mass function f(x) and support S. Then:
M(t) = E[e^{tX}] = sum_{x in S} e^{tx} f(x)
is the moment generating function of X as long as the summation is finite for some interval of t around 0. That is, M(t) is the moment generating function ("m.g.f.") of X if there is a positive number h such that the above summation exists and is finite for -h < t < h.
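The defining summation can be coded directly; this is a small sketch with a made-up pmf, not an example from the lesson:

```python
import math

def mgf(pmf, t):
    """Moment generating function of a discrete r.v.: M(t) = sum over S of e^{tx} f(x)."""
    return sum(math.exp(t * x) * p for x, p in pmf.items())

# Hypothetical example pmf with support S = {0, 1, 2}.
f = {0: 0.2, 1: 0.5, 2: 0.3}

m_at_zero = mgf(f, 0.0)  # M(0) = sum of f(x) = 1 for any valid pmf
m_small_t = mgf(f, 0.1)  # finite for all t, since the support is finite

print(m_at_zero)  # 1.0
```

Note that M(0) = 1 for every valid pmf, which is a quick self-check on any MGF computation.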
Example
What is the moment generating function of a binomial random variable X? Using the binomial pmf f(x) = (n choose x) p^x (1 - p)^{n-x} and the binomial theorem:
M(t) = sum_{x=0}^{n} e^{tx} (n choose x) p^x (1 - p)^{n-x} = sum_{x=0}^{n} (n choose x) (p e^t)^x (1 - p)^{n-x} = (1 - p + p e^t)^n,
which is finite for all real t. Once we find the moment generating function of a random variable, we can use it to... tada!... generate moments!
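The closed form (1 - p + p e^t)^n can be checked against the defining sum for illustrative parameter values (a sketch, not part of the lesson):

```python
import math

n, p, t = 10, 0.3, 0.7  # illustrative parameter choices

# Direct evaluation of the defining sum: M(t) = sum_x e^{tx} C(n,x) p^x (1-p)^(n-x)
direct = sum(
    math.exp(t * x) * math.comb(n, x) * p**x * (1 - p) ** (n - x)
    for x in range(n + 1)
)

# Closed form from the binomial theorem: (1 - p + p e^t)^n
closed = (1 - p + p * math.exp(t)) ** n

print(direct, closed)  # the two values agree
```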
Example. Let X be an exponential random variable with parameter λ, so that its density is f(x) = λ e^{-λx} for x >= 0. Then:
E[e^{tX}] = integral_0^inf e^{tx} λ e^{-λx} dx = λ/(λ - t) for t < λ.
Note that the above integral is finite for any t < λ, so, choosing 0 < h < λ, X possesses a moment generating function:
M_X(t) = λ/(λ - t).
Proposition. If X possesses a moment generating function M_X(t), then, for any n, the n-th moment of X is given by:
E[X^n] = M_X^{(n)}(0),
where M_X^{(n)}(0) is the n-th derivative of M_X(t) with respect to t, evaluated at t = 0. Proving the above proposition is quite complicated, because a lot of analytical details must be taken care of (see e.g. Pfeiffer, P. E. (1978) Concepts of Probability Theory, Courier Dover Publications). The intuition, however, is straightforward: since the expected value is a linear operator and differentiation is a linear operation, under appropriate conditions one can differentiate through the expected value, as follows:
(d^n/dt^n) M_X(t) = (d^n/dt^n) E[e^{tX}] = E[(d^n/dt^n) e^{tX}] = E[X^n e^{tX}],
which, evaluated at the point t = 0, yields:
M_X^{(n)}(0) = E[X^n e^{0}] = E[X^n].
Example. Continuing the example above, the moment generating function of an exponential random variable is M_X(t) = λ/(λ - t). The expected value of X can be computed by taking the first derivative of the moment generating function:
M_X'(t) = λ/(λ - t)^2,
and evaluating it at t = 0:
E[X] = M_X'(0) = λ/λ^2 = 1/λ.
The second moment of X can be computed by taking the second derivative of the moment generating function:
M_X''(t) = 2λ/(λ - t)^3,
and evaluating it at t = 0:
E[X^2] = M_X''(0) = 2λ/λ^3 = 2/λ^2.
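These two moments of an exponential random variable can be sanity-checked by simulation (a sketch with an illustrative rate, not part of the original text):

```python
import random

random.seed(7)
lam = 2.0   # illustrative rate parameter
n = 200_000

# Sample moments of Exp(lam) should approach 1/lam and 2/lam^2.
samples = [random.expovariate(lam) for _ in range(n)]
m1 = sum(samples) / n
m2 = sum(x * x for x in samples) / n

print(m1)  # close to 1/lam = 0.5
print(m2)  # close to 2/lam**2 = 0.5
```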