Moment Generating Function

The moment generating function of the random variable X, provided it exists, is M_X(t) = E[e^{tX}], where E[g(X)] denotes the expectation of the function g(X). For example, if the random variable X follows the normal distribution with mean μ and variance σ², the moment generating function of X is M_X(t) = e^{μt + σ²t²/2}.
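
This closed form follows from a standard completing-the-square argument; a brief sketch, assuming X has the normal density with mean μ and variance σ²:

```latex
% Sketch: the normal moment generating function via completing the square.
\[
M_X(t) = \int_{-\infty}^{\infty} e^{tx}\,
         \frac{1}{\sigma\sqrt{2\pi}}\,
         e^{-(x-\mu)^2/(2\sigma^2)}\,dx
       = e^{\mu t + \sigma^2 t^2/2}
         \int_{-\infty}^{\infty}
         \frac{1}{\sigma\sqrt{2\pi}}\,
         e^{-\bigl(x-\mu-\sigma^2 t\bigr)^2/(2\sigma^2)}\,dx
       = e^{\mu t + \sigma^2 t^2/2}.
\]
```

The second equality holds because the remaining integrand is the density of a normal random variable with mean μ + σ²t and variance σ², and therefore integrates to one.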

The moment generating function has two main uses. First, as the name implies, it can be used to obtain the moments of a random variable. Specifically, the kth moment of the random variable X, α_k = E[X^k], is given by α_k = M_X^{(k)}(0), where M_X^{(k)}(0) is the kth derivative of M_X(t) evaluated at t = 0. For example, if X is normally distributed with mean μ and variance σ², and hence has moment generating function M_X(t) = e^{μt + σ²t²/2}, it follows that

M_X^{(1)}(t) = (μ + σ²t) e^{μt + σ²t²/2}

and

M_X^{(2)}(t) = [(μ + σ²t)² + σ²] e^{μt + σ²t²/2}.

Evaluating these derivatives at t = 0 shows that the first moment of X is μ and the second moment of X is μ² + σ².
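
A minimal symbolic check of these two derivatives, assuming the sympy library is available; the variable names are illustrative only.

```python
# Sketch: verify that the first two derivatives of the normal MGF,
# evaluated at t = 0, reproduce E[X] = mu and E[X^2] = mu^2 + sigma^2.
import sympy as sp

t, mu = sp.symbols("t mu", real=True)
sigma = sp.symbols("sigma", positive=True)

# Moment generating function of a normal random variable with mean mu, variance sigma^2.
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

first_moment = sp.diff(M, t, 1).subs(t, 0)
second_moment = sp.diff(M, t, 2).subs(t, 0)

print(sp.simplify(first_moment))    # mu
print(sp.simplify(second_moment))   # mu**2 + sigma**2
```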

The second, and perhaps more important, use of the moment generating function derives from the fact that the moment generating function uniquely identifies the distribution function of a random variable. Thus, if M_{X₁}(t) = M_{X₂}(t), then Pr(X₁ ≤ x) = Pr(X₂ ≤ x). For example, if the random variable X has the moment generating function M_X(t) = e^{μt + σ²t²/2}, then X necessarily follows the normal distribution. This property of the moment generating function can sometimes be used to determine the distribution of the limit of a sequence of random variables. Consider, for example, a sequence of random variables {Y_n; n = 1, 2, …} with distribution functions {F_n(y); n = 1, 2, …} and corresponding moment generating functions {M_n(t); n = 1, 2, …}. If lim_{n→∞} M_n(t) = M(t), where M(t) is the moment generating function of a random variable Y with distribution function F(y) = Pr(Y ≤ y), then lim_{n→∞} F_n(y) = F(y). F(y) is called the limiting distribution of the sequence {Y_n; n = 1, 2, …}, and Y_n is said to converge in distribution to Y. For n sufficiently large, F(y) provides a good approximation to the distribution of Y_n. For example, consider the sequence of sample means {X̄_n; n = 1, 2, …} obtained from random samples of size n from a population with mean μ and variance σ². Under certain conditions, the standardized sequence Z_n = √n(X̄_n − μ)/σ converges in distribution to a standard normal random variable. This result, referred to as a central limit theorem for the sample mean, is typically obtained by showing that the sequence of corresponding moment generating functions {M_{Z_n}(t)} converges to M(t) = e^{t²/2}, the moment generating function of a normal random variable with mean zero and variance one, that is, a standard normal random variable.
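
As a numerical illustration of this moment-generating-function argument, the following sketch assumes an exponential population with mean one and variance one, for which M_{Z_n}(t) has the closed form e^{−t√n}(1 − t/√n)^{−n} for t < √n, and compares it with e^{t²/2} as n grows. The function names are illustrative only.

```python
# Sketch: the MGF of the standardized sample mean of n iid Exponential(1)
# draws approaches exp(t**2 / 2), the standard normal MGF, as n increases.
import math

def mgf_standardized_exponential_mean(t: float, n: int) -> float:
    """Exact MGF of Z_n = sqrt(n) * (Xbar_n - 1) for Exponential(1) samples."""
    root_n = math.sqrt(n)
    if t >= root_n:
        raise ValueError("MGF is defined only for t < sqrt(n)")
    return math.exp(-t * root_n) * (1.0 - t / root_n) ** (-n)

def standard_normal_mgf(t: float) -> float:
    return math.exp(t * t / 2.0)

if __name__ == "__main__":
    t = 0.5
    for n in (5, 50, 500, 5000):
        print(n, mgf_standardized_exponential_mean(t, n), standard_normal_mgf(t))
    # The exact values approach exp(0.125) ≈ 1.1331 as n increases.
```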

Closely related to the moment generating function is the so-called characteristic function. The characteristic function of the random variable X is C_X(t) = E[e^{itX}], where i = √(−1) and e^{itX} = cos(tX) + i sin(tX). The advantage of the characteristic function is that it always exists, whereas the moment generating function may not. If the moment generating function exists, the characteristic function is related to it by C_X(t) = M_X(it). Thus, for example, the characteristic function of a normally distributed random variable with mean μ and variance σ² is C_X(t) = e^{iμt − σ²t²/2}. Like the moment generating function, the characteristic function can be used to obtain moments of the random variable. And since it uniquely identifies the distribution of the random variable, it can be used to obtain the limiting distribution of sequences of random variables.
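
To illustrate the definition numerically, the following sketch (assuming NumPy is available) compares the sample average of e^{itX} for simulated normal data with the closed-form characteristic function given above; the seed and parameter values are arbitrary.

```python
# Sketch: empirical characteristic function of simulated normal data
# versus the closed form exp(i*mu*t - sigma**2 * t**2 / 2).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.0, 2.0
x = rng.normal(mu, sigma, size=200_000)

t = 0.7
empirical = np.mean(np.exp(1j * t * x))                  # sample average of e^{itX}
theoretical = np.exp(1j * mu * t - sigma**2 * t**2 / 2)  # closed-form C_X(t)

print(empirical)     # close to the theoretical value
print(theoretical)
```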

Elementary mathematical treatments of the moment generating function are given in Donald A. Berry and Bernard W. Lindgren (1996) and in John E. Freund (1992). Intermediate treatments may be found in Robert V. Hogg and Allen T. Craig (1995) and in Lindgren (1976), and advanced mathematical discussions can be found in Harald Cramér (1946) and M. Loève (1977). H. A. David (1995) attributes the first occurrences in print of the term moment generating function to Henri Poincaré (1912, in French) and Cecil C. Craig (1936, in English).

BIBLIOGRAPHY

Berry, Donald A., and Bernard W. Lindgren. 1996. Statistics: Theory and Methods. 2nd ed. Belmont, CA: Duxbury Press at Wadsworth Publishing.

Cramér, Harald. 1946. Mathematical Methods of Statistics. Princeton, NJ: Princeton University Press.

David, H. A. 1995. First (?) Occurrence of Common Terms in Mathematical Statistics. American Statistician 49 (2): 121–133.

Freund, John E. 1992. Mathematical Statistics. 5th ed. Englewood Cliffs, NJ: Prentice Hall.

Hogg, Robert V., and Allen T. Craig. 1995. Introduction to Mathematical Statistics. 5th ed. Englewood Cliffs, NJ: Prentice Hall.

Lindgren, Bernard W. 1976. Statistical Theory. 3rd ed. New York: Macmillan.

Loève, M. 1977. Probability Theory I. 4th ed. New York: Springer-Verlag.

E. Philip Howrey