# Moment-generating function

In probability theory and statistics, the moment-generating function of a random variable X is

$M_X(t) = \operatorname{E}\left(e^{tX}\right), \quad t \in \mathbb{R},$

wherever this expectation exists. The moment-generating function generates the moments of the probability distribution.
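As a numerical illustration (a sketch, not part of the original text), the expectation $E(e^{tX})$ can be estimated by Monte Carlo and compared against a known closed form; for a standard normal variable the moment-generating function is $e^{t^2/2}$:

```python
import numpy as np

# Monte Carlo sketch: estimate M_X(t) = E[e^{tX}] for a standard normal X
# and compare with the known closed form exp(t^2 / 2).
rng = np.random.default_rng(0)
samples = rng.standard_normal(1_000_000)

t = 0.5
mgf_estimate = np.mean(np.exp(t * samples))
mgf_exact = np.exp(t**2 / 2)

print(mgf_estimate, mgf_exact)  # the estimate is close to exp(0.125)
```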

## Calculation

If X has a continuous probability density function f(x), then the moment-generating function is given by

$M_X(t) = \int_{-\infty}^\infty e^{tx} f(x)\,\mathrm{d}x$
$= \int_{-\infty}^\infty \left(1 + tx + \frac{t^2x^2}{2!} + \cdots\right) f(x)\,\mathrm{d}x$
$= 1 + tm_1 + \frac{t^2 m_2}{2!} + \cdots,$

where $m_i$ is the $i$th moment. $M_X(-t)$ is simply the two-sided Laplace transform of f(x).
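A small sketch (not from the original text) of how this series behaves in a concrete case: an exponential(1) variable has $i$th moment $m_i = i!$, so the expansion collapses to the geometric series $\sum_n t^n$, whose closed form is $1/(1-t)$ for $|t| < 1$:

```python
# Series sketch: for an exponential(1) variable, m_i = i!, so
# 1 + t*m_1 + t^2*m_2/2! + ... = sum of t**n, a geometric series
# equal to 1/(1 - t) for |t| < 1.
t = 0.2
partial = sum(t**n for n in range(50))
closed_form = 1 / (1 - t)
print(partial, closed_form)  # both essentially 1.25
```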

Regardless of whether the probability distribution is continuous or not, the moment-generating function is given by the Riemann–Stieltjes integral

$M_X(t) = \int_{-\infty}^\infty e^{tx}\,\mathrm{d}F(x)$

where F is the cumulative distribution function.
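For a discrete distribution the Stieltjes integral reduces to a sum over the atoms of F. A minimal sketch (not from the original text) for a Bernoulli(p) variable, whose MGF is $(1-p) + p e^t$:

```python
import math

# Discrete sketch: for a Bernoulli(p) variable the Riemann-Stieltjes
# integral reduces to a sum over the atoms x = 0 and x = 1, giving
# M_X(t) = (1 - p) * e^{0} + p * e^{t} = (1 - p) + p * e^t.
p, t = 0.3, 1.0
atoms = {0: 1 - p, 1: p}
mgf = sum(prob * math.exp(t * x) for x, prob in atoms.items())
print(mgf)
```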

If $X_1, X_2, \ldots, X_n$ is a sequence of independent (and not necessarily identically distributed) random variables, and

$S_n = \sum_{i=1}^n a_i X_i,$

where the $a_i$ are constants, then the probability density function for $S_n$ is the convolution of the probability density functions of each of the $X_i$, and the moment-generating function for $S_n$ is given by


$M_{S_n}(t) = M_{X_1}(a_1 t)\, M_{X_2}(a_2 t) \cdots M_{X_n}(a_n t).$
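A quick Monte Carlo check of this product rule (a sketch, not part of the original text), using two independent exponential(1) variables, each with MGF $1/(1-t)$ for $t < 1$:

```python
import numpy as np

# Product-rule sketch: for independent exponential(1) variables X1, X2
# with M_{X_i}(t) = 1/(1 - t), the MGF of S = a1*X1 + a2*X2 should
# factor as M_{X1}(a1*t) * M_{X2}(a2*t).
rng = np.random.default_rng(1)
x1 = rng.exponential(1.0, 1_000_000)
x2 = rng.exponential(1.0, 1_000_000)

a1, a2, t = 0.5, 0.3, 0.5
s = a1 * x1 + a2 * x2
mgf_estimate = np.mean(np.exp(t * s))
mgf_product = 1 / (1 - a1 * t) * 1 / (1 - a2 * t)
print(mgf_estimate, mgf_product)
```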

For vector-valued random variables $\mathbf{X}$ with real components, the moment-generating function is given by

$M_X(\mathbf{t}) = \operatorname{E}\left(e^{\langle \mathbf{t}, \mathbf{X} \rangle}\right)$

where $\mathbf{t}$ is a vector and $\langle \mathbf{t}, \mathbf{X} \rangle$ is the dot product.
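As a sketch of the vector case (not from the original text): for a vector of independent standard normal components, the MGF is $e^{\|\mathbf{t}\|^2/2}$, which can be checked by Monte Carlo:

```python
import numpy as np

# Vector sketch: for X with independent standard normal components,
# M_X(t) = E[exp(<t, X>)] = exp(||t||^2 / 2).
rng = np.random.default_rng(2)
X = rng.standard_normal((1_000_000, 2))

t = np.array([0.3, 0.4])
mgf_estimate = np.mean(np.exp(X @ t))   # sample mean of exp of dot products
mgf_exact = np.exp(t @ t / 2)
print(mgf_estimate, mgf_exact)
```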

## Significance

Provided the moment-generating function exists in an open interval around $t = 0$, the $n$th moment is given by

$\operatorname{E}\left(X^n\right) = M_X^{(n)}(0) = \left.\frac{\mathrm{d}^n M_X(t)}{\mathrm{d}t^n}\right|_{t=0}.$
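A deterministic sketch of this (not from the original text): the exponential(1) distribution has $M(t) = 1/(1-t)$ for $t < 1$, so its derivatives at 0 should recover the moments $n!$. Central finite differences approximate those derivatives:

```python
# Differentiation sketch: the MGF of an exponential(1) variable is
# M(t) = 1/(1 - t) for t < 1, so M^(n)(0) = n! recovers the moments.
def M(t):
    return 1.0 / (1.0 - t)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)           # approximates E[X]   = 1
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2   # approximates E[X^2] = 2
print(m1, m2)
```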

If the moment-generating function is finite in such an interval, then it uniquely determines a probability distribution.

Related to the moment-generating function are a number of other transforms that are common in probability theory, including the characteristic function and the probability-generating function.

The cumulant-generating function is the logarithm of the moment-generating function.
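A brief sketch of the cumulant-generating function (not from the original text): for an exponential(1) variable, $K(t) = \log M(t) = -\log(1-t)$, and its first two derivatives at 0 give the mean and variance, both equal to 1:

```python
import math

# Cumulant sketch: K(t) = log M(t). For an exponential(1) variable,
# K(t) = -log(1 - t), so K'(0) is the mean and K''(0) is the variance.
def K(t):
    return -math.log(1.0 - t)

h = 1e-4
mean = (K(h) - K(-h)) / (2 * h)          # approximates the mean, 1
var = (K(h) - 2 * K(0) + K(-h)) / h**2   # approximates the variance, 1
print(mean, var)
```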