Definitions

# moment

[moh-muhnt]
moment, in physics and engineering, a term designating the product of a quantity and a distance (or some power of the distance) to some point associated with that quantity. The most theoretically useful moments are moments of masses, areas, lines, and forces, including magnetic force. The concept of torque (the propensity to turn about a point) is the moment of force. If a force tends to rotate a body about some point, then the moment, or turning effect, is the product of the force and the perpendicular distance from the point to the line of action of the force. The application of this concept is illustrated by pushing open a door: the farther from the hinge the push is applied, the less force is required. The principle of the moment of a force is perhaps best seen in the use of a lever. Extensions of this concept are important in mechanics, in topics such as inertia, center of gravity, equilibrium, and stability of structures, and in architectural problems. The moment of inertia of a body about a point is the sum, over each particle in the body, of the product of the particle's mass and the square of its distance from the point. The angular momentum of a body about a fixed axis is equal to the product of its momentum and the length of the moment arm (the distance from the body to the axis). A torque acting on a rigid body changes its angular momentum by producing an angular acceleration.
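
The door example can be put in numbers. A minimal sketch, with hypothetical values for the force and the lever arm:

```python
# Moment (torque) of a force about a point: force magnitude times the
# perpendicular distance from the point to the force's line of action.
force = 40.0     # newtons (hypothetical push on a door)
lever_arm = 0.8  # metres from the hinge to the line of action

moment = force * lever_arm  # newton-metres
print(moment)  # 32.0
```

Halving the lever arm would require doubling the force to produce the same turning effect.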

The moment of inertia is the quantitative measure of the rotational inertia of a body. As a body spins about an axis, external or internal, fixed or unfixed, it opposes any change in its speed of rotation that a torque may cause. The moment of inertia is defined as the sum of the products obtained by multiplying the mass of each particle of matter in the body by the square of its distance from the axis of rotation.
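
The defining sum can be sketched directly for a handful of point masses (the masses and distances below are assumed purely for illustration):

```python
# Hypothetical point particles: (mass in kg, distance from the rotation axis in m).
particles = [(2.0, 0.5), (1.0, 1.0), (3.0, 0.25)]

# Moment of inertia: sum over particles of mass times distance squared.
moment_of_inertia = sum(m * r**2 for m, r in particles)
print(moment_of_inertia)  # 1.6875
```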

In probability theory and statistics, the moment-generating function of a random variable X is

$M_X(t) = \operatorname{E}\left(e^{tX}\right), \quad t \in \mathbb{R},$

wherever this expectation exists. The moment-generating function generates the moments of the probability distribution.
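
For a concrete instance of the definition, the MGF of a Bernoulli variable (an assumed example, not from the text) follows directly from the expectation over its two outcomes:

```python
import math

def mgf_bernoulli(t, p):
    # E[e^{tX}] for X ~ Bernoulli(p): outcome 0 with prob 1-p, outcome 1 with prob p.
    return (1 - p) * math.exp(t * 0) + p * math.exp(t * 1)

# M_X(0) = E[1] = 1 holds for any distribution.
print(mgf_bernoulli(0.0, 0.3))  # 1.0
```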

## Calculation

If X has a continuous probability density function f(x), then the moment-generating function is given by

$M_X(t) = \int_{-\infty}^\infty e^{tx} f(x)\,\mathrm{d}x$
$= \int_{-\infty}^\infty \left(1 + tx + \frac{t^2 x^2}{2!} + \cdots\right) f(x)\,\mathrm{d}x$
$= 1 + t m_1 + \frac{t^2 m_2}{2!} + \cdots,$

where $m_i$ is the $i$th moment. $M_X(-t)$ is just the two-sided Laplace transform of f(x).
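
The series expansion can be checked numerically for a discrete variable; a fair six-sided die is an assumed example here, with the truncated series compared against the direct expectation at a small t:

```python
import math

# Fair six-sided die (assumed example): outcomes 1..6, each with probability 1/6.
outcomes = range(1, 7)
p = 1 / 6
t = 0.1

# Direct expectation E[e^{tX}].
direct = sum(p * math.exp(t * x) for x in outcomes)

# Truncated series 1 + t*m1 + t^2*m2/2! + ... using raw moments m_k = E[X^k].
moments = [sum(p * x**k for x in outcomes) for k in range(11)]
series = sum(t**k * moments[k] / math.factorial(k) for k in range(11))

print(direct, series)  # the two agree to many decimal places for small t
```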

Whether or not the probability distribution is continuous, the moment-generating function is given by the Riemann–Stieltjes integral

$M_X(t) = \int_{-\infty}^\infty e^{tx}\,\mathrm{d}F(x)$

where F is the cumulative distribution function.

If $X_1, X_2, \ldots, X_n$ is a sequence of independent (and not necessarily identically distributed) random variables, and

$S_n = \sum_{i=1}^n a_i X_i,$

where the $a_i$ are constants, then the probability density function for $S_n$ is the convolution of the probability density functions of each of the $X_i$ and the moment-generating function for $S_n$ is given by

$M_{S_n}(t) = M_{X_1}(a_1 t)\, M_{X_2}(a_2 t) \cdots M_{X_n}(a_n t).$
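
The product rule can be verified for a sum of independent Bernoulli variables (an assumed example), whose sum has a binomial distribution with a known closed-form MGF:

```python
import math

def mgf_bernoulli(t, p):
    # E[e^{tX}] for X ~ Bernoulli(p).
    return (1 - p) + p * math.exp(t)

# S = X_1 + ... + X_n with the X_i independent Bernoulli(p) and all a_i = 1.
n, p, t = 5, 0.4, 0.3
product = math.prod(mgf_bernoulli(t, p) for _ in range(n))

# S is Binomial(n, p), whose MGF has the closed form ((1-p) + p*e^t)^n.
binomial_mgf = ((1 - p) + p * math.exp(t)) ** n
print(product, binomial_mgf)  # identical up to floating-point rounding
```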

For vector-valued random variables X with real components, the moment-generating function is given by

$M_X(\mathbf{t}) = \operatorname{E}\left(e^{\langle \mathbf{t}, \mathbf{X} \rangle}\right)$

where $\mathbf{t}$ is a vector and $\langle \mathbf{t}, \mathbf{X} \rangle$ is the dot product.
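
When the components of X are independent, the expectation of the exponential of the dot product factors into one-dimensional MGFs, one per component. A sketch with two independent normal components (means and standard deviations assumed for illustration):

```python
import math

def mgf_normal(t, mu, sigma):
    # MGF of N(mu, sigma^2): exp(mu*t + sigma^2 * t^2 / 2) (standard closed form).
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

t_vec = (0.2, -0.1)                    # the vector t
components = [(0.0, 1.0), (1.0, 2.0)]  # (mu, sigma) for each independent coordinate

mgf_value = math.prod(
    mgf_normal(ti, mu, sigma) for ti, (mu, sigma) in zip(t_vec, components)
)
print(mgf_value)
```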

## Significance

Provided the moment-generating function exists in an open interval around $t = 0$, the $n$th moment is given by

$\operatorname{E}\left(X^n\right) = M_X^{(n)}(0) = \left.\frac{\mathrm{d}^n M_X(t)}{\mathrm{d}t^n}\right|_{t=0}.$

If the moment-generating function is finite in such an interval, then it uniquely determines a probability distribution.
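
The derivative formula can be sketched numerically: for an exponential distribution (an assumed example) the MGF has the closed form λ/(λ − t), and a central finite difference at t = 0 recovers the first moment 1/λ:

```python
def mgf_exponential(t, lam=2.0):
    # MGF of Exponential(rate=lam): lam / (lam - t), defined for t < lam.
    return lam / (lam - t)

# Estimate E[X] = M'(0) with a central finite difference (a numerical
# stand-in for the exact derivative).
h = 1e-5
first_moment = (mgf_exponential(h) - mgf_exponential(-h)) / (2 * h)
print(first_moment)  # close to 1/lam = 0.5
```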

Related to the moment-generating function are a number of other transforms that are common in probability theory, including the characteristic function and the probability-generating function.

The cumulant-generating function is the logarithm of the moment-generating function.
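
As an illustration of that relationship, the second derivative of the cumulant-generating function at t = 0 is the variance. A sketch with a normal distribution (parameters assumed) and a second-order finite difference:

```python
import math

def mgf_normal(t, mu=1.0, sigma=3.0):
    # MGF of N(mu, sigma^2): exp(mu*t + sigma^2 * t^2 / 2).
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

def cgf(t):
    # Cumulant-generating function: the logarithm of the MGF.
    return math.log(mgf_normal(t))

# K''(0) is the second cumulant, i.e. the variance; estimate it with a
# second-order central difference.
h = 1e-4
variance = (cgf(h) - 2 * cgf(0.0) + cgf(-h)) / h**2
print(variance)  # close to sigma^2 = 9
```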