
# Minimum-variance unbiased estimator

In statistics, a uniformly minimum-variance unbiased estimator or minimum-variance unbiased estimator (often abbreviated UMVU or MVUE) is an unbiased estimator whose variance is no larger than that of any other unbiased estimator, for all possible values of the parameter. Consider estimation of $g(\theta)$ based on data $X_1, X_2, \ldots, X_n$ i.i.d. from some family of densities $p_\theta$, $\theta \in \Omega$, where $\Omega$ is the parameter space. An unbiased estimator $\delta(X_1, X_2, \ldots, X_n)$ of $g(\theta)$ is UMVU if, for all $\theta \in \Omega$,

$\mathrm{var}(\delta(X_1, X_2, \ldots, X_n)) \leq \mathrm{var}(\tilde{\delta}(X_1, X_2, \ldots, X_n))$

for any other unbiased estimator $\tilde{\delta}$.

If an unbiased estimator of $g(\theta)$ exists, then one can prove there is an essentially unique MVUE. Using the Rao–Blackwell theorem (combined with completeness, this is the Lehmann–Scheffé theorem), one can also show that finding the MVUE reduces to finding a complete sufficient statistic for the family $p_\theta$, $\theta \in \Omega$, and conditioning any unbiased estimator on it. Put formally, suppose $\delta(X_1, X_2, \ldots, X_n)$ is unbiased for $g(\theta)$, and that $T$ is a complete sufficient statistic for the family of densities. Then

$\eta(X_1, X_2, \ldots, X_n) = \mathrm{E}(\delta(X_1, X_2, \ldots, X_n) \mid T)$

is the MVUE of $g(\theta)$.
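A concrete sketch of this conditioning recipe (the Poisson setup below is an illustration, not part of the original text): for $X_1, \ldots, X_n$ i.i.d. Poisson($\lambda$), the crude unbiased estimator $\delta = \mathbf{1}\{X_1 = 0\}$ of $g(\lambda) = e^{-\lambda}$ can be conditioned on the complete sufficient statistic $T = \sum_i X_i$, which gives $\eta = \mathrm{E}(\delta \mid T) = ((n-1)/n)^T$. A Monte Carlo check:

```python
import numpy as np

# Illustrative example (not from the original): estimating
# g(lambda) = P(X = 0) = exp(-lambda) from X_1, ..., X_n i.i.d. Poisson(lambda).
# delta = 1{X_1 = 0} is unbiased but noisy; conditioning on the complete
# sufficient statistic T = sum(X_i) yields eta = ((n - 1) / n) ** T,
# since X_1 | T = t is Binomial(t, 1/n).
rng = np.random.default_rng(0)
lam, n, reps = 2.0, 10, 200_000

X = rng.poisson(lam, size=(reps, n))
delta = (X[:, 0] == 0).astype(float)  # crude unbiased estimator
T = X.sum(axis=1)
eta = ((n - 1) / n) ** T              # Rao-Blackwellized estimator

print(np.exp(-lam))                   # true value of g(lambda)
print(delta.mean(), eta.mean())       # both approximately exp(-lambda)
print(delta.var(), eta.var())         # eta has much smaller variance
```

Both estimators are unbiased, but the conditioned one has strictly smaller variance, as the Rao–Blackwell theorem guarantees.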

## Estimator selection

An efficient estimator need not exist, but if it does, it is the MVUE. Since the mean squared error (MSE) of an estimator $\delta$ is

$\mathrm{MSE}(\delta) = \mathrm{var}(\delta) + \mathrm{bias}(\delta)^{2},$

the MVUE minimizes MSE among unbiased estimators. In some cases, however, a biased estimator has lower MSE; see estimator bias.
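The decomposition above, and the fact that a biased estimator can win on MSE, can be checked numerically. A sketch under an assumed setup (normal data, estimating $\sigma^2$; this example is not in the original): the divisor-$(n-1)$ sample variance is unbiased, while the divisor-$n$ version is biased but has lower MSE for normal samples.

```python
import numpy as np

# Assumed setup for illustration: estimate sigma^2 from N(0, sigma^2) samples.
# ddof=1 gives the unbiased divisor-(n-1) estimator; ddof=0 gives the biased
# divisor-n estimator, which for normal data has lower MSE.
rng = np.random.default_rng(1)
sigma2, n, reps = 4.0, 5, 200_000

X = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2_unbiased = X.var(axis=1, ddof=1)   # divisor n - 1
s2_biased = X.var(axis=1, ddof=0)     # divisor n

for est in (s2_unbiased, s2_biased):
    mse = np.mean((est - sigma2) ** 2)
    decomp = est.var() + (est.mean() - sigma2) ** 2  # var + bias^2
    print(mse, decomp)  # the two columns agree: MSE = var + bias^2
```

Here the empirical MSE matches the variance-plus-squared-bias decomposition exactly (it is an algebraic identity for the empirical distribution), and the biased divisor-$n$ estimator comes out with the smaller MSE.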

## Example

Consider the data to be a single observation $X$ from an absolutely continuous distribution on $\mathbb{R}$ with density

$p_\theta(x) = \frac{\theta e^{-x}}{(1 + e^{-x})^{\theta + 1}}$

and we wish to find the UMVU estimator of

$g(\theta) = \frac{1}{\theta^{2}}.$

First we recognize that the density can be written as

$\frac{e^{-x}}{1 + e^{-x}} \exp\left(-\theta \log(1 + e^{-x}) + \log(\theta)\right),$

which is a one-parameter exponential family with sufficient statistic $T = \log(1 + e^{-X})$. In fact this is a full-rank exponential family, and therefore $T$ is complete sufficient. See exponential family for a derivation which shows

$\mathrm{E}(T) = \frac{1}{\theta}, \qquad \mathrm{var}(T) = \frac{1}{\theta^{2}}.$

Therefore, since $\mathrm{E}(T^{2}) = \mathrm{var}(T) + \mathrm{E}(T)^{2}$,

$\mathrm{E}(T^{2}) = \frac{2}{\theta^{2}}.$

Clearly $\delta(X) = \frac{T^{2}}{2}$ is unbiased for $g(\theta) = 1/\theta^{2}$, thus the UMVU estimator is

$\eta(X) = \mathrm{E}(\delta(X) \mid T) = \mathrm{E}\left(\frac{T^{2}}{2} \,\Big|\, T\right) = \frac{T^{2}}{2} = \frac{\left(\log(1 + e^{-X})\right)^{2}}{2}.$

This example illustrates a general principle: an unbiased estimator that is a function of a complete sufficient statistic is UMVU.
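The worked example can be verified by simulation. A minimal sketch (the sampler below is derived here, not given in the original): the CDF of $p_\theta$ is $F(x) = (1 + e^{-x})^{-\theta}$, so inverse-CDF sampling gives $X = -\log(U^{-1/\theta} - 1)$ for $U$ uniform on $(0,1)$.

```python
import numpy as np

# Monte Carlo check of the worked example. Integrating p_theta gives the CDF
# F(x) = (1 + e^{-x})^{-theta}, so X = -log(U^{-1/theta} - 1) with U ~ Uniform(0,1)
# samples from the density (an inverse-CDF sampler derived for this check).
rng = np.random.default_rng(2)
theta, reps = 3.0, 500_000

U = rng.uniform(size=reps)
X = -np.log(U ** (-1.0 / theta) - 1.0)
T = np.log1p(np.exp(-X))           # sufficient statistic T = log(1 + e^{-X})
eta = T ** 2 / 2                   # the UMVU estimator of 1 / theta^2

print(T.mean(), 1 / theta)         # E(T)   = 1/theta
print(T.var(), 1 / theta ** 2)     # var(T) = 1/theta^2
print(eta.mean(), 1 / theta ** 2)  # E(T^2 / 2) = 1/theta^2
```

The empirical mean of $T^2/2$ matches $1/\theta^2$, confirming the unbiasedness claim, and the moments of $T$ match the values quoted from the exponential-family derivation.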