# minimum wage

Wage rate established by collective bargaining or by government regulation, specifying the lowest rate at which workers may be employed. A legal minimum wage is one mandated by government for all workers in an economy, with few exceptions. Privately negotiated minimum wages determined by collective bargaining apply to a specific group of workers in the economy, usually in specific trades or industries. The modern minimum wage, combined with compulsory arbitration of labour disputes, first appeared in Australia and New Zealand in the 1890s. In 1909 Britain established trade boards to set minimum wage rates in certain trades and industries. The first minimum wage in the U.S. (which applied only to women) was enacted by Massachusetts in 1912. Minimum wage laws or agreements now exist in most nations.

In mathematics, a minimum is a point at which the value of a function is lowest. If the value there is less than or equal to all other values of the function, it is an absolute minimum; if it is merely less than the value at every nearby point, it is a relative, or local, minimum. In calculus, the derivative is zero or does not exist at a function's minimum point. See also maximum, optimization.
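As a small illustration of the derivative test (the function here is chosen arbitrarily): for $f(x) = (x-2)^2 + 1$, the derivative $f'(x) = 2(x-2)$ vanishes at $x = 2$, which is both a local and an absolute minimum.

```python
# Illustration of the derivative test at a minimum, using an
# arbitrary example function f(x) = (x - 2)^2 + 1.

def f(x):
    return (x - 2) ** 2 + 1

def f_prime(x):
    return 2 * (x - 2)

# The derivative is zero at the minimum point x = 2 ...
assert f_prime(2) == 0

# ... and f(2) = 1 is less than or equal to f at every sampled point,
# so here it is an absolute (not merely local) minimum.
assert all(f(2) <= f(x / 10) for x in range(-100, 101))
```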

In statistics, a uniformly minimum-variance unbiased estimator or minimum-variance unbiased estimator (often abbreviated UMVU or MVUE) is an unbiased estimator that has lower variance than any other unbiased estimator for all possible values of the parameter. Consider estimation of $g(\theta)$ based on data $X_1, X_2, \ldots, X_n$ i.i.d. from some family of densities $p_\theta, \theta \in \Omega$, where $\Omega$ is the parameter space. An unbiased estimator $\delta(X_1, X_2, \ldots, X_n)$ of $g(\theta)$ is UMVU if, for all $\theta \in \Omega$,

$\mathrm{var}(\delta(X_1, X_2, \ldots, X_n)) \leq \mathrm{var}(\tilde{\delta}(X_1, X_2, \ldots, X_n))$

for any other unbiased estimator $\tilde{\delta}$.

If an unbiased estimator of $g(\theta)$ exists, then one can prove there is an essentially unique MVUE. Using the Rao-Blackwell theorem one can also prove that determining the MVUE is simply a matter of finding a complete sufficient statistic for the family $p_\theta, \theta \in \Omega$ and conditioning any unbiased estimator on it. Put formally, suppose $\delta(X_1, X_2, \ldots, X_n)$ is unbiased for $g(\theta)$, and that $T$ is a complete sufficient statistic for the family of densities. Then

$\eta(X_1, X_2, \ldots, X_n) = \mathrm{E}(\delta(X_1, X_2, \ldots, X_n) \mid T)$

is the MVUE for $g(\theta)$.
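A Monte Carlo sketch of this conditioning step (the normal model, sample size, and seed below are illustrative choices, not from the text): for i.i.d. $N(\theta, 1)$ data, $\sum_i X_i$ is complete sufficient and $\mathrm{E}(X_1 \mid \bar{X}) = \bar{X}$, so Rao-Blackwellizing the naive unbiased estimator $X_1$ yields the sample mean, which is also unbiased but has far smaller variance.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 3.0, 20, 200_000

# i.i.d. N(theta, 1) samples; both estimators below are unbiased for theta.
x = rng.normal(theta, 1.0, size=(reps, n))

delta = x[:, 0]        # naive unbiased estimator: the first observation
eta = x.mean(axis=1)   # E(delta | sum of X_i) = sample mean (Rao-Blackwell)

print(delta.mean(), eta.mean())  # both close to theta = 3
print(delta.var(), eta.var())    # roughly 1.0 versus 1/n = 0.05
```

The variance reduction from conditioning on the sufficient statistic is a factor of $n$ here.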

## Estimator selection

An efficient estimator need not exist, but if it does, it is the MVUE. Since the mean squared error (MSE) of an estimator $\delta$ is

$\mathrm{MSE}(\delta) = \mathrm{var}(\delta) + \mathrm{bias}(\delta)^{2},$

the MVUE minimizes MSE among unbiased estimators. In some cases biased estimators have lower MSE; see estimator bias.
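A numerical sketch of this decomposition (the normal model, sample size, and seed are illustrative): the variance estimator that divides by $n$ is biased, yet it has lower MSE here than the unbiased divide-by-$(n-1)$ estimator.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2, n, reps = 4.0, 10, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

s2_unbiased = x.var(axis=1, ddof=1)  # divides by n - 1: unbiased
s2_biased = x.var(axis=1, ddof=0)    # divides by n: biased, smaller variance

def mse(est, truth):
    return np.mean((est - truth) ** 2)

# Check the decomposition MSE = var + bias^2 empirically for the biased one.
bias = s2_biased.mean() - sigma2
print(mse(s2_biased, sigma2), s2_biased.var() + bias ** 2)  # nearly equal

# The biased estimator has lower MSE than the unbiased one in this model.
print(mse(s2_biased, sigma2) < mse(s2_unbiased, sigma2))  # True
```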

## Example

Consider the data to be a single observation $X$ from an absolutely continuous distribution on $\mathbb{R}$ with density

$p_\theta(x) = \frac{\theta e^{-x}}{(1 + e^{-x})^{\theta + 1}}$

and we wish to find the UMVU estimator of

$g(\theta) = \frac{1}{\theta^{2}}.$

First we recognize that the density can be written as

$\frac{e^{-x}}{1 + e^{-x}} \exp\bigl(-\theta \log(1 + e^{-x}) + \log(\theta)\bigr)$

This is an exponential family with sufficient statistic $T = \log(1 + e^{-X})$. In fact it is a full-rank exponential family, and therefore $T$ is complete sufficient. See exponential family for a derivation which shows

$\mathrm{E}(T) = \frac{1}{\theta}, \qquad \mathrm{var}(T) = \frac{1}{\theta^{2}}.$
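One way to verify these moments directly (a sketch, not part of the cited derivation): integrating the density gives the CDF, from which the survival function of $T$ is exponential:

```latex
F(x) = \int_{-\infty}^{x} \frac{\theta e^{-t}}{(1 + e^{-t})^{\theta + 1}}\,dt
     = (1 + e^{-x})^{-\theta},
\qquad
P(T > t) = P\bigl(1 + e^{-X} > e^{t}\bigr)
         = F\bigl(-\log(e^{t} - 1)\bigr) = e^{-\theta t},
```

so $T \sim \mathrm{Exponential}(\theta)$, whose mean and variance are $1/\theta$ and $1/\theta^{2}$.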

Therefore

$\mathrm{E}(T^{2}) = \frac{2}{\theta^{2}}.$

Clearly $\delta(X) = \frac{T^{2}}{2}$ is unbiased, thus the UMVU estimator is

$\eta(X) = \mathrm{E}(\delta(X) \mid T) = \mathrm{E}\!\left(\frac{T^{2}}{2} \;\middle|\; T\right) = \frac{T^{2}}{2} = \frac{\bigl(\log(1 + e^{-X})\bigr)^{2}}{2}.$

This example illustrates that an unbiased function of the complete sufficient statistic will be UMVU.
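A Monte Carlo sanity check of the example (a sketch; the seed and sample count are arbitrary, and it relies on the fact, derivable from the CDF $F(x) = (1 + e^{-x})^{-\theta}$, that $T$ is $\mathrm{Exponential}(\theta)$ under $p_\theta$):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, reps = 2.0, 500_000

# T = log(1 + e^{-X}) ~ Exponential(rate theta) under p_theta,
# so we can sample T directly instead of X.
t = rng.exponential(scale=1.0 / theta, size=reps)

estimates = t ** 2 / 2   # the UMVU estimator T^2 / 2
print(estimates.mean())  # close to g(theta) = 1 / theta^2 = 0.25
```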