In statistics, mean has two related meanings:

## Examples of means

### Arithmetic mean

The arithmetic mean is the "standard" average, often simply called the "mean".

### Geometric mean

The geometric mean is an average that is useful for sets of positive numbers that are interpreted according to their product rather than their sum (as is the case with the arithmetic mean), for example, rates of growth.

### Harmonic mean

The harmonic mean is an average which is useful for sets of numbers which are defined in relation to some unit, for example speed (distance per unit of time).

### Generalized means

#### Power mean

The generalized mean, also known as the power mean or Hölder mean, is an abstraction of the quadratic, arithmetic, geometric and harmonic means. It is defined for a set of n positive numbers x_{i} by

#### f-mean

This can be generalized further as the generalized f-mean

### Weighted arithmetic mean

The weighted arithmetic mean is used when one wants to combine average values from samples of the same population with different sample sizes:

### Truncated mean

Sometimes a set of numbers might contain outliers, i.e., values much lower or much higher than the others.
Often, outliers are erroneous data caused by artifacts. In this case one can use a truncated mean: discard given parts of the data at the top or the bottom end, typically an equal amount at each end, and then take the arithmetic mean of the remaining data. The number of values removed is indicated as a percentage of the total number of values.
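The trimming procedure described above can be sketched in plain Python; the function name and the 20% trim level below are illustrative, not from the article:

```python
def truncated_mean(values, proportion):
    """Arithmetic mean after discarding the given proportion of sorted data at each end."""
    data = sorted(values)
    k = int(len(data) * proportion)  # number of values trimmed from each end
    trimmed = data[k:len(data) - k] if k else data
    return sum(trimmed) / len(trimmed)

# trimming 20% from each end of five values discards one value per end,
# so the outlier 100 no longer dominates the result
print(truncated_mean([1, 2, 3, 4, 100], 0.2))  # 3.0
```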
### Interquartile mean

The interquartile mean is a specific example of a truncated mean. It is simply the arithmetic mean after removing the lowest and the highest quarter of values.
### Mean of a function

In calculus, and especially multivariable calculus, the mean of a function is loosely defined as the average value of the function over its domain. In one variable, the mean of a function f(x) over the interval (a,b) is defined by

### Mean of angles

### Other means

## Properties

### Weighted mean

### Unweighted mean

### Convert unweighted mean to weighted mean

### Means of tuples of different sizes

## Population and sample means

The mean of a population is denoted μ and is known as the population mean. The sample mean is a good estimator of the population mean, since its expected value equals the population mean. The sample mean of a population is a random variable, not a constant, and consequently it has its own distribution. For a random sample of n observations from a normally distributed population, the distribution of the sample mean is

## Mathematics education

In many state and government curriculum standards, students are traditionally expected to learn either the meaning of or the formula for the mean by the fourth grade. However, in many standards-based mathematics curricula, students are encouraged to invent their own methods and may not be taught the traditional method. Reform-based texts such as TERC in fact discourage teaching the traditional "add the numbers and divide by the number of items" method in favor of spending more time on the concept of the median, which does not require division. However, the mean can be computed with a simple four-function calculator, while finding the median requires sorting the data. The same teacher guide devotes several pages to finding the median of a set, which is judged to be simpler than finding the mean.
## See also

## References

## External links

- the arithmetic mean (as distinguished from the geometric mean or harmonic mean).
- the expected value of a random variable, which is also called the population mean.

It is sometimes stated that "mean" means average. This is incorrect if "mean" is taken in the specific sense of "arithmetic mean", as there are different types of average: the mean, the median, and the mode. For instance, average house prices almost always use the median value as the average.

For a real-valued random variable X, the mean is the expectation of X. Note that not every probability distribution has a defined mean (or variance); see the Cauchy distribution for an example.

For a data set, the mean is the sum of the observations divided by the number of observations. The mean is often quoted along with the standard deviation: the mean describes the central location of the data, and the standard deviation describes the spread.

An alternative measure of dispersion is the mean deviation, equivalent to the average absolute deviation from the mean. It is less sensitive to outliers, but less mathematically tractable.

Beyond statistics, means are often used in geometry and analysis; a wide range of means has been developed for these purposes, many of which are not much used in statistics. These are listed below.

- $\bar{x} = \frac{1}{n}\sum_{i=1}^n x_i$

The mean may often be confused with the median or mode. The mean is the arithmetic average of a set of values, or distribution; however, for skewed distributions, the mean is not necessarily the same as the middle value (median), or the most likely (mode). For example, mean income is skewed upwards by a small number of people with very large incomes, so that the majority have an income lower than the mean. By contrast, the median income is the level at which half the population is below and half is above. The mode income is the most likely income, and favors the larger number of people with lower incomes. The median or mode are often more intuitive measures of such data.

Nevertheless, many skewed distributions are best described by their mean, such as the exponential and Poisson distributions.

For example, the arithmetic mean of six values: 34, 27, 45, 55, 22, 34 is:

- $\frac{34+27+45+55+22+34}{6} = \frac{217}{6} \approx 36.167.$
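The same computation can be done directly in Python:

```python
values = [34, 27, 45, 55, 22, 34]
arithmetic = sum(values) / len(values)  # 217 / 6
print(round(arithmetic, 3))  # 36.167
```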

- $\bar{x} = \left(\prod_{i=1}^n x_i\right)^{1/n}$

For example, the geometric mean of six values: 34, 27, 45, 55, 22, 34 is:

- $(34 \cdot 27 \cdot 45 \cdot 55 \cdot 22 \cdot 34)^{1/6} = 1{,}699{,}493{,}400^{1/6} \approx 34.545.$
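In Python, the same geometric mean can be computed with the standard library:

```python
import math

values = [34, 27, 45, 55, 22, 34]
geometric = math.prod(values) ** (1 / len(values))  # 1699493400 ** (1/6)
print(round(geometric, 3))  # 34.545
```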

- $\bar{x} = n \cdot \left(\sum_{i=1}^n \frac{1}{x_i}\right)^{-1}$

For example, the harmonic mean of the six values: 34, 27, 45, 55, 22, and 34 is

- $\frac{6}{\frac{1}{34}+\frac{1}{27}+\frac{1}{45}+\frac{1}{55}+\frac{1}{22}+\frac{1}{34}} = \frac{60588}{1835} \approx 33.0179836.$
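Again, the harmonic mean of the six values follows directly from the definition:

```python
values = [34, 27, 45, 55, 22, 34]
harmonic = len(values) / sum(1 / x for x in values)  # n over the sum of reciprocals
print(round(harmonic, 3))  # 33.018
```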

- $\bar{x}(m) = \left(\frac{1}{n}\sum_{i=1}^n x_i^m\right)^{1/m}$

By choosing the appropriate value for the parameter m, we get:

| Parameter | Mean |
| --- | --- |
| $m \to \infty$ | maximum |
| $m = 2$ | quadratic mean |
| $m = 1$ | arithmetic mean |
| $m \to 0$ | geometric mean |
| $m = -1$ | harmonic mean |
| $m \to -\infty$ | minimum |
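A minimal sketch of the power mean, handling the $m \to 0$ limit as the geometric mean (the function name is illustrative):

```python
import math

def power_mean(values, m):
    """Power (Hölder) mean; m is the exponent parameter from the table above."""
    if m == 0:  # limiting case m -> 0: the geometric mean
        return math.exp(sum(math.log(x) for x in values) / len(values))
    return (sum(x ** m for x in values) / len(values)) ** (1 / m)

values = [34, 27, 45, 55, 22, 34]
# harmonic <= geometric <= arithmetic <= quadratic, as the table suggests
print([round(power_mean(values, m), 3) for m in (-1, 0, 1, 2)])
```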

- $\bar{x} = f^{-1}\left(\frac{1}{n}\sum_{i=1}^n f(x_i)\right)$

and again a suitable choice of an invertible $f$ will give:

| Function | Mean |
| --- | --- |
| $f(x) = x$ | arithmetic mean |
| $f(x) = \frac{1}{x}$ | harmonic mean |
| $f(x) = x^m$ | power mean |
| $f(x) = \ln x$ | geometric mean |
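The f-mean translates almost verbatim into code; the sketch below passes $f$ and its inverse explicitly (the function name is illustrative):

```python
import math

values = [34, 27, 45, 55, 22, 34]

def f_mean(values, f, f_inv):
    """Generalized f-mean: apply f, take the arithmetic mean, then invert."""
    return f_inv(sum(f(x) for x in values) / len(values))

# f(x) = ln x recovers the geometric mean; f(x) = x the arithmetic mean
geometric = f_mean(values, math.log, math.exp)
arithmetic = f_mean(values, lambda x: x, lambda x: x)
```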

- $\bar{x} = \frac{\sum_{i=1}^n w_i \cdot x_i}{\sum_{i=1}^n w_i}.$

The weights $w_i$ represent the sizes of the partial samples. In other applications, they represent a measure of how reliably each value should influence the mean.
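For instance, to combine two sample means weighted by their sample sizes (the figures below are hypothetical):

```python
def weighted_mean(values, weights):
    """Weighted arithmetic mean: sum of w_i * x_i divided by the sum of the weights."""
    return sum(w * x for w, x in zip(weights, values)) / sum(weights)

# two sample means of 10.0 and 20.0, from samples of size 30 and 70
print(weighted_mean([10.0, 20.0], [30, 70]))  # 17.0
```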

- $\bar{x} = \frac{2}{n} \sum_{i=n/4+1}^{3n/4} x_i$
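A minimal sketch of the interquartile mean, assuming for simplicity that n is divisible by four:

```python
def interquartile_mean(values):
    """Arithmetic mean of the middle half of the sorted data (n divisible by 4 assumed)."""
    data = sorted(values)
    quarter = len(data) // 4
    middle = data[quarter:len(data) - quarter]  # drop lowest and highest quarters
    return sum(middle) / len(middle)

# for eight values, the lowest two and highest two are discarded
print(interquartile_mean([1, 2, 3, 4, 5, 6, 7, 8]))  # 4.5
```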

- $\bar{f} = \frac{1}{b-a}\int_a^b f(x)\,dx.$

(See also mean value theorem.) In several variables, the mean over a relatively compact domain U in a Euclidean space is defined by

- $\bar{f} = \frac{1}{\mathrm{Vol}(U)}\int_U f.$

This generalizes the arithmetic mean. On the other hand, it is also possible to generalize the geometric mean to functions by defining the geometric mean of f to be

- $\exp\left(\frac{1}{\mathrm{Vol}(U)}\int_U \log f\right).$

More generally, in measure theory and probability theory either sort of mean plays an important role. In this context, Jensen's inequality places sharp estimates on the relationship between these two different notions of the mean of a function.

Most of the usual means fail on circular quantities, such as angles, times of day, or fractional parts of real numbers. For these quantities one needs a mean of circular quantities.
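One common way to compute a circular mean of angles is to average the corresponding unit vectors and take the angle of the resultant; a sketch (the function name is illustrative):

```python
import math

def circular_mean(angles_deg):
    """Mean direction: average the unit vectors for each angle, take the resultant's angle."""
    s = sum(math.sin(math.radians(a)) for a in angles_deg)
    c = sum(math.cos(math.radians(a)) for a in angles_deg)
    return math.degrees(math.atan2(s, c)) % 360

# 350 deg and 10 deg straddle zero: the circular mean is near 0,
# while the naive arithmetic mean would give the misleading value 180
m = circular_mean([350, 10])
```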

- Arithmetic-geometric mean
- Arithmetic-harmonic mean
- Cesàro mean
- Chisini mean
- Contraharmonic mean
- Elementary symmetric mean
- Geometric-harmonic mean
- Heinz mean
- Heronian mean
- Identric mean
- Least squares mean
- Lehmer mean
- Logarithmic mean
- Median
- Moving average
- Root mean square
- Stolarsky mean
- Weighted geometric mean
- Weighted harmonic mean
- Rényi's entropy (a generalized f-mean)

All means share some properties and additional properties are shared by the most common means. Some of these properties are collected here.

A weighted mean $M$ is a function which maps tuples of positive numbers to a positive number ($\mathbb{R}_{>0}^n \to \mathbb{R}_{>0}$).

- "Fixed point": $M(1,1,\dots,1) = 1$
- Homogeneity: $\forall\lambda\ \forall x\ M(\lambda\cdot x_1, \dots, \lambda\cdot x_n) = \lambda \cdot M(x_1, \dots, x_n)$

- (using vector notation: $\forall\lambda\ \forall x\ M(\lambda\cdot x) = \lambda \cdot M x$)

- Monotonicity: $\forall x\ \forall y\ (\forall i\ x_i \le y_i) \Rightarrow M x \le M y$

It follows that

- Boundedness: $\forall x\ M x \in [\min x, \max x]$
- Continuity: $\lim_{x\to y} M x = M y$

- Sketch of a proof: Because $\forall x\ \forall y\ \left(\|x-y\|_\infty \le \lambda\cdot\min y \Rightarrow (\forall i\ |x_i-y_i| \le \lambda\cdot y_i) \Rightarrow (\forall i\ x_i \le y_i+\lambda\cdot y_i) \Rightarrow M x \le (1+\lambda)\cdot M y\right)$, it follows that $\forall x\ \forall y\ Mx \ge My \Rightarrow \left(\forall \varepsilon>0\ \|x-y\|_\infty \le \frac{\varepsilon\cdot\min y}{M y} \Rightarrow |Mx-My| \le \varepsilon\right)$.

- There are means which are not differentiable. For instance, the maximum of a tuple is considered a mean (as an extreme case of the power mean, or as a special case of a median), but is not differentiable.
- All means listed above, with the exception of most of the generalized f-means, satisfy the presented properties.
- If $f$ is bijective, then the generalized f-mean satisfies the fixed-point property.
- If $f$ is strictly monotonic, then the generalized f-mean also satisfies the monotonicity property.
- In general, a generalized f-mean will not be homogeneous.

The above properties imply techniques to construct more complex means:

If $C, M_1, \dots, M_m$ are weighted means and $p$ is a positive real number, then $A$ and $B$ with

- $\forall x\ A x = C(M_1 x, \dots, M_m x)$

- $\forall x\ B x = \sqrt[p]{C(x_1^p, \dots, x_n^p)}$

are also weighted means.

Intuitively speaking, an unweighted mean is a weighted mean with equal weights. Since our definition of weighted mean above does not expose particular weights, equal weights must be asserted in a different way. A different view on homogeneous weighting is that the inputs can be swapped without altering the result.

Thus we define $M$ to be an unweighted mean if it is a weighted mean and the result is the same for each permutation $\pi$ of the inputs. Let $P$ be the set of permutations of $n$-tuples.

- Symmetry: $\forall x\ \forall \pi\in P\ M x = M(\pi x)$

Analogously to the weighted means, if $C$ is a weighted mean, $M_1, \dots, M_m$ are unweighted means, and $p$ is a positive real number, then $A$ and $B$ with

- $\forall x\ A x = C(M_1 x, \dots, M_m x)$

- $\forall x\ B x = \sqrt[p]{M_1(x_1^p, \dots, x_n^p)}$

are also unweighted means.

An unweighted mean can be turned into a weighted mean by repeating elements. This connection can also be used to state that a mean is the weighted version of an unweighted mean. Say you have the unweighted mean $M$ and want to weight the numbers by natural numbers $a_1,\dots,a_n$. (If the weights are rational, multiply them by the least common denominator.) Then the corresponding weighted mean $A$ is obtained by

- $A(x_1,\dots,x_n) = M(\underbrace{x_1,\dots,x_1}_{a_1}, x_2, \dots, x_{n-1}, \underbrace{x_n,\dots,x_n}_{a_n}).$
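The repetition construction above is easy to demonstrate in code (function names are illustrative):

```python
def unweighted_mean(values):
    """Plain arithmetic mean, serving as the unweighted mean M."""
    return sum(values) / len(values)

def weighted_mean_by_repetition(values, counts):
    """Repeat each value a_i times, then take the unweighted mean of the longer tuple."""
    repeated = [x for x, a in zip(values, counts) for _ in range(a)]
    return unweighted_mean(repeated)

# weighting 1 by 3 and 4 by 1 is the mean of the tuple (1, 1, 1, 4)
print(weighted_mean_by_repetition([1, 4], [3, 1]))  # 1.75
```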

If a mean $M$ is defined for tuples of several sizes, then one also expects that the mean of a tuple is bounded by the means of its partitions. More precisely:

- Given an arbitrary tuple $x$ which is partitioned into $y_1, \dots, y_k$, it holds that $M x \in \operatorname{conv}(M y_1, \dots, M y_k)$ (see convex hull).

- $\bar{x} \sim N\left(\mu, \frac{\sigma^2}{n}\right).$

Often, since the population variance is an unknown parameter, it is estimated by the mean sum of squares, which changes the distribution of the sample mean from a normal distribution to a Student's t distribution with n − 1 degrees of freedom.
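A small simulation illustrates the distribution of the sample mean; the population parameters below are made up for the demonstration, and with n = 25 the spread of the sample means should be roughly σ/√n = 0.4:

```python
import random
import statistics

random.seed(0)
mu, sigma, n = 5.0, 2.0, 25

# draw many samples of size n from N(mu, sigma^2) and record each sample mean
means = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(2000)]

# the sample means cluster around mu with spread roughly sigma / sqrt(n)
print(round(statistics.fmean(means), 2), round(statistics.stdev(means), 2))
```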

- Average, same as central tendency
- Descriptive statistics
- Kurtosis
- Median
- Mode (statistics)
- Summary statistics
- Law of averages
- Spherical mean
- For independent, identically distributed real-valued observations, the sample mean is an unbiased estimator of the population mean.

- An easy-to-follow guide to understanding & calculating the mean
- Comparison between arithmetic and geometric mean of two numbers

Wikipedia, the free encyclopedia © 2001-2006 Wikipedia contributors (Disclaimer)

This article is licensed under the GNU Free Documentation License.

Last updated on Thursday October 09, 2008 at 08:36:26 PDT (GMT -0700)

View this article at Wikipedia.org - Edit this article at Wikipedia.org - Donate to the Wikimedia Foundation
