Q:

What is the definition of Chebyshev's theorem?

A:

Quick Answer

Chebyshev's theorem, or inequality, states that for any data set, the proportion of observations that lie within k standard deviations of the mean is at least 1 - (1/k²), where k equals the "within number" (the distance from the mean to the boundary of interest) divided by the standard deviation. For the bound to be meaningful, k must be greater than 1. The theorem provides a way to determine the minimum percentage of data that lies within a given number of standard deviations of the mean, regardless of how the data are distributed.
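
In symbols, a standard way to write the inequality is shown below; the notation X, μ, and σ is introduced here for illustration and is not part of the answer above.

```latex
% Chebyshev's inequality for a random variable X with mean \mu,
% standard deviation \sigma, and any k > 1:
P\left(\,|X - \mu| \ge k\sigma\,\right) \le \frac{1}{k^{2}},
\qquad \text{equivalently} \qquad
P\left(\,|X - \mu| < k\sigma\,\right) \ge 1 - \frac{1}{k^{2}}.
```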

Full Answer

The inequality comes from probability theory and can be applied to data in statistics. A typical problem using Chebyshev's theorem reads: "Find what percentage of values will fall between x and y for a data set with mean z and standard deviation a." For example: find what percentage of values will fall between 123 (x) and 179 (y) for a data set with a mean of 151 (z) and a standard deviation of 14 (a).

Subtract 123 from 151 to get 28, meaning 123 is 28 units below the mean; likewise, 179 - 151 = 28, so 179 is 28 units above it. The "within number" is therefore 28. Using the theorem, k equals 28/14, or 2. Plug 2 in for k, and the expression becomes 1 - (1/2²) = 1 - (1/4) = 3/4. Converted to a percentage, the answer is that at least 75 percent of the data values must lie within two standard deviations of the mean, that is, between 123 and 179.
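
The arithmetic above can be checked with a short script. The sketch below is illustrative only: the function name chebyshev_lower_bound and the assumption that the interval is symmetric about the mean are choices made here, not part of the original answer.

```python
def chebyshev_lower_bound(lower, upper, mean, std):
    """Minimum proportion of values in [lower, upper] guaranteed by
    Chebyshev's theorem, assuming the interval is symmetric about the mean."""
    # "Within number": distance from the mean to either endpoint.
    within = mean - lower
    if within != upper - mean:
        raise ValueError("interval must be symmetric about the mean")
    k = within / std              # number of standard deviations
    if k <= 1:
        raise ValueError("k must be greater than 1 for a useful bound")
    return 1 - 1 / k**2           # Chebyshev's guaranteed minimum proportion

# Worked example from the answer: 123 to 179, mean 151, standard deviation 14.
print(chebyshev_lower_bound(123, 179, 151, 14))  # 0.75, i.e. at least 75 percent
```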

Russian mathematician Pafnuty Chebyshev published the inequality in 1867, and his student Andrey Markov gave another proof in his 1884 doctoral dissertation.
