# Min-entropy

In probability theory and information theory, the min-entropy of a discrete random variable $X$ with possible states (or outcomes) $1, \ldots, n$ and corresponding probabilities $p_1, \ldots, p_n$ is

$H_\infty(X) = \min_{i=1}^{n} \left(-\log p_i\right) = -\left(\max_i \log p_i\right) = -\log \max_i p_i$

The base of the logarithm is just a scaling constant; for a result in bits, use a base-2 logarithm. Thus, a distribution has a min-entropy of at least $b$ bits if no possible state has a probability greater than $2^{-b}$.
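As a minimal sketch, the bit-valued definition above can be computed directly (the function name is my own):

```python
import math

def min_entropy_bits(probs):
    """Min-entropy in bits: -log2 of the largest outcome probability."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -math.log2(max(probs))

# A biased coin with P(heads) = 0.75 has min-entropy -log2(0.75) ≈ 0.415 bits;
# a fair coin attains the maximum of 1 bit, since no state exceeds 2^{-1}.
print(min_entropy_bits([0.75, 0.25]))  # ≈ 0.415
print(min_entropy_bits([0.5, 0.5]))    # 1.0
```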

The min-entropy is always less than or equal to the Shannon entropy, with equality when all the probabilities $p_i$ are equal. Min-entropy is important in the theory of randomness extractors.
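The inequality and its equality case can be checked numerically (a small self-contained sketch; the example distributions are my own choice):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: -sum p_i * log2 p_i."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def min_entropy(probs):
    """Min-entropy in bits: -log2 of the largest probability."""
    return -math.log2(max(probs))

skewed = [0.5, 0.25, 0.125, 0.125]
uniform = [0.25] * 4

# Min-entropy never exceeds Shannon entropy ...
assert min_entropy(skewed) <= shannon_entropy(skewed)
# ... and the two coincide for a uniform distribution.
assert abs(min_entropy(uniform) - shannon_entropy(uniform)) < 1e-12
```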

The notation $H_\infty(X)$ derives from a parameterized family of Shannon-like entropy measures, the Rényi entropies,

$H_k(X) = -\log \sqrt[k-1]{\sum_i \left(p_i\right)^k}$
The case $k = 1$ gives Shannon entropy (as a limit, since the formula above is undefined at $k = 1$ itself). As $k$ is increased, more weight is given to the larger probabilities, and in the limit as $k \to \infty$ only the largest $p_i$ has any effect on the result, recovering the min-entropy.
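The convergence to min-entropy as $k$ grows can be illustrated numerically (a sketch; the rewriting $H_k = \frac{1}{1-k}\log_2\sum_i p_i^k$ is an equivalent form of the formula above):

```python
import math

def renyi_entropy(probs, k):
    """Rényi entropy of order k (k != 1), in bits."""
    return math.log2(sum(p ** k for p in probs)) / (1 - k)

probs = [0.5, 0.25, 0.25]
# The min-entropy of this distribution is -log2(0.5) = 1 bit.
# H_k decreases toward that value as k grows:
for k in (2, 5, 20, 100):
    print(k, renyi_entropy(probs, k))
```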
