In probability theory or information theory, the min-entropy of a discrete random event $X$ with possible states (or outcomes) $1, \dots, n$ and corresponding probabilities $p_1, \dots, p_n$ is

$$H_\infty(X) = -\log \max_i p_i .$$
The base of the logarithm is just a scaling constant; for a result in bits, use a base-2 logarithm. Thus, a distribution has a min-entropy of at least $b$ bits if no possible state has a probability greater than $2^{-b}$.
The min-entropy is always less than or equal to the Shannon entropy; the two are equal exactly when all the probabilities $p_i$ are equal.
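The relationship above can be checked numerically. The following is a minimal sketch (the helper names `min_entropy` and `shannon_entropy` are illustrative, not from any particular library) comparing the two quantities in bits for a uniform and a skewed distribution:

```python
import math

def min_entropy(probs):
    """Min-entropy in bits: -log2 of the largest probability."""
    return -math.log2(max(probs))

def shannon_entropy(probs):
    """Shannon entropy in bits: -sum of p * log2(p) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: the two entropies coincide.
uniform = [0.25] * 4
print(min_entropy(uniform))      # → 2.0
print(shannon_entropy(uniform))  # → 2.0

# Skewed distribution: min-entropy is strictly smaller.
skewed = [0.5, 0.25, 0.125, 0.125]
print(min_entropy(skewed))       # → 1.0
print(shannon_entropy(skewed))   # → 1.75
```

The skewed case illustrates the conservatism of min-entropy: although the Shannon entropy is 1.75 bits, no state has probability above $2^{-1}$, so the distribution guarantees only 1 bit of min-entropy.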
Min-entropy is important in the theory of randomness extractors.
The notation $H_\infty$ derives from a parameterized family of Shannon-like entropy measures, the Rényi entropies

$$H_k(X) = \frac{1}{1-k} \log \left( \sum_i p_i^k \right),$$

whose limit as $k \to 1$ is the Shannon entropy. As $k$ is increased, more weight is given to the larger probabilities, and in the limit as $k \to \infty$, only the largest $p_i$ has any effect on the result.
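This convergence can be observed directly. Below is a small sketch (the function names are illustrative) evaluating the Rényi entropy of a fixed distribution for increasing orders $k$ and comparing against the min-entropy:

```python
import math

def renyi_entropy(probs, k):
    """Rényi entropy of order k (in bits), valid for k != 1."""
    return math.log2(sum(p ** k for p in probs)) / (1 - k)

def min_entropy(probs):
    """Limit of the Rényi entropy as k -> infinity."""
    return -math.log2(max(probs))

probs = [0.5, 0.25, 0.125, 0.125]
for k in (2, 5, 20, 100):
    # H_k decreases toward the min-entropy as k grows.
    print(k, renyi_entropy(probs, k))
print("min-entropy:", min_entropy(probs))  # → 1.0
```

For this distribution the values decrease monotonically toward 1.0 bit, the min-entropy determined by the largest probability 0.5.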