Shannon entropy
A measure of the uncertainty of a discrete random variable, defined in terms of its probability mass function.
The Shannon entropy of a discrete random variable $X$ with probability mass function $p_X$ is the number defined by
$$H(X) = -\sum_{x} p_X(x) \log p_X(x),$$
with the convention $0 \log 0 = 0$. (Unless stated otherwise, $\log$ denotes the natural logarithm; changing the base rescales $H(X)$ by a constant factor.)
Equivalently, $H(X)$ is the expectation of $-\log p_X(X)$ under the distribution of $X$. Shannon entropy is closely related to relative entropy (KL divergence) and is a central quantity in information theory.
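As an illustration, the defining sum translates directly into code. The sketch below is a minimal implementation assuming the distribution is given as a finite sequence of probabilities; the name `shannon_entropy` is chosen only for illustration.

```python
import math

def shannon_entropy(pmf):
    """Shannon entropy (in nats) of a finite probability mass function.

    `pmf` is assumed to be a sequence of non-negative probabilities summing to 1.
    Zero-probability terms are skipped, matching the convention 0 log 0 = 0.
    """
    return -sum(p * math.log(p) for p in pmf if p > 0)
```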
Examples:
- If $X$ is deterministic (it takes a single value with probability $1$), then $H(X) = 0$.
- If $X$ is uniform on a set of $n$ elements, then $H(X) = \log n$.
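These examples can be checked numerically; the snippet below is a small sanity check, assuming natural logarithms and, as a concrete case of the uniform example, $n = 4$ outcomes.

```python
import math

# Deterministic variable: a single outcome with probability 1.
print(-sum(p * math.log(p) for p in [1.0] if p > 0))   # -0.0, i.e. zero entropy

# Uniform distribution on n = 4 outcomes: entropy equals log n.
n = 4
pmf = [1.0 / n] * n
print(-sum(p * math.log(p) for p in pmf))   # 1.3862943611198906
print(math.log(n))                          # 1.3862943611198906
```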