Differential entropy
The entropy of a continuous distribution defined via an integral of the log-density.
The differential entropy is a number associated to a real-valued random variable $X$ that admits a density $f$ (with respect to Lebesgue measure), defined by

$$h(X) = -\int_{-\infty}^{\infty} f(x)\,\log f(x)\,dx,$$
where the integral is understood as a Lebesgue integral (and $\log$ is typically the natural logarithm, so that entropy is measured in nats).
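To make the definition concrete, here is a minimal Python sketch (the helper name `differential_entropy` is ours, for illustration) that approximates $h(X)$ by numerical quadrature and checks it on the standard normal density:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def differential_entropy(pdf, lower=-np.inf, upper=np.inf):
    """Approximate h(X) = -integral of f(x) log f(x) dx by quadrature."""
    def integrand(x):
        fx = pdf(x)
        # use the convention 0 * log 0 = 0
        return -fx * np.log(fx) if fx > 0 else 0.0
    value, _ = quad(integrand, lower, upper)
    return value

# Standard normal: exact value is (1/2) log(2*pi*e) ~ 1.4189 nats
print(differential_entropy(norm.pdf))
```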
Differential entropy resembles Shannon entropy but behaves differently: it can be negative, and it is not invariant under changes of variables; for instance, scaling by a constant $a \neq 0$ shifts the entropy by an additive constant, $h(aX) = h(X) + \log|a|$. It is used in continuous-information settings and in the maximum entropy principle for continuous distributions.
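The scaling behaviour can be verified numerically; a sketch using scipy.stats, whose `entropy()` method on a continuous distribution returns its differential entropy in nats:

```python
import numpy as np
from scipy.stats import norm

sigma, a = 1.5, 3.0
h_X  = norm(scale=sigma).entropy()       # h(X) for X ~ N(0, sigma^2)
h_aX = norm(scale=a * sigma).entropy()   # aX ~ N(0, (a*sigma)^2)
# scaling shifts differential entropy by log|a|
print(np.isclose(h_aX, h_X + np.log(abs(a))))  # True
```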
Examples:
- If $X \sim \mathcal{N}(\mu, \sigma^2)$, then $h(X) = \tfrac{1}{2}\log(2\pi e \sigma^2)$.
- If $X$ is uniform on $[a, b]$, then $h(X) = \log(b - a)$.
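Both closed forms can be checked against scipy.stats (note that its uniform distribution on $[a, b]$ is parameterized as `loc=a`, `scale=b-a`):

```python
import numpy as np
from scipy.stats import norm, uniform

mu, sigma = 0.0, 2.0
a, b = 1.0, 4.0

# Gaussian: h(X) = (1/2) log(2*pi*e*sigma^2)
print(np.isclose(norm(loc=mu, scale=sigma).entropy(),
                 0.5 * np.log(2 * np.pi * np.e * sigma**2)))  # True

# Uniform on [a, b]: h(X) = log(b - a)
print(np.isclose(uniform(loc=a, scale=b - a).entropy(),
                 np.log(b - a)))  # True
```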