Differential entropy

The entropy of a continuous distribution, defined as the negative integral of the density times its log-density.
The differential entropy is a number h(X) associated with a real-valued random variable X that admits a density f (with respect to Lebesgue measure), defined by

h(X) \;=\; -\int_{\mathbb{R}} f(x)\,\log f(x)\,dx,

where the integral is understood as a Lebesgue integral (and \log is typically the natural logarithm, so h(X) is measured in nats).
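
The integral can be evaluated numerically for a concrete density. The following is a minimal Python sketch, assuming SciPy is available; the helper name `differential_entropy` is hypothetical, and the standard normal density is used purely as an example:

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

def differential_entropy(pdf, lo=-np.inf, hi=np.inf):
    """Evaluate h(X) = -integral of f(x) log f(x) dx by adaptive quadrature."""
    def integrand(x):
        fx = pdf(x)
        return -fx * np.log(fx) if fx > 0 else 0.0  # convention: 0 * log 0 = 0
    value, _abs_err = integrate.quad(integrand, lo, hi)
    return value

# Standard normal: the closed form is 0.5 * log(2*pi*e) ~ 1.4189
print(differential_entropy(norm.pdf))
```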

Differential entropy resembles the Shannon entropy of a discrete distribution but behaves differently: it can be negative, and it is not invariant under changes of variables. For instance, scaling X shifts h(X) by an additive constant: h(aX) = h(X) + \log|a|. It is used in continuous-information settings, for example in defining mutual information and relative entropy for continuous distributions.
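
A small check of that scaling behavior, using scipy.stats's built-in entropy method for the normal family (the value a = 0.1 is an arbitrary illustration; it also shows that differential entropy can be negative):

```python
import numpy as np
from scipy.stats import norm

a = 0.1
h_X = norm.entropy(scale=1.0)   # h(N(0, 1))     ~ 1.4189
h_aX = norm.entropy(scale=a)    # h(N(0, a^2))   ~ 1.4189 + log(0.1)
print(h_aX - h_X, np.log(a))    # the two agree: ~ -2.3026
print(h_aX)                     # ~ -0.8837: differential entropy need not be >= 0
```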

Examples:

  • If X \sim \mathcal{N}(\mu,\sigma^2), then h(X) = \tfrac12\log(2\pi e\,\sigma^2).
  • If X is uniform on [0,1], then h(X) = 0.
  • If X is uniform on [a,b], then h(X) = \log(b-a), which is negative whenever b-a < 1.
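
These closed forms can also be checked by Monte Carlo: since h(X) = -\mathbb{E}[\log f(X)], averaging -\log f over samples drawn from f estimates the entropy. A short sketch, assuming NumPy and SciPy; the sample size and seed are arbitrary choices:

```python
import numpy as np
from scipy.stats import norm, uniform

rng = np.random.default_rng(0)
n = 200_000

x = rng.normal(loc=0.0, scale=2.0, size=n)   # X ~ N(0, sigma^2 = 4)
print(-np.mean(norm.logpdf(x, scale=2.0)))   # ~ 0.5 * log(2*pi*e*4) ~ 2.112

u = rng.uniform(0.0, 1.0, size=n)            # X ~ Uniform[0, 1]
print(-np.mean(uniform.logpdf(u)))           # = 0 (log f is 0 on [0, 1])
```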