Pinsker's inequality
An inequality bounding total variation distance by the square root of Kullback–Leibler divergence.
Pinsker’s inequality: Let \(P\) and \(Q\) be probability measures on the same measurable space \((\Omega, \mathcal{F})\). Then their total variation distance satisfies
\[
\delta(P, Q) \;=\; \sup_{A \in \mathcal{F}} \lvert P(A) - Q(A) \rvert \;\le\; \sqrt{\tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, Q)},
\]
where \(D_{\mathrm{KL}}(P \,\|\, Q)\) is the Kullback–Leibler divergence computed with natural logarithms (if another log base is used, the constant changes accordingly). The inequality is understood to hold trivially when \(D_{\mathrm{KL}}(P \,\|\, Q) = \infty\).
Pinsker’s inequality formalizes that small relative entropy forces two laws to be close in a strong, event-wise sense. Together with Gibbs’ inequality, it highlights KL divergence as a nonnegative discrepancy measure that quantitatively controls total variation distance.
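The bound can be checked numerically for discrete distributions. The sketch below (an illustration, not from the source; the helper names are my own) computes the total variation distance as half the L1 distance between probability vectors, the KL divergence in nats, and verifies Pinsker's inequality:

```python
import math

def tv_distance(p, q):
    # Total variation distance: sup_A |P(A) - Q(A)| = (1/2) * sum_i |p_i - q_i|
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    # KL divergence D(P || Q) in nats; infinite when P charges a point Q misses
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0:
            if qi == 0:
                return math.inf
            total += pi * math.log(pi / qi)
    return total

# Two nearby distributions on a three-point space
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

tv = tv_distance(p, q)                      # 0.1
bound = math.sqrt(0.5 * kl_divergence(p, q))
assert tv <= bound                          # Pinsker's inequality holds
```

For these vectors the bound is fairly tight (TV distance 0.1 against a bound of roughly 0.112), illustrating that for close distributions the square-root-of-KL control is not far from the truth. When Q assigns zero mass to a point where P does not, `kl_divergence` returns infinity and the inequality holds trivially, matching the convention stated above.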