Chebyshev's inequality

Upper bound on deviation probability using variance.

Chebyshev’s inequality: Let $X$ be a random variable with mean $\mu=\mathbb{E}[X]$ and finite variance $\sigma^2=\mathrm{Var}(X)$. Then for every $t>0$,

$$\mathbb{P}\bigl(|X-\mu|\ge t\bigr)\le \frac{\sigma^2}{t^2}.$$

Equivalently, for every $k>0$,

$$\mathbb{P}\bigl(|X-\mu|\ge k\sigma\bigr)\le \frac{1}{k^2}.$$
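
As a quick numerical sanity check, the following sketch (using NumPy; the Exponential(1) distribution, the sample size, and the values of $k$ are arbitrary choices made purely for illustration) compares the empirical tail probability $\mathbb{P}(|X-\mu|\ge k\sigma)$ with the bound $1/k^2$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential(1) has mean mu = 1 and variance sigma^2 = 1 (illustrative choice).
mu, sigma = 1.0, 1.0
samples = rng.exponential(scale=1.0, size=1_000_000)

for k in (1.5, 2.0, 3.0):
    # Empirical estimate of P(|X - mu| >= k*sigma)
    empirical = np.mean(np.abs(samples - mu) >= k * sigma)
    # Chebyshev's bound 1/k^2
    bound = 1.0 / k**2
    print(f"k = {k}: empirical tail {empirical:.4f}  <=  bound {bound:.4f}")
```

The empirical tail probabilities come out well below $1/k^2$, which is typical: Chebyshev’s bound is valid for every distribution with finite variance, and is therefore often far from tight for any particular one.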

Here $\mathbb{P}$ denotes the probability measure on the underlying probability space. Chebyshev’s inequality is a direct consequence of Markov’s inequality applied to the nonnegative random variable $(X-\mu)^2$, and it is a standard tool for proving the weak law of large numbers.
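
The derivation is a single line: since $(X-\mu)^2\ge 0$, Markov’s inequality with threshold $t^2$ gives

$$\mathbb{P}\bigl(|X-\mu|\ge t\bigr)=\mathbb{P}\bigl((X-\mu)^2\ge t^2\bigr)\le \frac{\mathbb{E}\bigl[(X-\mu)^2\bigr]}{t^2}=\frac{\sigma^2}{t^2}.$$

Setting $t=k\sigma$ (for $\sigma>0$) yields the equivalent form stated above.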