Saddle-Point (Laplace) Method in One Dimension

Refined asymptotics for integrals of exp(n f(x)) near a nondegenerate maximizer, giving Gaussian prefactors beyond the Laplace principle.

This lemma refines the Laplace principle by identifying the leading prefactor when the maximum is attained at a nondegenerate interior point.

Statement (one-dimensional nondegenerate maximum)

Let $f:[a,b]\to\mathbb{R}$ be twice continuously differentiable and suppose:

  • $f$ has a unique global maximum at an interior point $x_0\in(a,b)$,
  • $f''(x_0)<0$ (nondegenerate maximum),
  • $g:[a,b]\to\mathbb{R}$ is continuous with $g(x_0)\neq 0$.

Define

$$ I_n = \int_a^b e^{n f(x)}\, g(x)\,dx. $$

Then, as $n\to\infty$,

$$ I_n = e^{n f(x_0)}\, g(x_0)\, \sqrt{\frac{2\pi}{n\,|f''(x_0)|}}\, \big(1+o(1)\big). $$

Under higher smoothness assumptions on $f$ and $g$, one can obtain a full asymptotic expansion in powers of $1/n$.
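As a quick numerical sanity check (not part of the statement), one can compare $I_n$ computed by quadrature with the leading term for a concrete choice satisfying the hypotheses, say $f(x)=\cos x$ and $g(x)=1+x^2$ on $[-1,2]$, where $x_0=0$ and $f''(0)=-1$. The ratio $I_n /$ leading term should tend to $1$, with a correction of order $1/n$:

```python
import math

def simpson(fn, a, b, m=200_001):
    """Composite Simpson's rule on [a, b] with m (odd) sample points."""
    h = (b - a) / (m - 1)
    s = fn(a) + fn(b)
    for i in range(1, m - 1):
        s += (4 if i % 2 else 2) * fn(a + i * h)
    return s * h / 3

def laplace_leading(n, f_x0, g_x0, fpp_x0):
    """Leading-order saddle-point approximation e^{n f(x0)} g(x0) sqrt(2*pi/(n|f''(x0)|))."""
    return math.exp(n * f_x0) * g_x0 * math.sqrt(2 * math.pi / (n * abs(fpp_x0)))

# f(x) = cos x on [-1, 2] has its unique interior maximum at x0 = 0
# with f''(0) = -1; take g(x) = 1 + x^2, so g(0) = 1 != 0.
f = math.cos
g = lambda x: 1.0 + x * x
x0, fpp = 0.0, -1.0

ratios = {}
for n in (10, 100, 500):
    exact = simpson(lambda x: math.exp(n * f(x)) * g(x), -1.0, 2.0)
    ratios[n] = exact / laplace_leading(n, f(x0), g(x0), fpp)
    print(f"n = {n:4d}: I_n / leading term = {ratios[n]:.5f}")
```

The ratio approaches $1$ as $n$ grows, and the deviation shrinks roughly like $1/n$, consistent with the expansion mentioned above.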

Key hypotheses and conclusions

Hypotheses

  • A unique interior maximizer $x_0$ for $f$.
  • Nondegeneracy: $f''(x_0)<0$.
  • Mild regularity of $g$ and $f$ near $x_0$.

Conclusions

  • The integral is asymptotically Gaussian around $x_0$ after Taylor expansion of $f$: the leading exponential growth is $e^{n f(x_0)}$, and the subexponential prefactor is of order $n^{-1/2}$.
  • This yields a quantitative refinement of the Laplace principle: $\frac{1}{n}\log I_n \to f(x_0)$.
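The Gaussian reduction behind these conclusions can be sketched in a few lines, suppressing the error-control details (splitting off the tails outside a window $|x - x_0| \le \delta$ and bounding the Taylor remainder):

```latex
\begin{aligned}
I_n &= \int_a^b e^{n f(x)}\, g(x)\,dx
  \;\approx\; e^{n f(x_0)} \int_{x_0-\delta}^{x_0+\delta}
    e^{-\frac{n}{2}\,|f''(x_0)|\,(x-x_0)^2}\, g(x_0)\,dx \\
  &\approx\; e^{n f(x_0)}\, g(x_0) \int_{-\infty}^{\infty}
    e^{-\frac{n}{2}\,|f''(x_0)|\,u^2}\,du
  \;=\; e^{n f(x_0)}\, g(x_0)\, \sqrt{\frac{2\pi}{n\,|f''(x_0)|}} .
\end{aligned}
```

Taking $\frac{1}{n}\log$ of the result recovers the Laplace principle $\frac{1}{n}\log I_n \to f(x_0)$, since the prefactor contributes only $O(\frac{\log n}{n})$.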

In statistical mechanics, saddle-point estimates justify mean-field and variational approximations of partition functions (for example, after introducing an order parameter or via Hubbard–Stratonovich transforms) and they often explain the appearance of Gaussian fluctuations around equilibrium points.