Boltzmann entropy

The entropy defined as Boltzmann’s constant times the logarithm of the number of accessible microstates.

The Boltzmann entropy of a macrostate is

$$S = k_B \ln \Omega$$

where $\Omega$ is the number of microstates compatible with the macroscopic constraints, and $k_B$ is Boltzmann’s constant.
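
As a minimal sketch, assuming a toy system of $N$ independent two-state particles (so that $\Omega = 2^N$), the formula can be evaluated directly; the particle number and the SI value of $k_B$ below are illustrative choices, not part of the definition.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K

def boltzmann_entropy(ln_omega: float) -> float:
    """Return S = k_B * ln(Omega), taking ln(Omega) directly to avoid overflow."""
    return k_B * ln_omega

# Assumed toy model: N independent two-state particles, so Omega = 2**N.
N = 100
ln_omega = N * math.log(2)          # ln(2**N) = N ln 2
print(boltzmann_entropy(ln_omega))  # ~9.57e-22 J/K
```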

Interpretation

Entropy measures the logarithm of the phase space volume (or state count) accessible to the system. Higher entropy means more microscopic configurations are compatible with the observed macroscopic state.
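
To make this concrete, one can count the microstates compatible with a simple assumed macrostate: $N$ coins whose macrostate is the total number of heads $n$, for which $\Omega = \binom{N}{n}$. The sketch below (a toy illustration, not part of the standard presentation) shows that the half-heads macrostate, being compatible with the most arrangements, has the highest entropy.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy_of_macrostate(N: int, n: int) -> float:
    """S = k_B ln Omega with Omega = C(N, n), via log-gamma for numerical stability."""
    ln_omega = math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)
    return k_B * ln_omega

# The half-heads macrostate is compatible with the most microstates,
# so it carries the largest Boltzmann entropy.
print(entropy_of_macrostate(1000, 500) > entropy_of_macrostate(1000, 100))  # True
```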

Properties

  • Extensivity: For independent subsystems, $\Omega_{\text{tot}} = \Omega_1 \Omega_2$, so $S_{\text{tot}} = S_1 + S_2$ (see the numerical sketch after this list).
  • Non-negativity: $S \geq 0$ (since $\Omega \geq 1$).
  • Maximum at equilibrium: Isolated systems evolve toward states of maximum entropy.
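
As a small numerical check of the extensivity property, assuming arbitrary microstate counts $\Omega_1$ and $\Omega_2$ for two independent subsystems:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Assumed microstate counts for two independent subsystems.
omega_1, omega_2 = 10**6, 10**9

S_1 = k_B * math.log(omega_1)
S_2 = k_B * math.log(omega_2)
S_tot = k_B * math.log(omega_1 * omega_2)  # Omega_tot = Omega_1 * Omega_2

print(math.isclose(S_tot, S_1 + S_2))  # True: entropies of independent parts add
```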

Relation to Gibbs entropy

For the microcanonical ensemble, in which the distribution over accessible microstates is uniform, the Gibbs entropy $S = -k_B \sum_i p_i \ln p_i$ reduces to the Boltzmann form.
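
The reduction is a one-line calculation: with $p_i = 1/\Omega$ for each of the $\Omega$ accessible microstates,

$$S = -k_B \sum_{i=1}^{\Omega} \frac{1}{\Omega} \ln \frac{1}{\Omega} = k_B \ln \Omega.$$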

Historical note

Boltzmann’s formula $S = k \log W$ is inscribed on his tombstone in Vienna.