Entropy normalization convention
Definition and what “normalization” fixes
The thermodynamic entropy $S$ is a state function determined (macroscopically) by the second law up to choices that do not affect measurable entropy changes. An “entropy normalization” convention fixes:
- Units / scale factor (whether $S$ is dimensional or scaled by a constant), and
- Additive constant (the reference point for “absolute” entropy).
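Concretely (a quick check, with notation of my choosing): if two conventions are related by a rescaling and a shift, $S' = a\,S + c$ with constants $a > 0$ and $c$, then every entropy change transforms as
$$\Delta S' = S'_2 - S'_1 = a\,\Delta S ,$$
so the additive constant $c$ never appears in measurable entropy changes, while $a$ is fixed by the choice of units (log base and the factor of $k_B$).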
Convention used in this blog
Logarithm choice: we use the natural logarithm, as stated in the logarithm convention. Changing the log base only rescales the entropy by a constant factor.
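As a small sketch of that rescaling (plain Python; the variable names are mine):

```python
import math

# Entropy of a fair coin, first with the natural log (nats, this blog's convention),
# then converted to bits; the change of log base is a constant overall factor.
p = [0.5, 0.5]
H_nats = -sum(pi * math.log(pi) for pi in p)   # ln-based Shannon entropy
H_bits = H_nats / math.log(2)                  # same quantity in log-base-2 units

print(H_nats)  # ~0.6931 nats
print(H_bits)  # 1.0 bit
```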
Boltzmann constant explicit by default: entropy is treated as a physical quantity with units of energy per temperature, keeping $k_B$ explicit unless natural units are declared.
Statistical-mechanical normalization (discrete states): for microstates $i$ with probabilities $p_i$, we take the Gibbs/Shannon-form normalization
$$S = -k_B \sum_i p_i \ln p_i .$$
The dimensionless quantity $H(p) = -\sum_i p_i \ln p_i$ is the Shannon entropy (in nats under our log choice), and the thermodynamic entropy is obtained by multiplying by $k_B$, so that $S = k_B\,H(p)$.
In the microcanonical special case ($p_i = 1/\Omega$ on the $\Omega$ accessible states), this reduces to the Boltzmann form $S = k_B \ln \Omega$.
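A self-contained sketch of this normalization in Python (the function name, the value of $\Omega$, and the example distribution are my own choices for illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (SI value)

def gibbs_entropy(probs, k_B=K_B):
    """Thermodynamic entropy S = -k_B * sum_i p_i ln p_i (terms with p_i = 0 contribute nothing)."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0.0)

# Microcanonical check: a uniform distribution over Omega accessible microstates
# reproduces the Boltzmann form S = k_B ln(Omega).
omega = 8
uniform = [1.0 / omega] * omega
assert math.isclose(gibbs_entropy(uniform), K_B * math.log(omega))

# Any non-uniform distribution on the same states has strictly lower entropy.
skewed = [0.7, 0.1, 0.1, 0.05, 0.05]
print(gibbs_entropy(skewed) < K_B * math.log(len(skewed)))  # True
```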
Absolute reference (third law): to fix the additive constant, we adopt the third-law convention that $S \to 0$ as $T \to 0$ for a perfect crystal with a nondegenerate ground state. More generally, a residual ground-state degeneracy $g_0$ corresponds to $S(T \to 0) = k_B \ln g_0$.
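As a worked number under this convention (my example): for one mole of a solid in which each of the $N_A$ molecules retains a two-fold degenerate ground state, $g_0 = 2^{N_A}$ and the residual entropy is
$$S(T \to 0) = k_B \ln 2^{N_A} = N_A k_B \ln 2 = R \ln 2 \approx 5.76\ \mathrm{J\,K^{-1}\,mol^{-1}} .$$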
Key implications and useful checks
Only differences matter in most thermodynamics: statements like the Clausius inequality $\Delta S \ge \int \delta Q / T$ constrain $\Delta S$ and do not depend on the absolute constant.
Dimensionless entropy in natural units: if $k_B = 1$ units are used, $S$ becomes dimensionless and temperature has energy units; the inverse temperature is then simply $\beta = 1/T$.
Connection to relative entropy: with this normalization, entropy differences and free-energy inequalities can often be expressed using the Kullback–Leibler divergence $D_{\mathrm{KL}}(p \,\|\, q)$ (a dimensionless measure), with factors of $k_B$ restoring thermodynamic units.
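One standard instance of this connection, sketched numerically in Python (the temperature, energy levels, and distributions below are placeholders I chose): the nonequilibrium free energy $F[p] = \langle E\rangle_p - T\,S[p]$ exceeds its equilibrium value $F_{\mathrm{eq}} = -k_B T \ln Z$ by exactly $k_B T$ times a KL divergence.

```python
import math

K_B = 1.380649e-23  # J/K

def kl_divergence(p, q):
    """Dimensionless D_KL(p || q) = sum_i p_i ln(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)

T = 300.0                         # temperature in K
E = [0.0, 1.0e-21, 2.0e-21]       # arbitrary example energy levels in J
beta = 1.0 / (K_B * T)
Z = sum(math.exp(-beta * e) for e in E)
p_eq = [math.exp(-beta * e) / Z for e in E]   # Boltzmann distribution

p = [0.5, 0.3, 0.2]               # some nonequilibrium distribution over the same states
S = -K_B * sum(pi * math.log(pi) for pi in p)
F = sum(pi * e for pi, e in zip(p, E)) - T * S
F_eq = -K_B * T * math.log(Z)

# F[p] - F_eq = k_B T * D_KL(p || p_eq), with k_B T supplying the thermodynamic units.
assert math.isclose(F - F_eq, K_B * T * kl_divergence(p, p_eq), rel_tol=1e-9)
```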