Thermodynamic entropy
Thermodynamic entropy $S$ is a state function defined on equilibrium states. It is introduced by the Clausius definition: for any reversible process,
$$dS = \frac{\delta Q_{\mathrm{rev}}}{T},$$
where $\delta Q_{\mathrm{rev}}$ is the heat increment along the reversible path and $T$ is the thermodynamic temperature.
A key fact (often called the Clausius theorem) is that for reversible cycles,
$$\oint \frac{\delta Q_{\mathrm{rev}}}{T} = 0,$$
so the integral $\int \delta Q_{\mathrm{rev}}/T$ between two equilibrium states is path-independent when taken over reversible paths. This path-independence is what makes $S$ a state function even though $\delta Q$ itself is an inexact differential.
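As a concrete check, the sketch below (Python; a monatomic ideal gas with illustrative values for $n$, $T$, and $V$ is assumed) integrates $\delta Q_{\mathrm{rev}}/T$ numerically along two different reversible paths between the same pair of equilibrium states and finds the same entropy change.

```python
# Numerical check of path-independence of int dQ_rev / T between two
# equilibrium states, assuming a monatomic ideal gas (illustrative values).
import numpy as np

R  = 8.314      # J/(mol K), gas constant
n  = 1.0        # mol
Cv = 1.5 * R    # molar heat capacity at constant V (monatomic ideal gas)

T1, V1 = 300.0, 1.0e-3   # initial state (K, m^3)
T2, V2 = 450.0, 2.5e-3   # final state

def trapz(y, x):
    """Simple trapezoidal rule, avoiding version-specific numpy names."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def dS_isothermal(T, Va, Vb, m=10_000):
    # Along an isotherm dU = 0, so dQ_rev = P dV = n R T dV / V and
    # dQ_rev / T = n R dV / V (independent of which isotherm T labels).
    V = np.linspace(Va, Vb, m)
    return trapz(n * R / V, V)

def dS_isochoric(Ta, Tb, m=10_000):
    # At constant V, dQ_rev = n Cv dT, so dQ_rev / T = n Cv dT / T.
    T = np.linspace(Ta, Tb, m)
    return trapz(n * Cv / T, T)

# Path A: isothermal expansion at T1, then isochoric heating at V2.
dS_A = dS_isothermal(T1, V1, V2) + dS_isochoric(T1, T2)
# Path B: isochoric heating at V1, then isothermal expansion at T2.
dS_B = dS_isochoric(T1, T2) + dS_isothermal(T2, V1, V2)

print(f"path A: {dS_A:.6f} J/K, path B: {dS_B:.6f} J/K")
# Both agree with the closed form n*Cv*ln(T2/T1) + n*R*ln(V2/V1).
print(f"closed form: {n*Cv*np.log(T2/T1) + n*R*np.log(V2/V1):.6f} J/K")
```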
Physical interpretation
Entropy measures the directionality of macroscopic change: it increases when constraints are relaxed and when processes are irreversible. In many contexts, it is useful to think of entropy as quantifying “energy dispersal” among microscopic degrees of freedom, and of entropy production as a measure of irreversibility.
Key properties and relations
Second law and isolated systems: for an isolated system (no heat or work exchange), the second law implies
$$\Delta S \ge 0,$$
with equality for an idealized reversible evolution.
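A standard illustration is two identical blocks at different temperatures equilibrating by heat exchange alone. The sketch below (illustrative values for the heat capacity and initial temperatures) computes $\Delta S = C \ln(T_f/T_i)$ for each block and confirms the total is positive.

```python
# Two identical blocks of constant heat capacity C, isolated as a pair,
# exchange heat until they reach a common temperature (illustrative values).
import numpy as np

C = 100.0             # J/K, heat capacity of each block
Th, Tc = 400.0, 300.0
Tf = 0.5 * (Th + Tc)  # final common temperature from energy conservation

# dS = int C dT / T = C ln(Tf / Ti) for each block
dS_hot  = C * np.log(Tf / Th)   # negative: the hot block cools
dS_cold = C * np.log(Tf / Tc)   # positive: the cold block warms

# Total is positive because Tf^2 > Th*Tc (arithmetic mean > geometric mean).
print(f"dS_total = {dS_hot + dS_cold:.4f} J/K  (>= 0)")
```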
Clausius inequality: for any cycle (not necessarily reversible),
$$\oint \frac{\delta Q}{T} \le 0,$$
with equality iff the cycle is reversible; this is the Clausius inequality in compact form.
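For a cyclic engine exchanging heat only at two fixed temperatures, the cycle integral reduces to $Q_{\mathrm{in}}/T_{\mathrm{hot}} - Q_{\mathrm{out}}/T_{\mathrm{cold}}$. The arithmetic sketch below (illustrative temperatures and heats) shows it vanishes exactly at the Carnot efficiency and is strictly negative for any lower efficiency.

```python
# Clausius inequality for a two-reservoir cyclic engine (illustrative values):
# the cycle integral of dQ/T is Q_in/T_hot - Q_out/T_cold.
T_hot, T_cold = 500.0, 300.0
Q_in = 1000.0                      # J absorbed at T_hot

eta_carnot = 1.0 - T_cold / T_hot  # reversible (Carnot) efficiency

for eta in (eta_carnot, 0.25, 0.10):   # reversible, then two irreversible cases
    Q_out = (1.0 - eta) * Q_in         # heat rejected at T_cold
    cycle_integral = Q_in / T_hot - Q_out / T_cold
    print(f"eta={eta:.3f}: cyclic dQ/T = {cycle_integral:+.4f} J/K")
# Output: zero at the Carnot efficiency, strictly negative below it.
```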
Entropy and temperature (definition of T): if equilibrium states are described by a fundamental relation in the entropy representation $S = S(U, V, N)$, then temperature is defined by
$$\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,N},$$
a partial derivative taken at fixed $V$ and $N$. The reciprocal temperature is often written as $\beta = 1/(k_B T)$ (inverse temperature) after introducing Boltzmann's constant $k_B$.
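To make the definition concrete, the symbolic sketch below (Python with sympy; the Sackur-Tetrode relation for a monatomic ideal gas is taken as an assumed example of a fundamental relation) differentiates $S(U, V, N)$ with respect to $U$ and recovers $1/T = 3Nk_B/(2U)$, i.e. $U = \tfrac{3}{2} N k_B T$.

```python
# Recovering 1/T = (dS/dU)_{V,N} symbolically from the Sackur-Tetrode
# entropy of a monatomic ideal gas (assumed illustrative example).
import sympy as sp

U, V, N, k, m, h = sp.symbols("U V N k_B m h", positive=True)

# Sackur-Tetrode fundamental relation S(U, V, N)
S = N * k * (sp.log((V / N) * (4 * sp.pi * m * U / (3 * N * h**2)) ** sp.Rational(3, 2))
             + sp.Rational(5, 2))

inv_T = sp.simplify(sp.diff(S, U))   # 1/T = (dS/dU) at fixed V, N
print(inv_T)                         # -> 3*N*k_B/(2*U)

# Solving 1/T = 3 N k_B / (2 U) for U gives U = (3/2) N k_B T,
# the familiar caloric equation of state of the monatomic ideal gas.
```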
Additivity and extensivity: for weakly interacting subsystems, entropy is additive, $S_{12} = S_1 + S_2$, aligning with the additivity postulate. In the thermodynamic limit, entropy typically scales with system size, $S(\lambda U, \lambda V, \lambda N) = \lambda\, S(U, V, N)$, making it an extensive variable (see the extensivity postulate and the scaling check below).
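Extensivity is the statement that $S$ is a homogeneous first-order function of its extensive arguments. The sketch below (sympy; again assuming the Sackur-Tetrode relation as an illustrative fundamental relation) verifies $S(aU, aV, aN) = a\,S(U, V, N)$ symbolically.

```python
# Symbolic check that the Sackur-Tetrode entropy is homogeneous of first
# order: S(a*U, a*V, a*N) = a * S(U, V, N), i.e. extensivity.
import sympy as sp

U, V, N, a, k, m, h = sp.symbols("U V N a k_B m h", positive=True)

def S(U, V, N):
    return N * k * (sp.log((V / N) * (4 * sp.pi * m * U / (3 * N * h**2)) ** sp.Rational(3, 2))
                    + sp.Rational(5, 2))

# The difference should simplify to zero identically.
print(sp.simplify(S(a * U, a * V, a * N) - a * S(U, V, N)))  # -> 0
```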
Stability (concavity): for stable equilibrium, $S(U, V, N)$ is concave in its extensive arguments; this is one formulation of entropy concavity and stability and is part of thermodynamic stability.
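The sketch below (sympy; Sackur-Tetrode again as the assumed example) computes the Hessian of $S$ in $(U, V)$ at fixed $N$ and finds it negative definite, consistent with concavity. (The full Hessian in all three extensive variables has one zero eigenvalue along the scaling direction, as extensivity requires.)

```python
# Concavity check: Hessian of the Sackur-Tetrode entropy in (U, V)
# at fixed N should be negative definite.
import sympy as sp

U, V, N, k, m, h = sp.symbols("U V N k_B m h", positive=True)
S = N * k * (sp.log((V / N) * (4 * sp.pi * m * U / (3 * N * h**2)) ** sp.Rational(3, 2))
             + sp.Rational(5, 2))

H = sp.hessian(S, (U, V))
print(H)  # diag(-3*N*k_B/(2*U**2), -N*k_B/V**2): both eigenvalues negative
```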
Statistical-mechanical connection: in equilibrium statistical mechanics, entropy can be related to microstate multiplicity via $S = k_B \ln \Omega$ (e.g., in the microcanonical setting) and, more generally, to the information-theoretic Shannon entropy of a probability distribution over microstates, $S = -k_B \sum_i p_i \ln p_i$, with $k_B$ setting the physical units.
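The sketch below (Python with numpy; the number of microstates is illustrative) evaluates the Gibbs-Shannon form and confirms that a uniform distribution over $\Omega$ microstates reproduces $k_B \ln \Omega$, and that a non-uniform distribution over the same microstates gives less.

```python
# Gibbs/Shannon entropy S = -k_B * sum_i p_i ln p_i; for a uniform
# distribution over Omega microstates it reduces to S = k_B ln Omega.
import numpy as np

k_B = 1.380649e-23  # J/K (exact, 2019 SI)

def gibbs_entropy(p):
    """S = -k_B sum p_i ln p_i; terms with p_i = 0 contribute nothing."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -k_B * float(np.sum(nz * np.log(nz)))

Omega = 1_000_000
uniform = np.full(Omega, 1.0 / Omega)
print(gibbs_entropy(uniform))   # equals k_B * ln(Omega)
print(k_B * np.log(Omega))      # same value

# A non-uniform distribution over the same microstates has lower entropy:
biased = np.full(Omega, 0.5 / (Omega - 1))
biased[0] = 0.5
print(gibbs_entropy(biased) < gibbs_entropy(uniform))  # True
```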
Third law (reference point): the third law constrains the behavior of $S$ as $T \to 0$ (often implying $S$ approaches a constant), fixing the otherwise arbitrary additive constant in practical conventions (see entropy normalization conventions).
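As a low-temperature illustration, the sketch below assumes a Debye-like heat capacity $C(T) = aT^3$ (the coefficient is illustrative) and integrates $C/T$ numerically, showing that $S(T) = \int_0^T C(T')/T'\, dT'$ remains finite and vanishes as $T \to 0$, so the additive constant can consistently be set to zero.

```python
# Third-law illustration: with a Debye-like C(T) = a*T**3 (illustrative a),
# S(T) = int_0^T C(T')/T' dT' is finite and vanishes as T -> 0.
import numpy as np

a = 1.0e-3   # J/K^4, illustrative Debye coefficient

def S_of_T(T, n=100_000):
    Tp = np.linspace(1e-12, T, n)       # lower limit avoids division by zero
    integrand = a * Tp**3 / Tp          # C(T')/T' = a*T'**2, integrable at 0
    return float(np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(Tp)))

for T in (10.0, 1.0, 0.1):
    print(f"T={T:5.2f} K  S={S_of_T(T):.3e} J/K  (exact a*T^3/3 = {a*T**3/3:.3e})")
# S -> 0 as T -> 0, consistent with the third law's reference point.
```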