Second law of thermodynamics
Definition (entropy formulation)
For every equilibrium thermodynamic state of a thermodynamic system, the second law asserts the existence of a state function called the thermodynamic entropy $S$ such that, for any thermodynamic process taking the system from equilibrium state $1$ to equilibrium state $2$,
$$\Delta S = S_2 - S_1 \;\ge\; \int_1^2 \frac{\delta Q}{T_b}.$$
Here $\delta Q$ is the (inexact) heat absorbed by the system (see heat), and $T_b$ is the temperature at the boundary where that heat is exchanged (often set by a thermal reservoir).
Equality holds if and only if the process is reversible; strict inequality signals irreversibility.
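Because entropy is a state function, applying this definition to a cyclic process (same initial and final state, so $\Delta S = 0$) immediately yields the Clausius inequality; the following restatement is standard textbook material, added here for orientation:
$$\oint \frac{\delta Q}{T_b} \;\le\; 0,$$
with equality for a reversible cycle. For a reversible Carnot cycle absorbing $Q_H$ at $T_H$ and rejecting $Q_C$ at $T_C$, this reduces to $Q_H/T_H - Q_C/T_C = 0$.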
A particularly transparent form is obtained by treating “system + surroundings” as an isolated system: then the second law becomes
$$\Delta S_{\text{total}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \;\ge\; 0,$$
so the total entropy cannot decrease.
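As a worked illustration (a standard example, with $Q$, $T_H$, and $T_C$ introduced here rather than taken from the definition above): let a quantity of heat $Q > 0$ pass directly from a hot reservoir at temperature $T_H$ to a cold reservoir at $T_C < T_H$. The hot reservoir loses entropy $Q/T_H$ and the cold one gains $Q/T_C$, so
$$\Delta S_{\text{total}} = \frac{Q}{T_C} - \frac{Q}{T_H} = Q\,\frac{T_H - T_C}{T_H T_C} > 0,$$
while the reverse flow would make $\Delta S_{\text{total}} < 0$ and is therefore forbidden.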
Physical interpretation
The second law is the thermodynamic expression of an “arrow of time”: macroscopic processes have a preferred direction because irreversibility produces entropy. Energy conservation alone (the first law) does not forbid processes that run “backwards,” but the second law does.
Operationally, the law is equivalent to the impossibility of certain cyclic devices, captured by the Kelvin–Planck statement and the Clausius statement. The mathematical backbone connecting these formulations is the Clausius inequality.
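A minimal sketch of one direction of that equivalence (standard reasoning, with $Q$ and $T_H$ introduced here for illustration): a hypothetical Kelvin–Planck violator would, in one cycle, withdraw heat $Q > 0$ from a single reservoir at $T_H$ and convert it entirely into work. Since the device returns to its initial state, its own entropy change is zero, and the total change would be
$$\Delta S_{\text{total}} = -\frac{Q}{T_H} < 0,$$
contradicting the entropy formulation above.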
Key relations and consequences
Entropy balance / entropy production. Any process can be written as
$$\Delta S = \int \frac{\delta Q}{T_b} + S_{\text{gen}}, \qquad S_{\text{gen}} \ge 0,$$
where $S_{\text{gen}}$ is the entropy generated by irreversibility (and $S_{\text{gen}} = 0$ characterizes reversibility).
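The sketch below (an illustrative Python calculation, not drawn from the article; the mass, specific heat, and temperatures are assumed values) evaluates this balance for two identical solid blocks equilibrating inside an adiabatic enclosure, where $\int \delta Q/T_b = 0$ and all of $\Delta S$ is generated entropy:

```python
import math

# Illustrative sketch (assumed values): entropy generated when two identical
# blocks (mass m, constant specific heat c) are placed in thermal contact
# inside an adiabatic enclosure and allowed to reach a common temperature.
m = 1.0                  # kg (assumed)
c = 385.0                # J/(kg*K), roughly copper (assumed)
T1, T2 = 400.0, 300.0    # initial temperatures in kelvin (assumed)

Tf = 0.5 * (T1 + T2)     # equal masses and heat capacities -> arithmetic mean

# Entropy change of each block: integral of m*c*dT/T from Ti to Tf.
dS_hot = m * c * math.log(Tf / T1)    # negative: the hot block cools
dS_cold = m * c * math.log(Tf / T2)   # positive: the cold block warms

S_gen = dS_hot + dS_cold  # no heat crosses the enclosure, so S_gen = total Delta S
print(f"Delta S_hot  = {dS_hot:+.2f} J/K")
print(f"Delta S_cold = {dS_cold:+.2f} J/K")
print(f"S_gen        = {S_gen:+.2f} J/K  (> 0, as the second law requires)")
```

Running it prints a positive $S_{\text{gen}}$ (about $+7.9\ \mathrm{J/K}$ for the assumed numbers): the hot block's entropy drop is more than compensated by the cold block's gain.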
Isolated and adiabatic implications. For an isolated system, $\Delta S \ge 0$. For a closed system insulated by an adiabatic wall (so $\delta Q = 0$), one likewise has $\Delta S \ge 0$.
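Note that “adiabatic” does not imply “isentropic”: the classic counterexample (added here for illustration) is the adiabatic free expansion of an ideal gas into vacuum. With $\delta Q = 0$ and no work done, $\Delta U = 0$ and the temperature is unchanged, yet
$$\Delta S = nR\,\ln\frac{V_2}{V_1} > 0 \qquad (V_2 > V_1),$$
so the inequality is strict because the process is irreversible.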
Free-energy monotonicity (common equilibrium criteria). For a closed system in contact with a thermal reservoir at fixed $T$, the Helmholtz free energy $A = U - TS$ (built from internal energy and entropy) satisfies $\Delta A \le 0$ for spontaneous changes at fixed $T$ and $V$. At fixed $T$ and $P$, the Gibbs free energy $G = H - TS$ (built from enthalpy and entropy) satisfies $\Delta G \le 0$.
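A two-line sketch of where the Helmholtz criterion comes from (assuming a closed system held at the reservoir temperature $T$, fixed volume, and no work other than pressure–volume work): the Clausius inequality gives $\delta Q \le T\,dS$, and the first law gives $dU = \delta Q$ at fixed $V$, hence
$$dA = d(U - TS) = dU - T\,dS \;\le\; 0 \qquad (\text{fixed } T,\ V);$$
the Gibbs criterion at fixed $T$ and $P$ follows the same way using $H = U + PV$.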
Statistical-mechanical viewpoint. Many microscopic derivations connect “entropy never decreases” to the nonnegativity of information measures such as relative entropy (KL divergence) (see also Gibbs' inequality) and to the identification of the thermodynamic entropy $S$ with an entropy-like quantity $-k_{\mathrm B}\sum_i p_i \ln p_i$ (compare Shannon entropy).
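The sketch below (illustrative Python, not from the article; the five-state sample space and the random distributions are arbitrary assumptions) checks the two ingredients just mentioned numerically: the nonnegativity of the relative entropy $D(p\|q)$ and the Gibbs–Shannon form $S/k_{\mathrm B} = -\sum_i p_i \ln p_i$:

```python
import numpy as np

# Illustrative sketch (assumed example distributions): relative entropy
# between discrete distributions is nonnegative (Gibbs' inequality), which
# is the information-theoretic fact many entropy-increase arguments use.
rng = np.random.default_rng(0)

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i * ln(p_i / q_i), in nats (assumes q_i > 0 where p_i > 0)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                      # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def gibbs_shannon_entropy(p):
    """S / k_B = -sum_i p_i * ln(p_i) (dimensionless; multiply by k_B for J/K)."""
    p = np.asarray(p, float)
    mask = p > 0
    return float(-np.sum(p[mask] * np.log(p[mask])))

for _ in range(3):
    p = rng.dirichlet(np.ones(5))     # random probability vectors on 5 states
    q = rng.dirichlet(np.ones(5))
    assert kl_divergence(p, q) >= 0   # Gibbs' inequality
    print(f"D(p||q) = {kl_divergence(p, q):.4f}   S(p)/k_B = {gibbs_shannon_entropy(p):.4f}")
```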