Entropy and multinomial coefficients
Multinomial coefficients have exponential growth rates governed by entropy. This connection underlies many results in information theory and large deviations, including the method of types.
Multinomial coefficient and empirical proportions
Let $n_1, \dots, n_k$ be nonnegative integers with $n_1 + \cdots + n_k = n$. The multinomial coefficient is
$$\binom{n}{n_1, \dots, n_k} = \frac{n!}{n_1!\, n_2! \cdots n_k!},$$
where $m!$ is the factorial of $m$.
Define proportions $p_i = n_i / n$ (so $p_1 + \cdots + p_k = 1$). The (natural-log) Shannon entropy is
$$H(p) = -\sum_{i=1}^{k} p_i \ln p_i,$$
with the convention $0 \ln 0 = 0$. (Using base-2 logarithms multiplies all entropy expressions by $\log_2 e = 1/\ln 2$.)
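These definitions are easy to check numerically. A minimal Python sketch (the helper names `multinomial` and `entropy_nats` are ours, chosen for illustration):

```python
import math

def multinomial(counts):
    """Multinomial coefficient n! / (n_1! * n_2! * ... * n_k!) as an exact integer."""
    result = math.factorial(sum(counts))
    for c in counts:
        result //= math.factorial(c)
    return result

def entropy_nats(props):
    """Shannon entropy -sum p_i ln p_i in nats, with the convention 0 ln 0 = 0."""
    return -sum(p * math.log(p) for p in props if p > 0)

counts = [3, 2, 5]
n = sum(counts)
props = [c / n for c in counts]  # empirical proportions p_i = n_i / n
print(multinomial(counts))       # 2520 = 10! / (3! 2! 5!)
print(entropy_nats(props))       # about 1.0297 nats
```

Exact integer arithmetic is used for the coefficient, so there is no rounding until the entropy computation.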
Main asymptotic principle
Using Stirling's approximation, one obtains
$$\binom{n}{n_1, \dots, n_k} = e^{\,n H(p) + O(\ln n)},$$
as long as the number of categories $k$ is fixed (and typically assuming each $p_i$ is bounded away from $0$ to avoid singular prefactors).
A more detailed approximation (when all $n_i \to \infty$) is
$$\binom{n}{n_1, \dots, n_k} \sim \frac{e^{n H(p)}}{(2\pi n)^{(k-1)/2} \sqrt{p_1 p_2 \cdots p_k}},$$
where “$\sim$” means the ratio of the two sides tends to $1$ under standard regularity conditions.
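This refinement can be tested numerically. Because the raw quantities overflow floating point quickly, the sketch below compares logarithms, using `math.lgamma` (so `lgamma(n + 1) = ln n!`):

```python
import math

def log_multinomial(counts):
    """ln of the multinomial coefficient, via lgamma(n + 1) = ln(n!) to avoid overflow."""
    return math.lgamma(sum(counts) + 1) - sum(math.lgamma(c + 1) for c in counts)

def log_sharp_approx(counts):
    """ln of e^{nH(p)} / ((2 pi n)^{(k-1)/2} sqrt(p_1 ... p_k)), for strictly positive counts."""
    n, k = sum(counts), len(counts)
    p = [c / n for c in counts]
    n_entropy = -sum(c * math.log(pi) for c, pi in zip(counts, p))  # n * H(p)
    return (n_entropy
            - 0.5 * (k - 1) * math.log(2 * math.pi * n)
            - 0.5 * sum(math.log(pi) for pi in p))

# The gap between the two logs shrinks as all counts grow with fixed proportions.
for scale in (10, 100, 1000):
    counts = [2 * scale, 3 * scale, 5 * scale]
    print(scale, log_multinomial(counts) - log_sharp_approx(counts))
```

The printed gaps decrease roughly like $1/n$, consistent with the next-order Stirling correction.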
Simple entropy bounds (polynomial accuracy)
There are widely used bounds that capture the correct exponential rate without tracking constants:
$$\frac{e^{n H(p)}}{(n+1)^{k-1}} \;\le\; \binom{n}{n_1, \dots, n_k} \;\le\; e^{n H(p)}.$$
These bounds are especially convenient when only exponential rates matter (e.g., in large deviation estimates).
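A sketch checking the standard method-of-types bounds $e^{n H(p)}/(n+1)^{k-1} \le \binom{n}{n_1,\dots,n_k} \le e^{n H(p)}$, again on the log scale:

```python
import math

def log_multinomial(counts):
    """ln of the multinomial coefficient via lgamma, robust for large counts."""
    return math.lgamma(sum(counts) + 1) - sum(math.lgamma(c + 1) for c in counts)

def bounds_hold(counts):
    """Check nH(p) - (k-1) ln(n+1) <= ln C(n; n_1..n_k) <= nH(p), with p_i = n_i / n."""
    n, k = sum(counts), len(counts)
    n_entropy = -sum(c * math.log(c / n) for c in counts if c > 0)  # n * H(p)
    lm = log_multinomial(counts)
    return n_entropy - (k - 1) * math.log(n + 1) <= lm <= n_entropy

print(bounds_hold([3, 2, 5]))    # True
print(bounds_hold([7, 0, 13]))   # True: zero counts are allowed
print(bounds_hold([40, 60]))     # True
```
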
Proof idea for the upper bound
Apply the multinomial theorem with weights $p_i = n_i / n$:
$$1 = (p_1 + \cdots + p_k)^n = \sum_{m_1 + \cdots + m_k = n} \binom{n}{m_1, \dots, m_k}\, p_1^{m_1} \cdots p_k^{m_k}.$$
Every term is nonnegative, so for the particular choice $(m_1, \dots, m_k) = (n_1, \dots, n_k)$,
$$\binom{n}{n_1, \dots, n_k}\, p_1^{n_1} \cdots p_k^{n_k} \le 1,$$
hence
$$\binom{n}{n_1, \dots, n_k} \le \prod_{i=1}^{k} p_i^{-n_i} = e^{-\sum_i n_i \ln p_i} = e^{n H(p)}.$$
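The key inequality in this argument, $\binom{n}{n_1,\dots,n_k}\, p_1^{n_1} \cdots p_k^{n_k} \le 1$ with $p_i = n_i / n$, can be spot-checked numerically (a sketch, computed in log space for stability):

```python
import math

def picked_term(counts):
    """The single multinomial-theorem term C(n; n_1..n_k) * prod p_i^{n_i}, with p_i = n_i / n."""
    n = sum(counts)
    log_coeff = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    log_weight = sum(c * math.log(c / n) for c in counts if c > 0)
    return math.exp(log_coeff + log_weight)

# Each value lies in (0, 1], as the proof requires.
for counts in ([3, 2, 5], [10, 10, 10, 10], [1, 99]):
    print(counts, picked_term(counts))
```
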
Why this matters
- The exponential size of a “type class” (all sequences of length $n$ with the same empirical proportions) is $e^{n H(p)}$ up to polynomial factors; this is the core counting step in the method of types.
- Many probability bounds reduce to comparing $\binom{n}{n_1, \dots, n_k}$ with likelihood terms $q_1^{n_1} \cdots q_k^{n_k}$, producing Kullback–Leibler divergence rates of the form $e^{-n D(p \| q)}$.
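The last point can be illustrated numerically (a sketch; the counts and the reference distribution `q` are made up): the normalized log of $\binom{n}{n_1,\dots,n_k}\, q_1^{n_1} \cdots q_k^{n_k}$ approaches $-D(p \| q)$, the negative Kullback–Leibler divergence in nats, as $n$ grows with fixed proportions.

```python
import math

def log_multinomial(counts):
    """ln of the multinomial coefficient via lgamma(n + 1) = ln(n!)."""
    return math.lgamma(sum(counts) + 1) - sum(math.lgamma(c + 1) for c in counts)

def kl_nats(p, q):
    """Kullback-Leibler divergence D(p || q) = sum p_i ln(p_i / q_i) in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

q = [0.2, 0.3, 0.5]                            # hypothetical reference distribution
for scale in (10, 100, 1000):
    counts = [5 * scale, 3 * scale, 2 * scale]  # empirical p = (0.5, 0.3, 0.2)
    n = sum(counts)
    p = [c / n for c in counts]
    rate = (log_multinomial(counts)
            + sum(c * math.log(qi) for c, qi in zip(counts, q))) / n
    print(scale, rate, -kl_nats(p, q))          # rate approaches -D(p || q)
```

The discrepancy shrinks like $O((\log n)/n)$, matching the polynomial-accuracy bounds above.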