Index: The Book of Statistical Proofs ▷ Probability Distributions ▷ Univariate discrete distributions ▷ Discrete uniform distribution ▷ Shannon entropy

Theorem: Let $X$ be a random variable following a discrete uniform distribution:

\[\label{eq:duni} X \sim \mathcal{U}(a,b) \; .\]

Then, the (Shannon) entropy of $X$ in nats is

\[\label{eq:duni-ent} \mathrm{H}(X) = \ln(b-a+1) \; .\]
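For instance, a fair six-sided die can be described by $X \sim \mathcal{U}(1,6)$, so its entropy is

\[\mathrm{H}(X) = \ln(6-1+1) = \ln 6 \approx 1.79 \; \text{nats} \; .\]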

Proof: The entropy is defined as the negative probability-weighted average of the log-probabilities of all possible values:

\[\label{eq:ent} \mathrm{H}(X) = - \sum_{x \in \mathcal{X}} p(x) \cdot \log p(x) \; ,\]

where the base of the logarithm determines the unit of measurement; entropy is measured in nats when the natural logarithm (base $e$) is used.
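This defining sum can be evaluated directly for any probability mass function; below is a minimal Python sketch (the function name `shannon_entropy` is illustrative and not part of the source):

```python
import math

def shannon_entropy(pmf):
    """Shannon entropy in nats of a probability mass function,
    given as a sequence of probabilities summing to one."""
    # Terms with p(x) = 0 contribute nothing to the sum (0 * ln 0 is taken as 0).
    return -sum(p * math.log(p) for p in pmf if p > 0)
```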

With the probability mass function of the discrete uniform distribution, $p(x) = \frac{1}{b-a+1}$ for $x \in \{a, a+1, \ldots, b\}$, we then have:

\[\label{eq:duni-ent-qed} \begin{split} \mathrm{H}(X) &= - \sum_{x \in \mathcal{X}} p(x) \cdot \ln p(x) \\ &= - \sum_{x=a}^{b} p(x) \cdot \ln p(x) \\ &= - \sum_{x=a}^{b} \frac{1}{b-a+1} \cdot \ln{\frac{1}{b-a+1}} \\ &= - (b-a+1) \cdot \frac{1}{b-a+1} \cdot \ln{\frac{1}{b-a+1}} \\ &= - \ln{\frac{1}{b-a+1}} \\ &= \ln(b-a+1) \; . \end{split}\]
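As a quick numerical check of this closed form, assuming the illustrative `shannon_entropy` sketch above:

```python
a, b = 2, 7                     # example bounds of the discrete uniform distribution
n = b - a + 1                   # number of possible values
pmf = [1 / n] * n               # p(x) = 1/(b-a+1) for x = a, ..., b
print(shannon_entropy(pmf))     # ≈ 1.7918 nats
print(math.log(b - a + 1))      # ln(6) ≈ 1.7918, agreeing with the theorem
```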
Sources:

Metadata: ID: P410 | shortcut: duni-ent | author: JoramSoch | date: 2023-08-11, 13:13.