Proof: Entropy of the Bernoulli distribution
Metadata: ID: P334 | shortcut: bern-ent | author: JoramSoch | date: 2022-09-02, 12:21.
Theorem: Let $X$ be a random variable following a Bernoulli distribution:
\[\label{eq:bern} X \sim \mathrm{Bern}(p) \; .\]

Then, the (Shannon) entropy of $X$ in bits is
\[\label{eq:bern-ent} \mathrm{H}(X) = -p \log_2 p - (1-p) \log_2 (1-p) \; .\]

Proof: The entropy is defined as the probability-weighted average of the logarithmized probabilities of all possible values:
\[\label{eq:ent} \mathrm{H}(X) = - \sum_{x \in \mathcal{X}} p(x) \cdot \log_b p(x) \; .\]

Entropy is measured in bits by setting $b = 2$. A Bernoulli random variable has only two possible outcomes, $x \in \{0, 1\}$, with $\mathrm{Pr}(X = 1) = p$ and $\mathrm{Pr}(X = 0) = 1 - p$, so we have:
\[\label{eq:bern-ent-qed} \begin{split} \mathrm{H}(X) &= - \mathrm{Pr}(X = 1) \cdot \log_2 \mathrm{Pr}(X = 1) - \mathrm{Pr}(X = 0) \cdot \log_2 \mathrm{Pr}(X = 0) \\ &= -p \log_2 p - (1-p) \log_2 (1-p) \; . \end{split}\]

∎
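As a quick sanity check, evaluating \eqref{eq:bern-ent} at $p = \tfrac{1}{2}$ and at the boundary values recovers the familiar values of the binary entropy function, using the standard convention $0 \cdot \log_2 0 := 0$ (justified by $\lim_{q \to 0^+} q \log_2 q = 0$):

\[\mathrm{H}(X)\big|_{p = 1/2} = -\tfrac{1}{2} \log_2 \tfrac{1}{2} - \tfrac{1}{2} \log_2 \tfrac{1}{2} = \tfrac{1}{2} + \tfrac{1}{2} = 1 \; \text{bit} \quad \text{and} \quad \mathrm{H}(X)\big|_{p \in \{0,1\}} = 0 \; ,\]

so a fair coin flip carries exactly one bit of uncertainty, while a deterministic outcome carries none.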
Sources:
- Wikipedia (2022): "Bernoulli distribution"; in: Wikipedia, the free encyclopedia, retrieved on 2022-09-02; URL: https://en.wikipedia.org/wiki/Bernoulli_distribution.
- Wikipedia (2022): "Binary entropy function"; in: Wikipedia, the free encyclopedia, retrieved on 2022-09-02; URL: https://en.wikipedia.org/wiki/Binary_entropy_function.