Definition: Shannon entropy
Definition: Let $X$ be a discrete random variable with possible outcomes $\mathcal{X}$ and the (observed or assumed) probability mass function $p(x) = f_X(x)$. Then, the entropy (also referred to as “Shannon entropy”) of $X$ is defined as
\[\label{eq:ent} \mathrm{H}(X) = - \sum_{x \in \mathcal{X}} p(x) \cdot \log_b p(x)\]

where $b$ is the base of the logarithm which specifies the unit in which the entropy is measured (e.g. bits for $b = 2$ and nats for $b = \mathrm{e}$). By convention, $0 \cdot \log 0$ is taken to be zero when calculating the entropy of $X$.
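For illustration only (not part of the source definition), a minimal Python sketch of the formula, including the $0 \cdot \log 0 = 0$ convention; the function name `shannon_entropy` and its interface are hypothetical:

```python
import math

def shannon_entropy(pmf, base=2):
    """Shannon entropy H(X) = -sum over x of p(x) * log_b p(x).

    `pmf` is an iterable of probabilities p(x) over the outcomes in X;
    terms with p(x) = 0 contribute 0, per the 0*log(0) = 0 convention.
    """
    return -sum(p * math.log(p, base) for p in pmf if p > 0)

# A fair coin attains the maximum of log2(2) = 1 bit:
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # approx. 0.4690
print(shannon_entropy([1.0, 0.0]))   # 0.0 (the 0*log(0) = 0 convention)
```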
Sources:
- Shannon CE (1948): "A Mathematical Theory of Communication"; in: Bell System Technical Journal, vol. 27, iss. 3, pp. 379-423; URL: https://ieeexplore.ieee.org/document/6773024; DOI: 10.1002/j.1538-7305.1948.tb01338.x.
Metadata: ID: D15 | shortcut: ent | author: JoramSoch | date: 2020-02-19, 17:36.