
Definition: Let $X$ be a discrete random variable with possible outcomes $\mathcal{X}$ and the (observed or assumed) probability mass function $p(x) = f_X(x)$. Then, the entropy (also referred to as “Shannon entropy”) of $X$ is defined as

\[\label{eq:ent} \mathrm{H}(X) = - \sum_{x \in \mathcal{X}} p(x) \cdot \log_b p(x)\]

where $b$ is the base of the logarithm, which specifies the unit in which the entropy is measured (e.g. bits for $b = 2$, nats for $b = e$). By convention, $0 \cdot \log 0$ is taken to be zero when calculating the entropy of $X$.
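
For example, for a fair coin with $\mathcal{X} = \{ 0, 1 \}$ and $p(0) = p(1) = 1/2$, the entropy in bits ($b = 2$) is

\[\mathrm{H}(X) = - \left( \frac{1}{2} \log_2 \frac{1}{2} + \frac{1}{2} \log_2 \frac{1}{2} \right) = 1 \; \mathrm{bit} \; ,\]

whereas for a degenerate random variable with $p(1) = 1$ and $p(0) = 0$, the convention $0 \cdot \log 0 = 0$ yields $\mathrm{H}(X) = 0$.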


Metadata: ID: D15 | shortcut: ent | author: JoramSoch | date: 2020-02-19, 17:36.