Definition: Conditional entropy
Definition: Let $X$ and $Y$ be discrete random variables with possible outcomes $\mathcal{X}$ and $\mathcal{Y}$ and probability mass functions $p(x)$ and $p(y)$. Then, the conditional entropy of $Y$ given $X$ or, equivalently, the entropy of $Y$ conditioned on $X$ is defined as

\[\label{eq:ent-cond} \mathrm{H}(Y|X) = \sum_{x \in \mathcal{X}} p(x) \cdot \mathrm{H}(Y|X=x)\]

where $\mathrm{H}(Y \vert X=x)$ is the (marginal) entropy of $Y$, evaluated at the conditional probability distribution $p(y \vert x)$, i.e.

\[\mathrm{H}(Y|X=x) = - \sum_{y \in \mathcal{Y}} p(y|x) \cdot \log p(y|x) \; .\]
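For concreteness, the definition above can be evaluated numerically. The following is a minimal Python sketch (not part of the original definition) that computes $\mathrm{H}(Y|X)$ in nats from a joint probability mass function supplied as a 2-D array; the function name `conditional_entropy` and the example pmf are illustrative assumptions.

```python
import numpy as np

def conditional_entropy(p_xy):
    """Conditional entropy H(Y|X) in nats from a joint pmf.

    p_xy: 2-D array with p_xy[i, j] = P(X = x_i, Y = y_j).
    """
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1)                    # marginal pmf p(x)
    H = 0.0
    for i, px in enumerate(p_x):
        if px == 0:                           # outcomes with p(x) = 0 contribute nothing
            continue
        p_y_given_x = p_xy[i] / px            # conditional pmf p(y|x_i)
        nz = p_y_given_x > 0                  # convention: 0 * log 0 = 0
        H_y_given_x = -np.sum(p_y_given_x[nz] * np.log(p_y_given_x[nz]))
        H += px * H_y_given_x                 # H(Y|X) = sum_x p(x) * H(Y|X=x)
    return H

# Example (hypothetical): X ~ Bernoulli(1/2); Y equals X with probability 3/4.
p_xy = np.array([[3/8, 1/8],
                 [1/8, 3/8]])
print(conditional_entropy(p_xy))              # ≈ 0.5623 nats
```

In this example, $p(y \vert x)$ is $(3/4, 1/4)$ for each $x$, so $\mathrm{H}(Y|X=x) \approx 0.5623$ nats for both outcomes, and the $p(x)$-weighted sum gives the same value for $\mathrm{H}(Y|X)$.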
Sources:
- Cover TM, Thomas JA (1991): "Joint Entropy and Conditional Entropy"; in: Elements of Information Theory, ch. 2.2, p. 15; URL: https://www.wiley.com/en-us/Elements+of+Information+Theory%2C+2nd+Edition-p-9780471241959.
Metadata: ID: D17 | shortcut: ent-cond | author: JoramSoch | date: 2020-02-19, 18:08.