
Definition: Let $X$ and $Y$ be continuous random variables with possible outcomes $\mathcal{X}$ and $\mathcal{Y}$ and probability density functions $p(x)$ and $p(y)$. Then, the conditional differential entropy of $Y$ given $X$, also called the differential entropy of $Y$ conditioned on $X$, is defined as

\[\label{eq:dent-cond} \mathrm{h}(Y|X) = \int_{x \in \mathcal{X}} p(x) \cdot \mathrm{h}(Y|X=x) \, \mathrm{d}x\]

where $\mathrm{h}(Y \vert X=x)$ is the differential entropy of $Y$ under the conditional density $p(y|x)$, i.e. with $X$ fixed at the value $x$:

\[\mathrm{h}(Y|X=x) = - \int_{y \in \mathcal{Y}} p(y|x) \, \log p(y|x) \, \mathrm{d}y \; .\]
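As a concrete example, let $X$ and $Y$ be jointly bivariate normal with variances $\sigma_X^2$ and $\sigma_Y^2$ and correlation $\rho$. The conditional distribution of $Y$ given $X = x$ is then normal with variance $\sigma_Y^2 (1 - \rho^2)$ for every $x$, so the outer integral in the definition becomes trivial and

\[\mathrm{h}(Y|X) = \frac{1}{2} \ln\!\left[ 2 \pi e \, \sigma_Y^2 (1 - \rho^2) \right] \; ,\]

independent of the particular value of $x$.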
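A minimal numerical sketch of this example, assuming NumPy and arbitrary illustrative parameter values (these numbers are not from the source): it compares the chain-rule identity $\mathrm{h}(Y|X) = \mathrm{h}(X,Y) - \mathrm{h}(X)$ with the closed-form expression above.

```python
import numpy as np

# Hypothetical parameters of a bivariate Gaussian (X, Y), chosen for illustration
sigma_x, sigma_y, rho = 1.5, 0.8, 0.6

# Covariance matrix of (X, Y)
Sigma = np.array([[sigma_x**2,              rho * sigma_x * sigma_y],
                  [rho * sigma_x * sigma_y, sigma_y**2]])

def gauss_entropy(cov):
    """Differential entropy of a d-dim. Gaussian: 1/2 * ln[(2*pi*e)^d * det(cov)]."""
    cov = np.atleast_2d(cov)
    d = cov.shape[0]
    return 0.5 * np.log((2 * np.pi * np.e) ** d * np.linalg.det(cov))

# Chain rule for differential entropy: h(Y|X) = h(X,Y) - h(X)
h_chain = gauss_entropy(Sigma) - gauss_entropy(sigma_x**2)

# Closed form from the bivariate normal example: 1/2 * ln[2*pi*e * sigma_y^2 * (1 - rho^2)]
h_direct = 0.5 * np.log(2 * np.pi * np.e * sigma_y**2 * (1 - rho**2))

print(h_chain, h_direct)  # both ≈ 0.9727 nats
```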

 

Metadata: ID: D34 | shortcut: dent-cond | author: JoramSoch | date: 2020-03-21, 12:27.