Proof: Relation of continuous Kullback-Leibler divergence to differential entropy
Theorem: Let $X$ be a continuous random variable with possible outcomes $\mathcal{X}$ and let $P$ and $Q$ be two probability distributions on $X$. Then, the Kullback-Leibler divergence of $P$ from $Q$ can be expressed as
\[\label{eq:kl-dent} \mathrm{KL}[P||Q] = \mathrm{h}(P,Q) - \mathrm{h}(P)\]where $\mathrm{h}(P,Q)$ is the differential cross-entropy of $P$ and $Q$ and $\mathrm{h}(P)$ is the marginal differential entropy of $P$.
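As an illustration (not part of the original proof), consider two univariate normal distributions $P = \mathcal{N}(\mu_1, \sigma_1^2)$ and $Q = \mathcal{N}(\mu_2, \sigma_2^2)$; using the natural logarithm, the standard closed-form expressions give
\[\begin{split} \mathrm{h}(P) &= \frac{1}{2} \ln\left(2 \pi e \sigma_1^2\right) \\ \mathrm{h}(P,Q) &= \frac{1}{2} \ln\left(2 \pi \sigma_2^2\right) + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2 \sigma_2^2} \\ \mathrm{h}(P,Q) - \mathrm{h}(P) &= \ln \frac{\sigma_2}{\sigma_1} + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2 \sigma_2^2} - \frac{1}{2} \; , \end{split}\]which is exactly the known Kullback-Leibler divergence between two univariate normal distributions, in agreement with the theorem.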
Proof: The continuous Kullback-Leibler divergence is defined as
\[\label{eq:KL} \mathrm{KL}[P||Q] = \int_{\mathcal{X}} p(x) \cdot \log \frac{p(x)}{q(x)} \, \mathrm{d}x\]where $p(x)$ and $q(x)$ are the probability density functions of $P$ and $Q$.
Applying the logarithm quotient rule $\log \frac{p(x)}{q(x)} = \log p(x) - \log q(x)$ and splitting the integral, we have:
\[\label{eq:KL-dev} \mathrm{KL}[P||Q] = - \int_{\mathcal{X}} p(x) \, \log q(x) \, \mathrm{d}x + \int_{\mathcal{X}} p(x) \, \log p(x) \, \mathrm{d}x \; .\]Now considering the definitions of marginal differential entropy and differential cross-entropy
\[\label{eq:MDE-DCE} \begin{split} \mathrm{h}(P) &= - \int_{\mathcal{X}} p(x) \, \log p(x) \, \mathrm{d}x \\ \mathrm{h}(P,Q) &= - \int_{\mathcal{X}} p(x) \, \log q(x) \, \mathrm{d}x \; , \end{split}\]we can finally show:
\[\label{eq:KL-qed} \mathrm{KL}[P||Q] = \mathrm{h}(P,Q) - \mathrm{h}(P) \; .\]∎
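As a numerical sanity check (an illustrative sketch, not part of the original proof; it assumes SciPy and the natural logarithm), the identity can be verified for two example normal distributions by numerical integration:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Two example distributions: P = N(0, 1) and Q = N(1, 2^2)
p = norm(loc=0.0, scale=1.0)
q = norm(loc=1.0, scale=2.0)

# Finite integration limits that cover essentially all probability mass
a, b = -20.0, 20.0

# Kullback-Leibler divergence: KL[P||Q] = int p(x) * log(p(x)/q(x)) dx
kl, _ = quad(lambda x: p.pdf(x) * np.log(p.pdf(x) / q.pdf(x)), a, b)

# Differential entropy: h(P) = -int p(x) * log p(x) dx
h_p, _ = quad(lambda x: -p.pdf(x) * np.log(p.pdf(x)), a, b)

# Differential cross-entropy: h(P,Q) = -int p(x) * log q(x) dx
h_pq, _ = quad(lambda x: -p.pdf(x) * np.log(q.pdf(x)), a, b)

# Both quantities should agree up to numerical integration error
print(kl, h_pq - h_p)
```

With the values above, both expressions evaluate to approximately $0.443$, the Kullback-Leibler divergence of $\mathcal{N}(0,1)$ from $\mathcal{N}(1,4)$.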
Sources: - Wikipedia (2020): "Kullback-Leibler divergence"; in: Wikipedia, the free encyclopedia, retrieved on 2020-05-27; URL: https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence#Motivation.
Metadata: ID: P114 | shortcut: kl-dent | author: JoramSoch | date: 2020-05-27, 23:32.