Definition: Kullback-Leibler divergence
Definition: Let $X$ be a random variable with the set of possible outcomes $\mathcal{X}$ and let $P$ and $Q$ be two probability distributions over $\mathcal{X}$.
1) The Kullback-Leibler divergence of $P$ from $Q$ for a discrete random variable $X$ is defined as
\[\label{eq:KL-disc} \mathrm{KL}[P||Q] = \sum_{x \in \mathcal{X}} p(x) \cdot \log \frac{p(x)}{q(x)}\]where $p(x)$ and $q(x)$ are the probability mass functions of $P$ and $Q$.
2) The Kullback-Leibler divergence of $P$ from $Q$ for a continuous random variable $X$ is defined as
\[\label{eq:KL-cont} \mathrm{KL}[P||Q] = \int_{\mathcal{X}} p(x) \cdot \log \frac{p(x)}{q(x)} \, \mathrm{d}x\]where $p(x)$ and $q(x)$ are the probability density functions of $P$ and $Q$.
By convention, $0 \cdot \log 0$ is taken to be zero when calculating the divergence between $P$ and $Q$.
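To make the two formulas concrete, the following minimal sketch (not part of the original definition) evaluates the discrete sum directly and approximates the continuous integral numerically. The function name `kl_divergence`, the example probability mass functions, and the choice of two normal densities are illustrative assumptions only; NumPy and SciPy are assumed to be available, and the natural logarithm is used throughout.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def kl_divergence(p, q):
    """Discrete KL divergence KL[P||Q] for PMFs p and q on the same outcome set.

    Terms with p(x) = 0 contribute zero, following the convention 0 * log 0 = 0.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0                                   # apply the 0 * log 0 = 0 convention
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Discrete case: two PMFs on a three-element outcome set
p = [0.5, 0.5, 0.0]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))                         # ~ 0.2231 (in nats)

# Continuous case: numerically integrate p(x) * log(p(x)/q(x)) for two normal densities;
# finite limits are chosen where both densities are numerically nonzero
p_pdf = lambda x: norm.pdf(x, loc=0.0, scale=1.0)
q_pdf = lambda x: norm.pdf(x, loc=1.0, scale=1.0)
kl_cont, _ = quad(lambda x: p_pdf(x) * np.log(p_pdf(x) / q_pdf(x)), -20.0, 20.0)
print(kl_cont)                                     # ~ 0.5 for N(0,1) versus N(1,1)
```

Note that reversing the arguments in the discrete example, i.e. evaluating $\mathrm{KL}[Q||P]$, would yield an infinite divergence, because $Q$ places positive mass on an outcome that has zero probability under $P$.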
Sources:
- MacKay, David J.C. (2003): "Probability, Entropy, and Inference"; in: Information Theory, Inference, and Learning Algorithms, ch. 2.6, eq. 2.45, p. 34; URL: https://www.inference.org.uk/itprnn/book.pdf.
Metadata: ID: D52 | shortcut: kl | author: JoramSoch | date: 2020-05-10, 20:20.