Definition: Differential cross-entropy
Index: The Book of Statistical Proofs ▷ General Theorems ▷ Information theory ▷ Differential entropy ▷ Differential cross-entropy
Metadata: ID: D86 | shortcut: dent-cross | author: JoramSoch | date: 2020-07-28, 03:03.
Definition: Let $X$ be a continuous random variable with possible outcomes $\mathcal{X}$ and let $P$ and $Q$ be two probability distributions on $X$ with the probability density functions $p(x)$ and $q(x)$. Then, the differential cross-entropy of $Q$ relative to $P$ is defined as
\[\label{eq:dent-cross} \mathrm{h}(P,Q) = - \int_{\mathcal{X}} p(x) \log_b q(x) \, \mathrm{d}x\]

where $b$ is the base of the logarithm specifying in which unit the differential cross-entropy is determined.
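To make the definition concrete, here is a minimal numerical sketch (not part of the source): it approximates $\mathrm{h}(P,Q)$ by quadrature for two univariate Gaussians $P = \mathcal{N}(0,1)$ and $Q = \mathcal{N}(1,4)$ with $b = e$ (nats), and checks the result against the known closed form for Gaussian cross-entropy, $\mathrm{h}(P,Q) = \tfrac{1}{2}\ln(2\pi\sigma_Q^2) + \frac{\sigma_P^2 + (\mu_P-\mu_Q)^2}{2\sigma_Q^2}$.

```python
# Numerical illustration of h(P,Q) = -∫ p(x) log q(x) dx for two Gaussians.
# Example distributions chosen for illustration only: P = N(0,1), Q = N(1,4).
import math

mu_p, s_p = 0.0, 1.0   # P = N(0, 1)
mu_q, s_q = 1.0, 2.0   # Q = N(1, 4), i.e. standard deviation 2

def pdf(x, mu, s):
    """Gaussian probability density function."""
    return math.exp(-0.5 * ((x - mu) / s) ** 2) / (s * math.sqrt(2 * math.pi))

# Riemann-sum approximation of -∫ p(x) ln q(x) dx over a wide grid
dx = 1e-3
xs = [-20 + i * dx for i in range(int(40 / dx))]
h_num = -sum(pdf(x, mu_p, s_p) * math.log(pdf(x, mu_q, s_q)) for x in xs) * dx

# Closed form for the cross-entropy of two Gaussians (in nats, b = e)
h_closed = 0.5 * math.log(2 * math.pi * s_q**2) \
    + (s_p**2 + (mu_p - mu_q)**2) / (2 * s_q**2)

print(h_num, h_closed)  # both ≈ 1.862
```

Note that, unlike its discrete counterpart, the differential cross-entropy can be negative, since density values $q(x)$ may exceed 1.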
Sources:
- Wikipedia (2020): "Cross entropy"; in: Wikipedia, the free encyclopedia, retrieved on 2020-07-28; URL: https://en.wikipedia.org/wiki/Cross_entropy#Definition.