Differential cross-entropy

Definition: Let $X$ be a continuous random variable with possible outcomes $\mathcal{X}$, and let $P$ and $Q$ be two probability distributions for $X$ with probability density functions $p(x)$ and $q(x)$. Then, the differential cross-entropy of $Q$ relative to $P$ is defined as

\[\label{eq:dent-cross} \mathrm{h}(P,Q) = - \int_{\mathcal{X}} p(x) \log_b q(x) \, \mathrm{d}x\]

where $b$ is the base of the logarithm, which specifies the unit in which the differential cross-entropy is measured (e.g. nats for $b = e$ and bits for $b = 2$).
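
For example, if $P = \mathcal{N}(\mu_1, \sigma_1^2)$ and $Q = \mathcal{N}(\mu_2, \sigma_2^2)$ are univariate normal distributions and $b = e$, the integral can be evaluated in closed form; as a brief sketch, writing out $\ln q(x) = -\tfrac{1}{2} \ln\!\left(2 \pi \sigma_2^2\right) - (x-\mu_2)^2/(2 \sigma_2^2)$ and using $\mathrm{E}_P\!\left[(X-\mu_2)^2\right] = \sigma_1^2 + (\mu_1 - \mu_2)^2$, one obtains

\[\mathrm{h}(P,Q) = - \int_{-\infty}^{+\infty} \mathcal{N}(x; \mu_1, \sigma_1^2) \, \ln \mathcal{N}(x; \mu_2, \sigma_2^2) \, \mathrm{d}x = \frac{1}{2} \ln\!\left(2 \pi \sigma_2^2\right) + \frac{\sigma_1^2 + (\mu_1 - \mu_2)^2}{2 \sigma_2^2} \; .\]

As a sanity check, setting $Q = P$ recovers the differential entropy of the normal distribution, $\mathrm{h}(P) = \tfrac{1}{2} \ln\!\left(2 \pi e \sigma_1^2\right)$.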

Sources:

Metadata: ID: D86 | shortcut: dent-cross | author: JoramSoch | date: 2020-07-28, 03:03.