Index: The Book of Statistical Proofs ▷ General Theorems ▷ Information theory ▷ Shannon entropy ▷ Cross-entropy

Definition: Let $X$ be a discrete random variable with possible outcomes $\mathcal{X}$ and let $P$ and $Q$ be two probability distributions on $X$ with probability mass functions $p(x)$ and $q(x)$. Then, the cross-entropy of $Q$ relative to $P$ is defined as

\[\label{eq:ent-cross} \mathrm{H}(P,Q) = - \sum_{x \in \mathcal{X}} p(x) \cdot \log_b q(x)\]

where $b$ is the base of the logarithm, which specifies the unit in which the cross-entropy is measured (e.g. bits for $b = 2$, nats for $b = e$).
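As a quick numerical illustration (the distributions below are not from the source and are chosen only for concreteness), take $\mathcal{X} = \left\{ 0, 1 \right\}$ with $p = \left( \tfrac{1}{2}, \tfrac{1}{2} \right)$, $q = \left( \tfrac{1}{4}, \tfrac{3}{4} \right)$ and $b = 2$. Then

\[\mathrm{H}(P,Q) = - \left[ \tfrac{1}{2} \log_2 \tfrac{1}{4} + \tfrac{1}{2} \log_2 \tfrac{3}{4} \right] = 1 + \tfrac{1}{2} \log_2 \tfrac{4}{3} \approx 1.21 \; \text{bits},\]

which exceeds the Shannon entropy $\mathrm{H}(P) = 1$ bit, as expected from the relation $\mathrm{H}(P,Q) = \mathrm{H}(P) + \mathrm{KL}[P \, || \, Q]$.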

Sources:

Metadata: ID: D85 | shortcut: ent-cross | author: JoramSoch | date: 2020-07-28, 02:51.