Definition: Cross-entropy
Definition: Let $X$ be a discrete random variable with possible outcomes $\mathcal{X}$ and let $P$ and $Q$ be two probability distributions over $\mathcal{X}$ with probability mass functions $p(x)$ and $q(x)$. Then, the cross-entropy of $Q$ relative to $P$ is defined as
\[\label{eq:ent-cross} \mathrm{H}(P,Q) = - \sum_{x \in \mathcal{X}} p(x) \cdot \log_b q(x)\]

where $b$ is the base of the logarithm specifying in which unit the cross-entropy is determined.
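To illustrate the definition, here is a minimal numerical sketch in Python (not part of the original entry); the function name `cross_entropy` and the two example distributions are illustrative assumptions, and terms with $p(x) = 0$ are taken to contribute zero, following the usual convention $0 \cdot \log 0 = 0$.

```python
import numpy as np

def cross_entropy(p, q, base=2):
    """Cross-entropy H(P, Q) of Q relative to P, for discrete
    distributions given as probability vectors over the same outcomes."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # H(P,Q) = - sum_x p(x) * log_b q(x); skip outcomes with p(x) = 0,
    # which contribute zero to the sum.
    mask = p > 0
    return -np.sum(p[mask] * (np.log(q[mask]) / np.log(base)))

# Example: P is a fair coin, Q is a biased coin.
p = [0.5, 0.5]
q = [0.9, 0.1]
print(cross_entropy(p, q, base=2))  # ~1.737 bits
```

With base $b = 2$ the result is in bits; here $\mathrm{H}(P,Q) = -0.5 \log_2 0.9 - 0.5 \log_2 0.1 \approx 1.737$ bits.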
Sources:
- Wikipedia (2020): "Cross entropy"; in: Wikipedia, the free encyclopedia, retrieved on 2020-07-28; URL: https://en.wikipedia.org/wiki/Cross_entropy#Definition.

Metadata: ID: D85 | shortcut: ent-cross | author: JoramSoch | date: 2020-07-28, 02:51.