
Definition: Let $X$ and $Y$ be discrete random variables with possible outcomes $\mathcal{X}$ and $\mathcal{Y}$ and joint probability mass function $p(x,y)$. Then, the joint entropy of $X$ and $Y$ is defined as

\[\label{eq:ent-joint} \mathrm{H}(X,Y) = - \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x,y) \cdot \log_b p(x,y)\]

where $b$ is the base of the logarithm, which specifies the unit in which the entropy is measured (e.g. bits for $b = 2$, nats for $b = \mathrm{e}$).
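
As a minimal numerical illustration of this definition (not part of the original entry; the function name `joint_entropy` is hypothetical), the following Python sketch evaluates $\mathrm{H}(X,Y)$ from a joint probability mass function given as a 2-D array:

```python
import numpy as np

def joint_entropy(p_xy, base=2):
    """Joint entropy H(X,Y) of a joint pmf given as a 2-D array.

    Hypothetical helper for illustration only. Terms with p(x,y) = 0
    are dropped, following the convention 0 * log(0) = 0.
    """
    p = np.asarray(p_xy, dtype=float)
    p = p[p > 0]                          # keep only outcomes with positive probability
    return -np.sum(p * np.log(p)) / np.log(base)

# Example: X and Y are two independent fair binary variables,
# so each of the four outcomes has probability 1/4.
p_xy = [[0.25, 0.25],
        [0.25, 0.25]]
print(joint_entropy(p_xy))               # 2.0 (bits, since base = 2)
```

Here the result of 2 bits matches the definition directly: $-\sum_{x,y} \tfrac{1}{4} \log_2 \tfrac{1}{4} = 2$.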

 
Sources:

Metadata: ID: D18 | shortcut: ent-joint | author: JoramSoch | date: 2020-02-19, 18:18.