Index: The Book of Statistical Proofs ▷ General Theorems ▷ Information theory ▷ Continuous mutual information ▷ Relation to joint and conditional differential entropy

Theorem: Let $X$ and $Y$ be continuous random variables with the joint probability density function $p(x,y)$ for $x \in \mathcal{X}$ and $y \in \mathcal{Y}$. Then, the mutual information of $X$ and $Y$ can be expressed as

$\label{eq:dmi-jce} \mathrm{I}(X,Y) = \mathrm{h}(X,Y) - \mathrm{h}(X|Y) - \mathrm{h}(Y|X)$

where $\mathrm{h}(X,Y)$ is the joint differential entropy of $X$ and $Y$ and $\mathrm{h}(X \vert Y)$ and $\mathrm{h}(Y \vert X)$ are the conditional differential entropies.
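
For illustration (an example assumed here, not part of the original theorem), let $X$ and $Y$ be jointly normal with unit variances and correlation $\rho$. Their differential entropies are

$\mathrm{h}(X,Y) = \frac{1}{2} \log\left[ (2 \pi e)^2 (1-\rho^2) \right] \quad \text{and} \quad \mathrm{h}(X|Y) = \mathrm{h}(Y|X) = \frac{1}{2} \log\left[ 2 \pi e (1-\rho^2) \right] \; ,$

such that

$\mathrm{h}(X,Y) - \mathrm{h}(X|Y) - \mathrm{h}(Y|X) = -\frac{1}{2} \log\left( 1-\rho^2 \right) \; ,$

which is the well-known mutual information of the bivariate normal distribution.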

Proof: The existence of the joint probability density function ensures that the mutual information of $X$ and $Y$ is defined:

$\label{eq:MI} \mathrm{I}(X,Y) = \int_{\mathcal{X}} \int_{\mathcal{Y}} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} \, \mathrm{d}y \, \mathrm{d}x \; .$

The relation of continuous mutual information to marginal and conditional differential entropy states that

$\label{eq:cmi-mcde1} \mathrm{I}(X,Y) = \mathrm{h}(X) - \mathrm{h}(X|Y)$

$\label{eq:cmi-mcde2} \mathrm{I}(X,Y) = \mathrm{h}(Y) - \mathrm{h}(Y|X)$

and the relation of continuous mutual information to marginal and joint differential entropy states that

$\label{eq:cmi-mjde} \mathrm{I}(X,Y) = \mathrm{h}(X) + \mathrm{h}(Y) - \mathrm{h}(X,Y) \; .$

It is true that

$\label{eq:MI-s1} \mathrm{I}(X,Y) = \mathrm{I}(X,Y) + \mathrm{I}(X,Y) - \mathrm{I}(X,Y) \; .$

Plugging in \eqref{eq:cmi-mcde1}, \eqref{eq:cmi-mcde2} and \eqref{eq:cmi-mjde} on the right-hand side, we have

$\label{eq:MI-s2} \begin{split} \mathrm{I}(X,Y) &= \mathrm{h}(X) - \mathrm{h}(X|Y) + \mathrm{h}(Y) - \mathrm{h}(Y|X) - \mathrm{h}(X) - \mathrm{h}(Y) + \mathrm{h}(X,Y) \\ &= \mathrm{h}(X,Y) - \mathrm{h}(X|Y) - \mathrm{h}(Y|X) \end{split}$

which proves the identity given above.
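
As a numerical sanity check (not part of the original proof), the identity can also be confirmed by Monte Carlo simulation for an assumed bivariate normal distribution; the correlation $\rho = 0.6$ and the sample size below are arbitrary choices:

```python
# Hypothetical sanity check (not part of the original proof): Monte Carlo
# estimate of the differential entropies of an assumed bivariate normal,
# used to confirm I(X,Y) = h(X,Y) - h(X|Y) - h(Y|X) numerically.
import numpy as np
from scipy.stats import multivariate_normal, norm

rng = np.random.default_rng(0)
rho = 0.6                                   # assumed correlation coefficient
cov = np.array([[1.0, rho], [rho, 1.0]])    # unit-variance covariance matrix
xy = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
x, y = xy[:, 0], xy[:, 1]

log_pxy = multivariate_normal([0.0, 0.0], cov).logpdf(xy)
log_px, log_py = norm(0, 1).logpdf(x), norm(0, 1).logpdf(y)

h_xy = -log_pxy.mean()                      # h(X,Y) = -E[log p(X,Y)]
h_x, h_y = -log_px.mean(), -log_py.mean()   # marginal differential entropies
h_x_given_y = h_xy - h_y                    # h(X|Y) = h(X,Y) - h(Y)
h_y_given_x = h_xy - h_x                    # h(Y|X) = h(X,Y) - h(X)

mi_definition = (log_pxy - log_px - log_py).mean()   # definition of I(X,Y)
mi_identity = h_xy - h_x_given_y - h_y_given_x       # identity proved above
print(mi_definition, mi_identity, -0.5 * np.log(1 - rho**2))
```

All three printed values should agree up to Monte Carlo error with $-\tfrac{1}{2} \log\left( 1-\rho^2 \right) \approx 0.223$ for $\rho = 0.6$.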

Sources:

Metadata: ID: P60 | shortcut: cmi-jcde | author: JoramSoch | date: 2020-02-21, 17:23.