Index: The Book of Statistical Proofs ▷ General Theorems ▷ Information theory ▷ Discrete mutual information ▷ Definition

Definition:

1) The mutual information of two discrete random variables $X$ and $Y$ is defined as

\[\label{eq:mi-disc} \mathrm{I}(X,Y) = \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x,y) \cdot \log \frac{p(x,y)}{p(x) \cdot p(y)}\]

where $p(x)$ and $p(y)$ are the marginal probability mass functions of $X$ and $Y$, and $p(x,y)$ is the joint probability mass function of $X$ and $Y$.
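
For illustration, the following is a minimal numerical sketch of this definition, assuming a small, arbitrary example joint pmf `p_xy` that is not taken from the source; it computes $\mathrm{I}(X,Y)$ in nats with NumPy.

```python
import numpy as np

# Hypothetical 2x3 joint pmf p(x,y): rows index x in X, columns index y in Y
# (example values chosen for illustration only; they sum to 1).
p_xy = np.array([[0.10, 0.20, 0.10],
                 [0.30, 0.20, 0.10]])

p_x = p_xy.sum(axis=1)   # marginal pmf of X
p_y = p_xy.sum(axis=0)   # marginal pmf of Y

# I(X,Y) = sum_{x,y} p(x,y) * log[ p(x,y) / (p(x) * p(y)) ];
# terms with p(x,y) = 0 contribute 0 by the convention 0 * log 0 = 0.
mask = p_xy > 0
mi = np.sum(p_xy[mask] * np.log((p_xy / np.outer(p_x, p_y))[mask]))

print(mi)   # mutual information in nats (natural logarithm)
```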

2) The mutual information of two continuous random variables $X$ and $Y$ is defined as

\[\label{eq:mi-cont} \mathrm{I}(X,Y) = \int_{\mathcal{X}} \int_{\mathcal{Y}} p(x,y) \cdot \log \frac{p(x,y)}{p(x) \cdot p(y)} \, \mathrm{d}y \, \mathrm{d}x\]

where $p(x)$ and $p(y)$ are the marginal probability density functions of $X$ and $Y$, and $p(x,y)$ is the joint probability density function of $X$ and $Y$.
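
As a numerical check of this definition, the following is a minimal sketch assuming a standard bivariate normal with correlation $\rho = 0.5$ (the distribution, grid bounds and step size are illustrative assumptions, not from the source); it approximates the double integral by a Riemann sum and compares the result against the known closed form $-\frac{1}{2} \log (1-\rho^2)$ for this distribution.

```python
import numpy as np

rho = 0.5                           # correlation of the assumed bivariate normal
x = np.linspace(-6.0, 6.0, 601)     # truncated integration grid for x (assumed)
y = np.linspace(-6.0, 6.0, 601)     # truncated integration grid for y (assumed)
dx, dy = x[1] - x[0], y[1] - y[0]
X, Y = np.meshgrid(x, y, indexing="ij")

# joint density of a standard bivariate normal with correlation rho
c = 1.0 / (2.0 * np.pi * np.sqrt(1.0 - rho**2))
p_xy = c * np.exp(-(X**2 - 2.0 * rho * X * Y + Y**2) / (2.0 * (1.0 - rho**2)))

p_x = p_xy.sum(axis=1) * dy         # marginal density of X (Riemann sum over y)
p_y = p_xy.sum(axis=0) * dx         # marginal density of Y (Riemann sum over x)

# I(X,Y) = integral of p(x,y) * log[ p(x,y) / (p(x) * p(y)) ] dy dx
mi = np.sum(p_xy * np.log(p_xy / np.outer(p_x, p_y))) * dx * dy

print(mi)                           # numerical approximation
print(-0.5 * np.log(1.0 - rho**2))  # closed-form value for this example
```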

Sources:

Metadata: ID: D19 | shortcut: mi | author: JoramSoch | date: 2020-02-19, 18:35.