Definition: Mutual information
Definition:
1) The mutual information of two discrete random variables $X$ and $Y$ is defined as
\[\label{eq:mi-disc} \mathrm{I}(X,Y) = \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x,y) \cdot \log \frac{p(x,y)}{p(x) \cdot p(y)}\]where $p(x)$ and $p(y)$ are the probability mass functions of $X$ and $Y$, and $p(x,y)$ is the joint probability mass function of $X$ and $Y$.
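As a minimal numerical sketch (not part of the definition itself), the sum above can be evaluated directly from a joint probability table; the $2 \times 2$ joint distribution and all variable names below are illustrative assumptions:

```python
import numpy as np

# Hypothetical 2x2 joint PMF p(x,y) of two binary random variables X and Y
p_xy = np.array([[0.30, 0.20],
                 [0.10, 0.40]])

# Marginal PMFs p(x) and p(y), obtained by summing out the other variable
p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

# I(X,Y) = sum_x sum_y p(x,y) * log[ p(x,y) / (p(x) * p(y)) ]
mi = 0.0
for i in range(p_xy.shape[0]):
    for j in range(p_xy.shape[1]):
        if p_xy[i, j] > 0:  # terms with p(x,y) = 0 contribute 0 by convention
            mi += p_xy[i, j] * np.log(p_xy[i, j] / (p_x[i] * p_y[j]))

print(mi)  # mutual information in nats (natural logarithm)
```

With this table, the sum evaluates to roughly $0.086$ nats; for independent variables, $p(x,y) = p(x) \cdot p(y)$ and every logarithm vanishes, so the mutual information is zero.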
2) The mutual information of two continuous random variables $X$ and $Y$ is defined as
\[\label{eq:mi-cont} \mathrm{I}(X,Y) = \int_{\mathcal{X}} \int_{\mathcal{Y}} p(x,y) \cdot \log \frac{p(x,y)}{p(x) \cdot p(y)} \, \mathrm{d}y \, \mathrm{d}x\]where $p(x)$ and $p(y)$ are the probability density functions of $X$ and $Y$, and $p(x,y)$ is the joint probability density function of $X$ and $Y$.
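As a hedged sketch of the continuous case (again, not part of the definition), the double integral can be approximated on a grid for a standard bivariate normal distribution with correlation $\rho$, for which the closed form $\mathrm{I}(X,Y) = -\frac{1}{2} \ln(1-\rho^2)$ is known; the grid bounds and resolution are illustrative assumptions:

```python
import numpy as np

# Hypothetical example: bivariate normal with unit variances and correlation rho
rho = 0.5

def p_joint(x, y):
    # joint density of the standard bivariate normal with correlation rho
    norm = 1.0 / (2 * np.pi * np.sqrt(1 - rho**2))
    return norm * np.exp(-(x**2 - 2*rho*x*y + y**2) / (2 * (1 - rho**2)))

def p_marg(z):
    # standard normal marginal density
    return np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)

# Approximate the double integral by a Riemann sum on a finite grid
grid = np.linspace(-6, 6, 601)
dx = grid[1] - grid[0]
X, Y = np.meshgrid(grid, grid, indexing='ij')
pxy = p_joint(X, Y)
mi_numeric = np.sum(pxy * np.log(pxy / (p_marg(X) * p_marg(Y)))) * dx * dx

mi_exact = -0.5 * np.log(1 - rho**2)
print(mi_numeric, mi_exact)  # both approximately 0.1438 nats
```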
Sources:
- Cover TM, Thomas JA (1991): "Relative Entropy and Mutual Information"; in: Elements of Information Theory, ch. 2.3/8.5, p. 20/251; URL: https://www.wiley.com/en-us/Elements+of+Information+Theory%2C+2nd+Edition-p-9780471241959.
Metadata: ID: D19 | shortcut: mi | author: JoramSoch | date: 2020-02-19, 18:35.