Proof: Differential entropy of the multivariate normal distribution
Metadata: ID: P100 | shortcut: mvn-dent | author: JoramSoch | date: 2020-05-14, 19:49.
Theorem: Let $x$ be an $n \times 1$ random vector following a multivariate normal distribution

\[\label{eq:mvn} x \sim \mathcal{N}(\mu, \Sigma) \; .\]

Then, the differential entropy of $x$ in nats is
\[\label{eq:mvn-dent} \mathrm{h}(x) = \frac{n}{2} \ln(2\pi) + \frac{1}{2} \ln|\Sigma| + \frac{1}{2} n \; .\]
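As a quick sanity check (an addition to the original text), setting $n = 1$ and $\Sigma = \sigma^2$ recovers the well-known differential entropy of the univariate normal distribution:

\[\mathrm{h}(x) = \frac{1}{2} \ln(2\pi) + \frac{1}{2} \ln \sigma^2 + \frac{1}{2} = \frac{1}{2} \ln\left( 2 \pi e \sigma^2 \right) \; .\]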
Proof: The differential entropy of a random variable $X$ is defined as

\[\label{eq:dent} \mathrm{h}(X) = - \int_{\mathcal{X}} p(x) \, \log_b p(x) \, \mathrm{d}x \; .\]

To measure $\mathrm{h}(X)$ in nats, we set $b = e$, such that
\[\label{eq:dent-nats} \mathrm{h}(X) = - \mathrm{E}\left[ \ln p(x) \right] \; .\]
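As an aside not in the original text: since $\log_2 p = \ln p / \ln 2$, an entropy measured in nats converts to bits by dividing by $\ln 2$:

\[\mathrm{h}_{\mathrm{bits}}(X) = \frac{\mathrm{h}_{\mathrm{nats}}(X)}{\ln 2} \; .\]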
With the probability density function of the multivariate normal distribution, the differential entropy of $x$ is:

\[\label{eq:mvn-dent-s1} \begin{split} \mathrm{h}(x) &= - \mathrm{E}\left[ \ln \left( \frac{1}{\sqrt{(2 \pi)^n |\Sigma|}} \cdot \exp \left[ -\frac{1}{2} (x-\mu)^\mathrm{T} \Sigma^{-1} (x-\mu) \right] \right) \right] \\ &= - \mathrm{E}\left[ - \frac{n}{2} \ln(2\pi) - \frac{1}{2} \ln|\Sigma| - \frac{1}{2} (x-\mu)^\mathrm{T} \Sigma^{-1} (x-\mu) \right] \\ &= \frac{n}{2} \ln(2\pi) + \frac{1}{2} \ln|\Sigma| + \frac{1}{2} \, \mathrm{E}\left[ (x-\mu)^\mathrm{T} \Sigma^{-1} (x-\mu) \right] \; . \end{split}\]

The last term can be evaluated using the fact that the quadratic form is a scalar and thus equal to its own trace, the invariance of the trace under cyclic permutations, and the linearity of trace and expectation:
\[\label{eq:mvn-dent-t3} \begin{split} \mathrm{E}\left[ (x-\mu)^\mathrm{T} \Sigma^{-1} (x-\mu) \right] &= \mathrm{E}\left[ \mathrm{tr}\left( (x-\mu)^\mathrm{T} \Sigma^{-1} (x-\mu) \right) \right] \\ &= \mathrm{E}\left[ \mathrm{tr}\left( \Sigma^{-1} (x-\mu) (x-\mu)^\mathrm{T} \right) \right] \\ &= \mathrm{tr}\left( \Sigma^{-1} \mathrm{E}\left[ (x-\mu) (x-\mu)^\mathrm{T} \right] \right) \\ &= \mathrm{tr}\left( \Sigma^{-1} \Sigma \right) \\ &= \mathrm{tr}\left( I_n \right) \\ &= n \; , \end{split}\]

such that the differential entropy is
\[\label{eq:mvn-dent-qed} \mathrm{h}(x) = \frac{n}{2} \ln(2\pi) + \frac{1}{2} \ln|\Sigma| + \frac{1}{2} \, n \; .\]

∎
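Remark (beyond the original proof): collecting the three terms gives the equivalent compact form

\[\mathrm{h}(x) = \frac{1}{2} \ln\left( (2 \pi e)^n |\Sigma| \right) \; .\]

The following is a minimal numerical sanity check of both forms, a sketch assuming SciPy is available; the covariance matrix is an arbitrary illustrative choice, not taken from the text above:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Arbitrary illustrative example: n = 3 with a symmetric positive definite covariance
mu = np.zeros(3)
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
n = len(mu)

# Closed form from the theorem: h(x) = n/2 ln(2*pi) + 1/2 ln|Sigma| + n/2
h_theorem = 0.5 * n * np.log(2 * np.pi) + 0.5 * np.log(np.linalg.det(Sigma)) + 0.5 * n

# Equivalent compact form: h(x) = 1/2 ln((2*pi*e)^n |Sigma|)
h_compact = 0.5 * np.log((2 * np.pi * np.e) ** n * np.linalg.det(Sigma))

# SciPy's built-in differential entropy of the frozen distribution (returned in nats)
h_scipy = multivariate_normal(mean=mu, cov=Sigma).entropy()

print(h_theorem, h_compact, h_scipy)
assert np.allclose([h_theorem, h_compact], h_scipy)  # all three agree up to float error
```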
Sources: - Kiuhnm (2018): "Entropy of the multivariate Gaussian"; in: StackExchange Mathematics, retrieved on 2020-05-14; URL: https://math.stackexchange.com/questions/2029707/entropy-of-the-multivariate-gaussian.