Index: The Book of Statistical Proofs ▷ Probability Distributions ▷ Univariate continuous distributions ▷ Normal distribution ▷ Differential entropy

Theorem: Let $X$ be a random variable following a normal distribution:

\[\label{eq:norm} X \sim \mathcal{N}(\mu, \sigma^2) \; .\]

Then, the differential entropy of $X$ is

\[\label{eq:norm-dent} \mathrm{h}(X) = \frac{1}{2} \ln\left( 2 \pi \sigma^2 e \right) \; .\]
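Note that the result depends only on the variance $\sigma^2$, not on the mean $\mu$, which makes it easy to verify numerically. The following sketch (not part of the original proof; the parameter values are arbitrary) compares the closed form against SciPy's built-in entropy for the normal distribution, which is likewise reported in nats:

```python
import numpy as np
from scipy.stats import norm

# Arbitrary example parameters (not from the text)
mu, sigma = 1.5, 2.0

# Closed-form differential entropy from the theorem, in nats
h_closed = 0.5 * np.log(2 * np.pi * sigma**2 * np.e)

# SciPy's differential entropy for N(mu, sigma^2), also in nats
h_scipy = norm(loc=mu, scale=sigma).entropy()

print(h_closed, h_scipy)  # both ≈ 2.1121
assert np.isclose(h_closed, h_scipy)
```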

Proof: The differential entropy of a random variable is defined as

\[\label{eq:dent} \mathrm{h}(X) = - \int_{\mathcal{X}} p(x) \, \log_b p(x) \, \mathrm{d}x \; .\]

To measure $\mathrm{h}(X)$ in nats, we set $b = e$, such that

\[\label{eq:dent-nats} \mathrm{h}(X) = - \mathrm{E}\left[ \ln p(x) \right] \; .\]
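As an aside, this expectation form lends itself directly to a Monte Carlo estimate: draw samples from the distribution and average the negative log-density. A minimal sketch, assuming the same arbitrary parameters as above and an arbitrary seed and sample size:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)  # fixed seed, arbitrary choice
mu, sigma = 1.5, 2.0            # arbitrary example parameters
n = 1_000_000                   # Monte Carlo sample size

# Draw x ~ N(mu, sigma^2) and average -ln p(x)
x = rng.normal(loc=mu, scale=sigma, size=n)
h_mc = -np.mean(norm.logpdf(x, loc=mu, scale=sigma))

h_closed = 0.5 * np.log(2 * np.pi * sigma**2 * np.e)
print(h_mc, h_closed)  # agree up to Monte Carlo error (~1e-3)
```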

Plugging in the probability density function of the normal distribution, the differential entropy of $X$ becomes:

\[\label{eq:norm-dent-s1} \begin{split} \mathrm{h}(X) &= - \mathrm{E}\left[ \ln \left( \frac{1}{\sqrt{2 \pi} \sigma} \cdot \exp \left[ -\frac{1}{2} \left( \frac{x-\mu}{\sigma} \right)^2 \right] \right) \right] \\ &= - \mathrm{E}\left[ - \frac{1}{2} \ln(2\pi\sigma^2) - \frac{1}{2} \left( \frac{x-\mu}{\sigma} \right)^2 \right] \\ &= \frac{1}{2} \ln(2 \pi \sigma^2) + \frac{1}{2} \, \mathrm{E}\left[ \left( \frac{x-\mu}{\sigma} \right)^2 \right] \\ &= \frac{1}{2} \ln(2 \pi \sigma^2) + \frac{1}{2} \cdot \frac{1}{\sigma^2} \cdot \mathrm{E}\left[ (x-\mu)^2 \right] \; . \end{split}\]

Note that $\mathrm{E}\left[ (x-\mu)^2 \right]$ is precisely the variance of $X$, and the variance of the normal distribution is $\sigma^2$. Thus, we can proceed:

\[\label{eq:norm-dent-s2} \begin{split} \mathrm{h}(X) &= \frac{1}{2} \ln(2 \pi \sigma^2) + \frac{1}{2} \cdot \frac{1}{\sigma^2} \cdot \sigma^2 \\ &= \frac{1}{2} \ln(2 \pi \sigma^2) + \frac{1}{2} \\ &= \frac{1}{2} \ln(2 \pi \sigma^2) + \frac{1}{2} \ln e \\ &= \frac{1}{2} \ln\left( 2 \pi \sigma^2 e \right) \; . \end{split}\]
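As a final numerical confirmation that the last rearrangement is valid, i.e. that $\frac{1}{2} \ln(2 \pi \sigma^2) + \frac{1}{2}$ equals $\frac{1}{2} \ln(2 \pi \sigma^2 e)$, both forms can be evaluated for an arbitrary example value of $\sigma$:

```python
import numpy as np

sigma = 2.0  # arbitrary example value
lhs = 0.5 * np.log(2 * np.pi * sigma**2) + 0.5
rhs = 0.5 * np.log(2 * np.pi * sigma**2 * np.e)
assert np.isclose(lhs, rhs)  # holds since 1/2 = (1/2) * ln(e)
```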

Metadata: ID: P101 | shortcut: norm-dent | author: JoramSoch | date: 2020-05-14, 20:09.