Index: The Book of Statistical Proofs ▷ General Theorems ▷ Information theory ▷ Differential entropy ▷ Negativity

Theorem: Unlike its discrete analogue, the differential entropy can become negative.

Proof: Let $X$ be a random variable following a continuous uniform distribution with minimum $0$ and maximum $1/2$:

\[\label{eq:X} X \sim \mathcal{U}(0, 1/2) \; .\]

Then, its probability density function is:

\[\label{eq:X-pdf} f_X(x) = 2 \quad \text{for} \quad 0 \leq x \leq \frac{1}{2} \; .\]
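
For orientation, the following minimal sketch (illustrative only, not part of the original proof) represents this density with SciPy's `uniform`, using `loc=0` and `scale=0.5`; it confirms that the density equals $2$ on the support and integrates to one.

```python
# Minimal sketch of the U(0, 1/2) density (illustrative, not part of the proof).
from scipy.stats import uniform

X = uniform(loc=0, scale=0.5)   # continuous uniform on [0, 1/2]

print(X.pdf(0.25))              # 2.0, the constant density on the support
print(X.pdf(0.75))              # 0.0, outside the support
print(X.cdf(0.5) - X.cdf(0.0))  # 1.0, the density integrates to one
```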

Thus, the differential entropy follows as

\[\label{eq:X-dent} \begin{split} \mathrm{h}(X) &= - \int_{\mathcal{X}} f_X(x) \log_b f_X(x) \, \mathrm{d}x \\ &= - \int_{0}^{\frac{1}{2}} 2 \, \log_b(2) \, \mathrm{d}x \\ &= -\log_b(2) \int_{0}^{\frac{1}{2}} 2 \, \mathrm{d}x \\ &= -\log_b(2) \left[ 2x \right]_{0}^{\frac{1}{2}} \\ &= -\log_b(2) \cdot 1 \\ &= -\log_b(2) \end{split}\]

which is negative for any base $b > 1$, since $\log_b(2) > 0$ whenever $b > 1$.
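
As a numerical cross-check (a sketch assuming NumPy and SciPy are available, not part of the original proof), the integral above can be evaluated directly and compared with the closed-form result $-\log_b(2)$ for several bases $b > 1$:

```python
# Numerical cross-check of the derivation (illustrative sketch, not from the source).
import math

import numpy as np
from scipy.integrate import quad
from scipy.stats import uniform

X = uniform(loc=0, scale=0.5)   # X ~ U(0, 1/2), density f_X(x) = 2 on [0, 1/2]

# Differential entropy in nats (b = e) by direct numerical integration of -f log f.
h_nats, _ = quad(lambda x: -X.pdf(x) * math.log(X.pdf(x)), 0, 0.5)

print(h_nats)        # ≈ -0.6931, i.e. -ln(2)
print(X.entropy())   # SciPy's differential entropy for U(0, 1/2), also -ln(2)
assert np.isclose(h_nats, -math.log(2))

# The closed form -log_b(2) is negative for every base b > 1, since log_b(2) > 0.
for b, unit in [(2, "bits"), (math.e, "nats"), (10, "hartleys")]:
    h = -math.log(2, b)          # change of base: log_b(2) = ln(2) / ln(b)
    print(f"b = {b:.4f}: h(X) = {h:+.4f} {unit}")
    assert h < 0
```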

Sources:

Metadata: ID: P68 | shortcut: dent-neg | author: JoramSoch | date: 2020-03-02, 20:32.