Index: The Book of Statistical Proofs ▷ General Theorems ▷ Information theory ▷ Differential entropy ▷ Non-invariance and transformation

Theorem: The differential entropy is not invariant under change of variables, i.e. there exist random variables $X$ and $Y = g(X)$, such that

\[\label{eq:dent-noninv} \mathrm{h}(Y) \neq \mathrm{h}(X) \; .\]

In particular, for an invertible transformation $Y = g(X)$ from a random vector $X$ to another random vector $Y$ of the same dimension, it holds that

\[\label{eq:dent-trans} \mathrm{h}(Y) = \mathrm{h}(X) + \int_{\mathcal{X}} f_X(x) \log \left| J_g(x) \right| \, \mathrm{d}x \; ,\]

where $J_g(x)$ is the Jacobian matrix of the vector-valued function $g$ and $\mathcal{X}$ is the set of possible values of $X$.
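As a minimal numerical sketch (an assumed example, not part of the theorem): for a scaling $Y = aX$ of a Gaussian $X \sim \mathcal{N}(0,1)$, the Jacobian is the constant $a$, so the integral term reduces to $\log \lvert a \rvert$, and the closed-form Gaussian entropies must differ by exactly that amount.

```python
import math

def gaussian_entropy(sigma):
    """Differential entropy of N(mu, sigma^2) in nats: 1/2 * log(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma ** 2)

# Y = a*X with X ~ N(0, 1); the Jacobian of g(x) = a*x is the constant a,
# so the integral term in the theorem reduces to log|a|.
a = 3.0
h_X = gaussian_entropy(1.0)
h_Y = gaussian_entropy(abs(a))   # Y ~ N(0, a^2), i.e. standard deviation |a|
print(h_Y - h_X, math.log(abs(a)))  # the two values agree
```

This already exhibits the non-invariance: for any $a \neq \pm 1$, $\mathrm{h}(Y) \neq \mathrm{h}(X)$.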

Proof: By definition, the differential entropy of $X$ is

\[\label{eq:X-dent} \mathrm{h}(X) = - \int_{\mathcal{X}} f_X(x) \log f_X(x) \, \mathrm{d}x\]

where $f_X(x)$ is the probability density function of $X$.

If $Y = g(X)$ is an invertible function of a continuous random vector $X$, the probability density function of $Y$ is

\[\label{eq:pdf-invfct} f_Y(y) = \left\{ \begin{array}{rl} f_X(g^{-1}(y)) \, \left| J_{g^{-1}}(y) \right| \; , & \text{if} \; y \in \mathcal{Y} \\ 0 \; , & \text{if} \; y \notin \mathcal{Y} \end{array} \right.\]

where $\mathcal{Y} = \left\lbrace y = g(x): x \in \mathcal{X} \right\rbrace$ is the set of possible outcomes of $Y$ and $J_{g^{-1}}(y)$ is the Jacobian matrix of $g^{-1}(y)$

\[\label{eq:jac} J_{g^{-1}}(y) = \left[ \begin{matrix} \frac{\mathrm{d}x_1}{\mathrm{d}y_1} & \ldots & \frac{\mathrm{d}x_1}{\mathrm{d}y_n} \\ \vdots & \ddots & \vdots \\ \frac{\mathrm{d}x_n}{\mathrm{d}y_1} & \ldots & \frac{\mathrm{d}x_n}{\mathrm{d}y_n} \end{matrix} \right] \; .\]
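The change-of-variables formula above can be checked in one dimension, where the Jacobian is the scalar derivative. A hypothetical example, assuming $X \sim \mathrm{Exp}(1)$ and the linear map $g(x) = 2x$, for which the transformed density is known to be that of an $\mathrm{Exp}(1/2)$ variable:

```python
import math

def f_X(x):
    """Density of X ~ Exp(1): exp(-x) for x > 0."""
    return math.exp(-x) if x > 0 else 0.0

def g_inv(y):
    """Inverse of the map g(x) = 2x."""
    return y / 2.0

jac_g_inv = 0.5  # dx/dy, constant for this linear map

def f_Y(y):
    """Density of Y = g(X) via the change-of-variables formula."""
    return f_X(g_inv(y)) * abs(jac_g_inv)

# compare with the known Exp(1/2) density (1/2) * exp(-y/2) at a few points
for y in (0.5, 1.0, 4.0):
    print(y, f_Y(y), 0.5 * math.exp(-y / 2))
```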

Thus, the differential entropy of $Y$ is

\[\label{eq:Y-dent-s1} \begin{split} \mathrm{h}(Y) &\overset{\eqref{eq:X-dent}}{=} - \int_{\mathcal{Y}} f_Y(y) \log f_Y(y) \, \mathrm{d}y \\ &\overset{\eqref{eq:pdf-invfct}}{=} - \int_{\mathcal{Y}} \left[ f_X(g^{-1}(y)) \, \left| J_{g^{-1}}(y) \right| \right] \log \left[ f_X(g^{-1}(y)) \, \left| J_{g^{-1}}(y) \right| \right] \, \mathrm{d}y \; . \end{split}\]

Substituting $y = g(x)$ into the integral and applying the inverse function theorem $J_{g^{-1}}(y) = J_g^{-1}(x)$, we obtain

\[\label{eq:Y-dent-s2} \begin{split} \mathrm{h}(Y) &= - \int_{\mathcal{X}} \left[ f_X(g^{-1}(g(x))) \, \left| J_{g^{-1}}(g(x)) \right| \right] \log \left[ f_X(g^{-1}(g(x))) \, \left| J_{g^{-1}}(g(x)) \right| \right] \, \mathrm{d}[g(x)] \\ &= - \int_{\mathcal{X}} \left[ f_X(x) \, \left| J_g^{-1}(x) \right| \right] \log \left[ f_X(x) \, \left| J_g^{-1}(x) \right| \right] \, \mathrm{d}[g(x)] \; . \end{split}\]

Using the relations $y = g(x) \Rightarrow \mathrm{d}y = \lvert J_g(x) \rvert \, \mathrm{d}x$ and $\lvert A \rvert \lvert B \rvert = \lvert AB \rvert$, which implies $\lvert J_g^{-1}(x) \rvert \lvert J_g(x) \rvert = \lvert J_g^{-1}(x) \, J_g(x) \rvert = \lvert I_n \rvert = 1$, this becomes

\[\label{eq:Y-dent-s3} \begin{split} \mathrm{h}(Y) &= - \int_{\mathcal{X}} \left[ f_X(x) \, \left| J_g^{-1}(x) \right| \left| J_g(x) \right| \right] \log \left[ f_X(x) \, \left| J_g^{-1}(x) \right| \right] \, \mathrm{d}x \\ &= - \int_{\mathcal{X}} f_X(x) \log f_X(x) \, \mathrm{d}x - \int_{\mathcal{X}} f_X(x) \log \left| J_g^{-1}(x) \right| \, \mathrm{d}x \; . \end{split}\]
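The two determinant facts used in this step can be checked numerically on a small, hypothetical $2 \times 2$ Jacobian of a linear map $g(x) = Ax$, for which $J_g = A$ and $J_{g^{-1}} = A^{-1}$:

```python
# Hypothetical constant Jacobian of a linear map g(x) = A x
A = [[2.0, 1.0], [1.0, 3.0]]
det_A = A[0][0] * A[1][1] - A[0][1] * A[1][0]

# Jacobian of g^{-1}, i.e. the matrix inverse of A (2x2 adjugate formula)
A_inv = [[ A[1][1] / det_A, -A[0][1] / det_A],
         [-A[1][0] / det_A,  A[0][0] / det_A]]
det_A_inv = A_inv[0][0] * A_inv[1][1] - A_inv[0][1] * A_inv[1][0]

# |A^{-1}| |A| = |A^{-1} A| = |I| = 1, so the determinants are reciprocal
print(det_A, det_A_inv, det_A * det_A_inv)
```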

Finally, employing the determinant property $\lvert A^{-1} \rvert = 1/\lvert A \rvert$, we can derive the differential entropy of $Y$ as

\[\label{eq:Y-dent-s4} \begin{split} \mathrm{h}(Y) &= - \int_{\mathcal{X}} f_X(x) \log f_X(x) \, \mathrm{d}x - \int_{\mathcal{X}} f_X(x) \log \frac{1}{\left| J_g(x) \right|} \, \mathrm{d}x \\ &\overset{\eqref{eq:X-dent}}{=} \mathrm{h}(X) + \int_{\mathcal{X}} f_X(x) \log \left| J_g(x) \right| \, \mathrm{d}x \; . \end{split}\]

Because there exist $X$ and $Y = g(X)$ for which the integral term in \eqref{eq:Y-dent-s4} is non-zero, this also demonstrates that there exist $X$ and $Y$ such that \eqref{eq:dent-noninv} is fulfilled. For example, a simple scaling $Y = aX$ with $a \neq \pm 1$ already yields $\mathrm{h}(Y) = \mathrm{h}(X) + \log \lvert a \rvert \neq \mathrm{h}(X)$.
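The full result can also be illustrated with a nonlinear map. In an assumed setup with $X \sim \mathrm{Uniform}(0,1)$, so that $f_X(x) = 1$ and $\mathrm{h}(X) = 0$, and $g(x) = x^2$ with $\lvert J_g(x) \rvert = 2x$, the integral term equals $\int_0^1 \log(2x) \, \mathrm{d}x = \log 2 - 1 \neq 0$; a simple midpoint Riemann sum reproduces this value:

```python
import math

# Assumed setup: X ~ Uniform(0, 1), so f_X(x) = 1 and h(X) = 0,
# transformed by g(x) = x^2 with Jacobian |J_g(x)| = 2x on (0, 1).
# Analytically the integral term is log(2) - 1, so h(Y) != h(X).
n = 200_000
h_X = 0.0

# midpoint Riemann sum for \int_0^1 f_X(x) * log|J_g(x)| dx
integral = sum(math.log(2.0 * (i + 0.5) / n) for i in range(n)) / n
h_Y = h_X + integral
print(h_Y)  # close to log(2) - 1
```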


Metadata: ID: P262 | shortcut: dent-noninv | author: JoramSoch | date: 2021-10-07, 10:39.