Index: The Book of Statistical Proofs ▷ Probability Distributions ▷ Univariate continuous distributions ▷ Gamma distribution ▷ Expectation of x ln x

Theorem: Let $X$ be a random variable following a gamma distribution:

$\label{eq:gam} X \sim \mathrm{Gam}(a, b) \; .$

Then, the mean or expected value of $(X \cdot \ln X)$ is

$\label{eq:gam-xlogx} \mathrm{E}(X \ln X) = \frac{a}{b} \left[ \psi(a+1) - \ln(b) \right] \; .$

Proof: With the definition of the expected value, the law of the unconscious statistician and the probability density function of the gamma distribution, we have:

$\label{eq:gam-xlogx-s1} \begin{split} \mathrm{E}(X \ln X) &= \int_{0}^{\infty} x \ln x \cdot \frac{b^a}{\Gamma(a)} x^{a-1} \exp[-b x] \, \mathrm{d}x \\ &= \frac{1}{\Gamma(a)} \int_{0}^{\infty} \ln x \cdot \frac{b^{a+1}}{b} x^{a} \exp[-b x] \, \mathrm{d}x \\ &= \frac{\Gamma(a+1)}{\Gamma(a) \, b} \int_{0}^{\infty} \ln x \cdot \frac{b^{a+1}}{\Gamma(a+1)} x^{(a+1)-1} \exp[-b x] \, \mathrm{d}x \\ \end{split}$

The integral now corresponds to the logarithmic expectation of a gamma distribution with shape $a+1$ and rate $b$,

$\label{eq:logmean-a+1} \mathrm{E}(\ln Y) \quad \text{where} \quad Y \sim \mathrm{Gam}(a+1,b)$

which is given by

$\label{eq:gam-logmean} \mathrm{E}(\ln Y) = \psi(a+1) - \ln(b)$
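As a numerical sanity check (not part of the original proof), this log-expectation can be verified by quadrature, assuming SciPy is available; the shape and rate values below are arbitrary illustrations:

```python
import numpy as np
from scipy import integrate, special, stats

a, b = 2.5, 1.7  # arbitrary example shape and rate

# numerical expectation: integrate ln(y) against the Gam(a+1, b) density
# (SciPy parameterizes the gamma distribution by shape and scale = 1/rate)
numeric, _ = integrate.quad(
    lambda y: np.log(y) * stats.gamma.pdf(y, a + 1, scale=1 / b), 0, np.inf
)

# closed form: psi(a+1) - ln(b)
closed = special.digamma(a + 1) - np.log(b)

print(numeric, closed)
```

The two printed values should agree to quadrature precision.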

where $\psi(x)$ is the digamma function. Additionally employing the relation

$\label{eq:gam-fct} \Gamma(x+1) = \Gamma(x) \cdot x \quad \Leftrightarrow \quad \frac{\Gamma(x+1)}{\Gamma(x)} = x \; ,$

the expression in equation \eqref{eq:gam-xlogx-s1} develops into:

$\label{eq:gam-xlogx-qed} \mathrm{E}(X \ln X) = \frac{a}{b} \left[ \psi(a+1) - \ln(b) \right] \; .$
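The final identity can likewise be confirmed numerically, assuming SciPy is available (the values of $a$ and $b$ below are arbitrary):

```python
import numpy as np
from scipy import integrate, special, stats

a, b = 2.5, 1.7  # arbitrary example shape and rate

# numerical expectation: integrate x*ln(x) against the Gam(a, b) density
numeric, _ = integrate.quad(
    lambda x: x * np.log(x) * stats.gamma.pdf(x, a, scale=1 / b), 0, np.inf
)

# closed form from the theorem: (a/b) * [psi(a+1) - ln(b)]
closed = (a / b) * (special.digamma(a + 1) - np.log(b))

print(numeric, closed)
```

Both quantities should match to within the quadrature tolerance.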
Sources:

Metadata: ID: P179 | shortcut: gam-xlogx | author: JoramSoch | date: 2020-10-15, 13:02.