Index: The Book of Statistical Proofs ▷ Probability Distributions ▷ Univariate continuous distributions ▷ Chi-squared distribution ▷ Moments

Theorem: Let $X$ be a random variable following a chi-squared distribution with $k$ degrees of freedom:

\[\label{eq:chi2} X \sim \chi^{2}(k) \; .\]

Then, if $m > -k/2$, the moment $\mathrm{E}(X^{m})$ exists and is equal to:

\[\label{eq:chi2-mom} \mathrm{E}(X^{m}) = 2^m \frac{\Gamma\left( \frac{k}{2}+m \right)}{\Gamma\left( \frac{k}{2} \right)} \; .\]

Proof: Combining the definition of the raw moment with the probability density function of the chi-squared distribution, we have:

\[\label{eq:chi2-mom-int} \begin{split} \mathrm{E}(X^{m}) &= \int_{0}^{\infty} x^m \frac{1}{2^{k/2} \Gamma\left( \frac{k}{2} \right)} \, x^{k/2-1} \, e^{-x/2} \, \mathrm{d}x \\ &= \frac{1}{2^{k/2} \Gamma\left( \frac{k}{2} \right)} \int_{0}^{\infty} x^{(k/2)+m-1} \, e^{-x/2} \, \mathrm{d}x \; . \end{split}\]

Now, we substitute $u = x/2$, such that $x = 2u$ and $\mathrm{d}x = 2 \, \mathrm{d}u$. As a result, we obtain:

\[\label{eq:chi2-mom-int-u} \begin{split} \mathrm{E}(X^{m}) &= \frac{1}{2^{k/2} \Gamma\left( \frac{k}{2} \right)} \int_{0}^{\infty} 2^{(k/2)+m-1} \, u^{(k/2)+m-1} \, e^{-u} \, \mathrm{d}(2u) \\ &= \frac{2^{(k/2)+m}}{2^{k/2} \Gamma\left( \frac{k}{2} \right)} \int_{0}^{\infty} u^{(k/2)+m-1} \, e^{-u} \, \mathrm{d}u \\ &= \frac{2^m}{\Gamma\left( \frac{k}{2} \right)} \int_{0}^{\infty} u^{(k/2)+m-1} \, e^{-u} \, \mathrm{d}u \; . \end{split}\]

With the definition of the gamma function as

\[\label{eq:gam-fct} \Gamma(x) = \int_{0}^{\infty} t^{x-1} \, e^{-t} \, \mathrm{d}t, \; x > 0 \; ,\]

applied with argument $\frac{k}{2}+m$, the remaining integral evaluates to $\Gamma\left( \frac{k}{2}+m \right)$, which is finite precisely when $\frac{k}{2}+m > 0$, i.e. when $m > -k/2$. This leads to the desired result:

\[\label{eq:chi2-mom-qed} \mathrm{E}(X^{m}) = \frac{2^m}{\Gamma\left( \frac{k}{2} \right)} \, \Gamma\left( \frac{k}{2}+m \right) = 2^m \frac{\Gamma\left( \frac{k}{2}+m \right)}{\Gamma\left( \frac{k}{2} \right)} \; .\]

Observe that, if $m$ is a nonnegative integer, the condition $m > -k/2$ is always satisfied. Therefore, all positive integer moments of a chi-squared distribution exist, and the $m$-th raw moment is given by the equation above.
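The closed-form result can also be checked numerically. The following Python sketch (the function names `chi2_pdf` and `raw_moment` are illustrative, not from any library) integrates $x^m$ against the chi-squared density with a simple trapezoidal rule and compares the result with $2^m \, \Gamma(k/2+m)/\Gamma(k/2)$:

```python
import math

def chi2_pdf(x, k):
    """Density of the chi-squared distribution with k degrees of freedom."""
    return x ** (k / 2 - 1) * math.exp(-x / 2) / (2 ** (k / 2) * math.gamma(k / 2))

def raw_moment(k, m, upper=200.0, n=200_000):
    """Approximate E(X^m) by trapezoidal integration of x^m * pdf over (0, upper).

    The integrand vanishes at both endpoints for the parameters used here,
    so only interior grid points contribute."""
    h = upper / n
    total = 0.0
    for i in range(1, n):
        x = i * h
        total += x ** m * chi2_pdf(x, k)
    return total * h

k = 5  # example degrees of freedom; any k > 0 would do
for m in [1, 2, 3]:
    closed_form = 2 ** m * math.gamma(k / 2 + m) / math.gamma(k / 2)
    print(f"m={m}: numeric={raw_moment(k, m):.4f}, closed-form={closed_form:.4f}")
```

For $k = 5$ the closed form gives $\mathrm{E}(X) = 5$, $\mathrm{E}(X^2) = 35$ and $\mathrm{E}(X^3) = 315$, matching the pattern $k(k+2)\cdots(k+2m-2)$ for integer $m$, and the numerical integral agrees to the printed precision.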


Metadata: ID: P175 | shortcut: chi2-mom | author: kjpetrykowski | date: 2020-10-13, 01:30.