Index: The Book of Statistical Proofs ▷ Probability Distributions ▷ Univariate continuous distributions ▷ Chi-squared distribution ▷ Moments

Theorem: Let $X$ be a random variable following a chi-squared distribution:

$\label{eq:chi2} X \sim \chi^{2}(k) \; .$

If $m > -k/2$, then $E(X^{m})$ exists and is equal to:

$\label{eq:chi2-mom} \mathrm{E}(X^{m}) = \frac{2^{m} \Gamma\left( \frac{k}{2}+m \right)}{\Gamma\left( \frac{k}{2} \right)} \; .$

Proof: The $m$-th raw moment is defined as $\mathrm{E}(X^{m}) = \int_{0}^{\infty} x^{m} \cdot f_X(x) \, \mathrm{d}x$. Inserting the probability density function of the chi-squared distribution and absorbing $x^{m}$ into the power of $x$, we have:

$\label{eq:chi2-mom-int} \mathrm{E}(X^{m}) = \int_{0}^{\infty} \frac{1}{\Gamma\left( \frac{k}{2} \right) 2^{k/2}} \, x^{(k/2)+m-1} \, e^{-x/2} \mathrm{d}x \; .$

Now substitute $u = x/2$, so that $x = 2u$ and $\mathrm{d}x = 2 \, \mathrm{d}u$. This yields:

$\label{eq:chi-2-mom-int-u} \mathrm{E}(X^{m}) = \int_{0}^{\infty} \frac{1}{\Gamma\left( \frac{k}{2} \right) 2^{(k/2)-1}} \, 2^{(k/2)+m-1} \, u^{(k/2)+m-1} \, e^{-u} \mathrm{d}u \; .$

The remaining integral is the gamma function, $\Gamma(z) = \int_{0}^{\infty} u^{z-1} e^{-u} \, \mathrm{d}u$, evaluated at $z = \frac{k}{2}+m$; it converges precisely when $\frac{k}{2}+m > 0$, i.e. when $m > -k/2$. In that case, we obtain the desired result:

$\mathrm{E}(X^{m}) = \frac{2^{(k/2)+m-1}}{2^{(k/2)-1}} \cdot \frac{\Gamma\left( \frac{k}{2}+m \right)}{\Gamma\left( \frac{k}{2} \right)} = \frac{2^{m} \Gamma\left( \frac{k}{2}+m \right)}{\Gamma\left( \frac{k}{2} \right)} \; .$

Observe that, if $m$ is a nonnegative integer, then $m > -k/2$ is always true. Therefore, all positive integer moments of a chi-squared distribution exist and the $m$-th raw moment is given by the foregoing equation.
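As a sanity check, the closed-form expression can be evaluated numerically and compared against the known low-order moments $\mathrm{E}(X) = k$ and $\mathrm{E}(X^{2}) = k(k+2)$. The sketch below uses only the standard-library gamma function; the function name `chi2_moment` is an illustrative choice, not part of any library:

```python
from math import gamma

def chi2_moment(k, m):
    """m-th raw moment of a chi-squared distribution with k degrees
    of freedom, via E(X^m) = 2^m * Gamma(k/2 + m) / Gamma(k/2).

    Valid for any real m > -k/2, not just nonnegative integers.
    """
    if m <= -k / 2:
        raise ValueError("moment exists only for m > -k/2")
    return 2**m * gamma(k / 2 + m) / gamma(k / 2)

# Known special cases for k = 4: E(X) = k and E(X^2) = k*(k+2).
print(chi2_moment(4, 1))   # 4.0
print(chi2_moment(4, 2))   # 24.0
# The formula also covers negative orders: E(X^-1) = 1/(k-2) for k > 2.
print(chi2_moment(4, -1))  # 0.5
```

Note that the condition $m > -k/2$ is enforced explicitly, since the gamma integral diverges otherwise.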

Metadata: ID: P175 | shortcut: chi2-mom | author: kjpetrykowski | date: 2020-10-13, 01:30.