Construction of unbiased estimator (p = 1)

Theorem: Let $y = \left\lbrace y_1, \ldots, y_n \right\rbrace$ be a set of independent, normally distributed observations with unknown mean $\mu$ and unknown variance $\sigma^2$:

\[\label{eq:ug} y_i \overset{\text{i.i.d.}}{\sim} \mathcal{N}(\mu, \sigma^2), \quad i = 1,\ldots,n \; .\]

An unbiased estimator of $\sigma^2$ is given by

\[\label{eq:resvar-unb} \hat{\sigma}^2_{\mathrm{unb}} = \frac{1}{n-1} \sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2 \; .\]
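
As a quick numerical illustration, the following minimal sketch (assuming NumPy; the sample parameters $\mu = 2$, $\sigma = 3$ and $n = 20$ are arbitrary choices) computes \eqref{eq:resvar-unb} directly and confirms that it matches NumPy's variance with `ddof=1`:

```python
import numpy as np

# illustrative sample: n = 20 draws from N(mu = 2, sigma^2 = 9)
rng = np.random.default_rng(1)
y = rng.normal(loc=2.0, scale=3.0, size=20)

# unbiased estimator: sum of squared deviations over n - 1
y_bar = y.mean()
sigma2_unb = np.sum((y - y_bar) ** 2) / (len(y) - 1)

# NumPy's ddof=1 applies the same n - 1 denominator
assert np.isclose(sigma2_unb, np.var(y, ddof=1))
```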

Proof: It can be shown that the maximum likelihood estimator of $\sigma^2$

\[\label{eq:resvar-mle} \hat{\sigma}^2_{\mathrm{MLE}} = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2\]

is a biased estimator of $\sigma^2$, in the sense that

\[\label{eq:resvar-bias} \mathbb{E}\left[ \hat{\sigma}^2_{\mathrm{MLE}} \right] = \frac{n-1}{n} \sigma^2 \; .\]

Intuitively, one degree of freedom is lost by replacing $\mu$ with the sample mean $\bar{y}$; formally, $\tfrac{1}{\sigma^2} \sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2$ follows a chi-squared distribution with $n-1$ degrees of freedom, whose expected value is $n-1$.
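
The bias in \eqref{eq:resvar-bias} can also be checked by simulation. The following minimal sketch (assuming NumPy; the values of $\mu$, $\sigma^2$, $n$ and the number of replications are arbitrary) averages the ML estimate over many samples and compares it to the predicted value $\frac{n-1}{n} \sigma^2$:

```python
import numpy as np

# 100,000 independent samples of size n = 10 from N(0, 4)
rng = np.random.default_rng(0)
mu, sigma2, n, reps = 0.0, 4.0, 10, 100_000
y = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))

# ML estimate (1/n denominator) for each sample
mle = np.mean((y - y.mean(axis=1, keepdims=True)) ** 2, axis=1)

print(mle.mean())            # close to 3.6 ...
print((n - 1) / n * sigma2)  # ... the predicted (n-1)/n * sigma^2 = 3.6
```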

From \eqref{eq:resvar-bias}, it follows that

\[\label{eq:resvar-bias-adj} \begin{split} \mathbb{E}\left[ \frac{n}{n-1} \hat{\sigma}^2_{\mathrm{MLE}} \right] &= \frac{n}{n-1} \mathbb{E}\left[ \hat{\sigma}^2_{\mathrm{MLE}} \right] \\ &\overset{\eqref{eq:resvar-bias}}{=} \frac{n}{n-1} \cdot \frac{n-1}{n} \sigma^2 \\ &= \sigma^2 \; , \end{split}\]

such that an unbiased estimator can be constructed as

\[\label{eq:resvar-unb-qed} \begin{split} \hat{\sigma}^2_{\mathrm{unb}} &= \frac{n}{n-1} \hat{\sigma}^2_{\mathrm{MLE}} \\ &\overset{\eqref{eq:resvar-mle}}{=} \frac{n}{n-1} \cdot \frac{1}{n} \sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2 \\ &= \frac{1}{n-1} \sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2 \; . \end{split}\]
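
Numerically, the construction amounts to rescaling the ML estimate by $\frac{n}{n-1}$; a minimal sketch (assuming NumPy, with an arbitrary sample):

```python
import numpy as np

# arbitrary sample of size n = 15 from a standard normal
rng = np.random.default_rng(2)
n = 15
y = rng.normal(size=n)

sigma2_mle = np.mean((y - y.mean()) ** 2)         # 1/n denominator
sigma2_unb = n / (n - 1) * sigma2_mle             # bias-corrected estimate
assert np.isclose(sigma2_unb, np.var(y, ddof=1))  # equals the n - 1 denominator directly
```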

Metadata: ID: P62 | shortcut: resvar-unb | author: JoramSoch | date: 2020-02-25, 15:38.