Corrected AIC and maximum log-likelihood

Theorem: The corrected Akaike information criterion of a generative model with likelihood function $p(y \vert \theta, m)$ is equal to

\[\label{eq:aicc-mll} \mathrm{AIC}_\mathrm{c}(m) = -2 \log p(y | \hat{\theta}, m) + \frac{2nk}{n-k-1}\]

where $\log p(y \vert \hat{\theta}, m)$ is the maximum log-likelihood, $k$ is the number of free parameters, and $n$ is the number of observations.
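
As a quick numerical illustration (a minimal sketch, not part of the source proof; the function name and the example numbers are illustrative), the right-hand side of \eqref{eq:aicc-mll} can be computed directly from the maximum log-likelihood:

```python
# Minimal sketch: corrected AIC from the maximum log-likelihood,
# AICc = -2 log p(y | theta_hat, m) + 2nk / (n - k - 1).

def aic_c(max_log_lik: float, k: int, n: int) -> float:
    """Corrected Akaike information criterion for k free parameters and n observations."""
    if n - k - 1 <= 0:
        raise ValueError("AICc requires n > k + 1.")
    return -2.0 * max_log_lik + 2.0 * n * k / (n - k - 1)

# Example: maximum log-likelihood of -140.3 with k = 3 parameters and n = 50 observations
print(aic_c(-140.3, k=3, n=50))  # 280.6 + 300/46 ≈ 287.12
```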

Proof: The Akaike information criterion (AIC) is defined as

\[\label{eq:aic} \mathrm{AIC}(m) = -2 \log p(y | \hat{\theta}, m) + 2 \, k\]

and the corrected Akaike information criterion is defined as

\[\label{eq:aicc} \mathrm{AIC}_\mathrm{c}(m) = \mathrm{AIC}(m) + \frac{2k^2 + 2k}{n-k-1} \; .\]

Plugging \eqref{eq:aic} into \eqref{eq:aicc}, we obtain:

\[\label{eq:aicc-mll-qed} \begin{split} \mathrm{AIC}_\mathrm{c}(m) &= -2 \log p(y | \hat{\theta}, m) + 2 \, k + \frac{2k^2 + 2k}{n-k-1} \\ &= -2 \log p(y | \hat{\theta}, m) + \frac{2k(n-k-1)}{n-k-1} + \frac{2k^2 + 2k}{n-k-1} \\ &= -2 \log p(y | \hat{\theta}, m) + \frac{2nk - 2k^2 - 2k}{n-k-1} + \frac{2k^2 + 2k}{n-k-1} \\ &= -2 \log p(y | \hat{\theta}, m) + \frac{2nk}{n-k-1} \; . \end{split}\]
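
The algebra above can also be verified symbolically (a sketch using sympy; the symbol names are illustrative assumptions):

```python
# Symbolic check that AIC(m) + (2k^2 + 2k)/(n - k - 1) equals
# -2 log p(y | theta_hat, m) + 2nk/(n - k - 1).
import sympy as sp

logL, k, n = sp.symbols('logL k n')

aic    = -2 * logL + 2 * k                        # \eqref{eq:aic}
aicc   = aic + (2 * k**2 + 2 * k) / (n - k - 1)   # \eqref{eq:aicc}
target = -2 * logL + 2 * n * k / (n - k - 1)      # \eqref{eq:aicc-mll}

assert sp.simplify(aicc - target) == 0            # the difference vanishes identically
```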

Metadata: ID: P315 | shortcut: aicc-mll | author: JoramSoch | date: 2022-03-11, 16:53.