The Book of Statistical Proofs ▷ Statistical Models ▷ Univariate normal data ▷ Multiple linear regression ▷ Corrected Akaike information criterion

Theorem: Consider a linear regression model m

$$m: \; y = X\beta + \varepsilon, \quad \varepsilon \sim \mathcal{N}(0, \sigma^2 V) \tag{1}$$

Then, the corrected Akaike information criterion for this model is

$$\mathrm{AIC}_{\mathrm{c}}(m) = n \log\left(\frac{\mathrm{wRSS}}{n}\right) + n\left[1 + \log(2\pi)\right] + \log|V| + \frac{2n(p+1)}{n-p-2} \tag{2}$$

where wRSS is the weighted residual sum of squares, p is the number of regressors in the design matrix X, and n is the number of observations in the data vector y.
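Equation (2) can be evaluated directly from data. The following is a minimal NumPy sketch, assuming wRSS is computed from the weighted least-squares estimate of β; the function name `aicc_mlr` is illustrative, not from the source.

```python
import numpy as np

def aicc_mlr(y, X, V):
    """Corrected AIC of equation (2) for y = X*beta + eps, eps ~ N(0, s^2 V).

    Sketch only: assumes wRSS comes from the weighted least-squares fit.
    Requires n > p + 2, otherwise the correction term is undefined.
    """
    n, p = X.shape
    P = np.linalg.inv(V)                           # precision matrix V^{-1}
    b = np.linalg.solve(X.T @ P @ X, X.T @ P @ y)  # WLS estimate of beta
    r = y - X @ b                                  # residuals
    wrss = float(r @ P @ r)                        # weighted residual sum of squares
    _, logdet_V = np.linalg.slogdet(V)             # log|V|, numerically stable
    return (n * np.log(wrss / n) + n * (1 + np.log(2 * np.pi))
            + logdet_V + 2 * n * (p + 1) / (n - p - 2))
```

With V equal to the identity matrix, the fit reduces to ordinary least squares and the log|V| term vanishes.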

Proof: The corrected Akaike information criterion is defined as

$$\mathrm{AIC}_{\mathrm{c}}(m) = \mathrm{AIC}(m) + \frac{2k^2 + 2k}{n-k-1} \tag{3}$$

where AIC(m) is the Akaike information criterion, k is the number of free parameters in m and n is the number of observations.

The Akaike information criterion for multiple linear regression is given by

$$\mathrm{AIC}(m) = n \log\left(\frac{\mathrm{wRSS}}{n}\right) + n\left[1 + \log(2\pi)\right] + \log|V| + 2(p+1) \tag{4}$$

and the number of free parameters in multiple linear regression is k = p + 1, i.e. one for each of the p regressors in the design matrix X, plus one for the noise variance σ².

Thus, the corrected AIC of m follows from (3) and (4) as

$$\begin{split}
\mathrm{AIC}_{\mathrm{c}}(m) &= n \log\left(\frac{\mathrm{wRSS}}{n}\right) + n\left[1 + \log(2\pi)\right] + \log|V| + 2k + \frac{2k^2 + 2k}{n-k-1} \\
&= n \log\left(\frac{\mathrm{wRSS}}{n}\right) + n\left[1 + \log(2\pi)\right] + \log|V| + \frac{2nk - 2k^2 - 2k}{n-k-1} + \frac{2k^2 + 2k}{n-k-1} \\
&= n \log\left(\frac{\mathrm{wRSS}}{n}\right) + n\left[1 + \log(2\pi)\right] + \log|V| + \frac{2nk}{n-k-1} \\
&= n \log\left(\frac{\mathrm{wRSS}}{n}\right) + n\left[1 + \log(2\pi)\right] + \log|V| + \frac{2n(p+1)}{n-p-2} \; .
\end{split} \tag{5}$$
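As a quick numeric sanity check (not part of the proof itself), the algebra in (5) can be verified in a few lines of Python: with k = p + 1, the AIC penalty 2k plus the correction (2k² + 2k)/(n − k − 1) should equal 2n(p + 1)/(n − p − 2) for all n > p + 2.

```python
import math

# Check the penalty-term algebra from (5) over a grid of n and p.
for n in range(10, 40):
    for p in range(1, 6):
        k = p + 1
        lhs = 2 * k + (2 * k**2 + 2 * k) / (n - k - 1)
        rhs = 2 * n * (p + 1) / (n - p - 2)
        assert math.isclose(lhs, rhs)
```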
Sources:

Metadata: ID: P309 | shortcut: mlr-aicc | author: JoramSoch | date: 2022-02-11, 07:07.