
Theorem: Given a general linear model with independent observations

\[\label{eq:GLM} Y = X B + E, \; E \sim \mathcal{MN}(0, \sigma^2 I_n, \Sigma) \; ,\]

the ordinary least squares parameter estimates are given by

\[\label{eq:OLS} \hat{B} = (X^\mathrm{T} X)^{-1} X^\mathrm{T} Y \; .\]
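
As an informal illustration that is not part of the original proof, the following NumPy sketch computes \eqref{eq:OLS} for simulated data; the dimensions, the noise level and the use of `numpy.linalg.solve` on the normal equations (rather than an explicit matrix inverse) are assumptions made only for this example.

```python
import numpy as np

# Minimal sketch with simulated data (dimensions and noise level are assumptions)
rng = np.random.default_rng(0)
n, p, v = 100, 3, 4                                   # observations, regressors, measurements
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])  # design matrix with intercept
B_true = rng.normal(size=(p, v))                      # true coefficient matrix
Y = X @ B_true + rng.normal(scale=0.5, size=(n, v))   # Y = X B + E

# OLS estimate: B_hat = (X^T X)^{-1} X^T Y;
# solving the normal equations avoids forming the inverse explicitly
B_hat = np.linalg.solve(X.T @ X, X.T @ Y)
print(B_hat.shape)   # (3, 4)
```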

Proof: Let $\hat{B}$ be the ordinary least squares (OLS) solution and let $\hat{E} = Y - X\hat{B}$ be the resulting matrix of residuals. According to the exogeneity assumption of OLS, the errors have conditional mean zero

\[\label{eq:OLS-exo} \mathrm{E}(E|X) = 0 \; ,\]

a direct consequence of which, by the law of total expectation, is that the regressors are uncorrelated with the errors

\[\label{eq:OLS-uncorr} \mathrm{E}(X^\mathrm{T} E) = 0 \; ,\]

which, in the finite sample, means that the residual matrix must be orthogonal to the design matrix:

\[\label{eq:X-E-orth} X^\mathrm{T} \hat{E} = 0 \; .\]

From \eqref{eq:X-E-orth}, the OLS formula can be directly derived:

\[\label{eq:OLS-qed} \begin{split} X^\mathrm{T} \hat{E} &= 0 \\ X^\mathrm{T} \left( Y - X\hat{B} \right) &= 0 \\ X^\mathrm{T} Y - X^\mathrm{T} X\hat{B} &= 0 \\ X^\mathrm{T} X\hat{B} &= X^\mathrm{T} Y \\ \hat{B} &= (X^\mathrm{T} X)^{-1} X^\mathrm{T} Y \; . \end{split}\]

Note that the last step requires $X^\mathrm{T} X$ to be invertible, i.e. the design matrix $X$ to have full column rank.
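
As a numerical sanity check, again not part of the original proof, the following sketch verifies that the closed-form estimate satisfies the orthogonality condition \eqref{eq:X-E-orth} and agrees with a generic least-squares solver; the simulated setup mirrors the sketch above and is an assumption of this example.

```python
import numpy as np

# Same kind of simulated setup as in the sketch above (assumed, not from the source)
rng = np.random.default_rng(0)
n, p, v = 100, 3, 4
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
Y = X @ rng.normal(size=(p, v)) + rng.normal(scale=0.5, size=(n, v))

B_hat = np.linalg.solve(X.T @ X, X.T @ Y)   # B_hat = (X^T X)^{-1} X^T Y
E_hat = Y - X @ B_hat                       # residual matrix

# residuals are orthogonal to the design matrix, as used in the derivation
print(np.allclose(X.T @ E_hat, 0))          # True

# the closed-form estimate agrees with a generic least-squares solver
B_lstsq = np.linalg.lstsq(X, Y, rcond=None)[0]
print(np.allclose(B_hat, B_lstsq))          # True
```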
Sources:

Metadata: ID: P106 | shortcut: glm-ols | author: JoramSoch | date: 2020-05-19, 06:02.