Correlation of estimates (simple linear regression)

Theorem: In simple linear regression, when the independent variable $x$ is mean-centered, the ordinary least squares estimates for slope and intercept are uncorrelated.

Proof: Under ordinary least squares, the parameter estimates of simple linear regression follow a bivariate normal distribution:

\[\label{eq:slr-olsdist} \left[ \begin{matrix} \hat{\beta}_0 \\ \hat{\beta}_1 \end{matrix} \right] \sim \mathcal{N}\left( \left[ \begin{matrix} \beta_0 \\ \beta_1 \end{matrix} \right], \, \frac{\sigma^2}{(n-1) \, s_x^2} \cdot \left[ \begin{matrix} x^\mathrm{T}x/n & -\bar{x} \\ -\bar{x} & 1 \end{matrix} \right] \right)\]
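
As a brief sketch of where this covariance matrix comes from: with design matrix $X = [\, 1_n \;\; x \,]$, the general OLS covariance $\sigma^2 (X^\mathrm{T}X)^{-1}$ can be written out as

\[ \sigma^2 \left( X^\mathrm{T}X \right)^{-1} = \sigma^2 \left[ \begin{matrix} n & n\bar{x} \\ n\bar{x} & x^\mathrm{T}x \end{matrix} \right]^{-1} = \frac{\sigma^2}{n \, x^\mathrm{T}x - n^2 \bar{x}^2} \left[ \begin{matrix} x^\mathrm{T}x & -n\bar{x} \\ -n\bar{x} & n \end{matrix} \right] = \frac{\sigma^2}{(n-1) \, s_x^2} \left[ \begin{matrix} x^\mathrm{T}x/n & -\bar{x} \\ -\bar{x} & 1 \end{matrix} \right] \; ,\]

using $n \, x^\mathrm{T}x - n^2 \bar{x}^2 = n \sum_{i=1}^{n} (x_i - \bar{x})^2 = n \, (n-1) \, s_x^2$.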

Because the covariance matrix of a multivariate normal distribution contains the pairwise covariances of its components, the covariance of $\hat{\beta}_0$ and $\hat{\beta}_1$ can be read off from the off-diagonal entry:

\[\label{eq:slr-olscov} \mathrm{Cov}\left( \hat{\beta}_0, \hat{\beta}_1 \right) = -\frac{\sigma^2 \, \bar{x}}{(n-1) \, s_x^2}\]

where $\sigma^2$ is the noise variance, $s_x^2$ is the sample variance of $x$, and $n$ is the number of observations. When $x$ is mean-centered, we have $\bar{x} = 0$, so that:

\[\label{eq:slr-olscov-meancent} \mathrm{Cov}\left( \hat{\beta}_0, \hat{\beta}_1 \right) = 0 \; .\]
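
This result can also be illustrated numerically. The following is a minimal simulation sketch (assuming NumPy; the sample size, true coefficients and noise level are arbitrary illustrative choices): repeatedly simulating data with a mean-centered $x$ and computing the OLS estimates should yield an empirical covariance, and hence correlation, of $\hat{\beta}_0$ and $\hat{\beta}_1$ close to zero.

```python
import numpy as np

rng = np.random.default_rng(42)
n, n_sim = 50, 20000                   # observations per data set, number of simulated data sets
beta0, beta1, sigma = 1.5, 0.8, 2.0    # illustrative true parameters and noise standard deviation

x = rng.uniform(-3, 3, size=n)
x = x - x.mean()                       # mean-center the independent variable

b0_hat = np.empty(n_sim)
b1_hat = np.empty(n_sim)
for i in range(n_sim):
    y = beta0 + beta1 * x + rng.normal(0, sigma, size=n)    # simulate y under the model
    b1_hat[i] = np.sum(x * (y - y.mean())) / np.sum(x**2)   # OLS slope (x already centered)
    b0_hat[i] = y.mean() - b1_hat[i] * x.mean()             # OLS intercept

print(np.cov(b0_hat, b1_hat)[0, 1])       # empirical covariance, close to 0
print(np.corrcoef(b0_hat, b1_hat)[0, 1])  # empirical correlation, close to 0
```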

Because the correlation is equal to the covariance divided by the product of the standard deviations, and this product is positive, we can conclude that the correlation of $\hat{\beta}_0$ and $\hat{\beta}_1$ is also zero:

\[\label{eq:slr-olscorr-qed} \mathrm{Corr}\left( \hat{\beta}_0, \hat{\beta}_1 \right) = 0 \; .\]
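
As a remark, this can also be made explicit without assuming mean-centering: the diagonal entries of the covariance matrix above give $\mathrm{Var}(\hat{\beta}_0) = \sigma^2 \, (x^\mathrm{T}x/n) / \left[ (n-1) \, s_x^2 \right]$ and $\mathrm{Var}(\hat{\beta}_1) = \sigma^2 / \left[ (n-1) \, s_x^2 \right]$, so that in general

\[ \mathrm{Corr}\left( \hat{\beta}_0, \hat{\beta}_1 \right) = \frac{\mathrm{Cov}\left( \hat{\beta}_0, \hat{\beta}_1 \right)}{\sqrt{\mathrm{Var}\left( \hat{\beta}_0 \right)} \sqrt{\mathrm{Var}\left( \hat{\beta}_1 \right)}} = \frac{- \sigma^2 \, \bar{x} / \left[ (n-1) \, s_x^2 \right]}{\frac{\sigma^2}{(n-1) \, s_x^2} \sqrt{x^\mathrm{T}x/n}} = \frac{-\bar{x}}{\sqrt{x^\mathrm{T}x/n}} \; ,\]

which vanishes exactly when $\bar{x} = 0$.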

Metadata: ID: P320 | shortcut: slr-olscorr | author: JoramSoch | date: 2022-04-14, 17:17.