Correlation coefficient in terms of slope estimate

Theorem: Assume a simple linear regression model with independent observations

\[\label{eq:slr} y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \; \varepsilon_i \sim \mathcal{N}(0, \sigma^2), \; i = 1,\ldots,n\]

and consider estimation using ordinary least squares. Then, the sample correlation coefficient and the estimated slope parameter are related to each other via the sample standard deviations:

\[\label{eq:slr-corr} r_{xy} = \frac{s_x}{s_y} \, \hat{\beta}_1 \; .\]

Proof: The ordinary least squares estimate of the slope is given by

\[\label{eq:slr-ols-sl} \hat{\beta}_1 = \frac{s_{xy}}{s_x^2} \; .\]
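This slope formula can be checked numerically. The sketch below (using simulated data with hypothetical parameter values) computes $\hat{\beta}_1 = s_{xy}/s_x^2$ from the sample covariance and sample variance, and compares it against a standard least-squares fit:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
x = rng.normal(size=n)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=n)  # simulated model, beta0=2, beta1=0.5

# OLS slope estimate: sample covariance over sample variance
beta1_hat = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# reference: slope from a degree-1 polynomial least-squares fit
slope_ref = np.polyfit(x, y, 1)[0]

assert np.isclose(beta1_hat, slope_ref)
```

Both `np.cov` and `np.var` are called with `ddof=1` so that the same $n-1$ divisor appears in numerator and denominator; it cancels in the ratio, so the estimate is unaffected by that convention.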

Using the relationship between covariance and correlation

\[\label{eq:cov-corr} \mathrm{Cov}(X,Y) = \sigma_X \, \mathrm{Corr}(X,Y) \, \sigma_Y\]

which also holds for the sample covariance, sample correlation and sample standard deviations,

\[\label{eq:cov-corr-samp} s_{xy} = s_x \, r_{xy} \, s_y \; ,\]

we get the final result:

\[\label{eq:slr-corr-qed} \begin{split} \hat{\beta}_1 &= \frac{s_{xy}}{s_x^2} \\ &= \frac{s_x \, r_{xy} \, s_y}{s_x^2} \\ &= \frac{s_y}{s_x} \, r_{xy} \\ \Leftrightarrow \quad r_{xy} &= \frac{s_x}{s_y} \, \hat{\beta}_1 \; . \end{split}\]
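As a sanity check, the final identity can be verified on simulated data (a minimal sketch with hypothetical parameter values): the correlation recovered from the slope via $r_{xy} = (s_x/s_y)\,\hat{\beta}_1$ should match the directly computed sample correlation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.normal(0, 2, n)
y = 1.5 + 0.8 * x + rng.normal(0, 1, n)  # simulated simple linear regression

# sample moments (ddof=1 gives the usual n-1 divisor)
s_xy = np.cov(x, y, ddof=1)[0, 1]
s_x = np.std(x, ddof=1)
s_y = np.std(y, ddof=1)

# OLS slope estimate and the correlation implied by the theorem
beta1_hat = s_xy / s_x**2
r_from_slope = (s_x / s_y) * beta1_hat

# direct sample correlation coefficient
r_direct = np.corrcoef(x, y)[0, 1]

assert np.isclose(r_from_slope, r_direct)
```

The agreement holds exactly (up to floating point) for any dataset, not just normally distributed ones, since the identity is purely algebraic.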

Metadata: ID: P279 | shortcut: slr-corr | author: JoramSoch | date: 2021-10-27, 14:58.