Proof: Relationship between correlation coefficient and slope estimate in simple linear regression
Metadata: ID: P279 | shortcut: slr-corr | author: JoramSoch | date: 2021-10-27, 14:58.
Theorem: Assume a simple linear regression model with independent observations
\[\label{eq:slr} y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \; \varepsilon_i \sim \mathcal{N}(0, \sigma^2), \; i = 1,\ldots,n\]and consider estimation using ordinary least squares. Then, the correlation coefficient and the estimated slope parameter are related via the sample standard deviations:
\[\label{eq:slr-corr} r_{xy} = \frac{s_x}{s_y} \, \hat{\beta}_1 \; .\]Proof: The ordinary least squares estimate of the slope is given by
\[\label{eq:slr-ols-sl} \hat{\beta}_1 = \frac{s_{xy}}{s_x^2} \; .\]Using the relationship between covariance and correlation
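The slope estimate above can be checked numerically; the following sketch (with hypothetical simulated data, not from the source) computes \(\hat{\beta}_1 = s_{xy}/s_x^2\) directly and compares it against NumPy's least-squares fit:

```python
import numpy as np

# Hypothetical example data (assumption: any x, y sample works)
rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=50)

# OLS slope as sample covariance over sample variance
# (the ddof=1 factors cancel in the ratio)
s_xy = np.cov(x, y, ddof=1)[0, 1]
s2_x = np.var(x, ddof=1)
beta1_hat = s_xy / s2_x

# Cross-check against NumPy's least-squares polynomial fit
beta1_ref = np.polyfit(x, y, deg=1)[0]
print(np.isclose(beta1_hat, beta1_ref))  # True
```

Both routes recover the same slope, since `np.polyfit` with `deg=1` solves exactly the ordinary least squares problem.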
\[\label{eq:cov-corr} \mathrm{Cov}(X,Y) = \sigma_X \, \mathrm{Corr}(X,Y) \, \sigma_Y\]which also holds for sample correlation and sample covariance
\[\label{eq:cov-corr-samp} s_{xy} = s_x \, r_{xy} \, s_y \; ,\]we get the final result:
\[\label{eq:slr-corr-qed} \begin{split} \hat{\beta}_1 &= \frac{s_{xy}}{s_x^2} = \frac{s_x \, r_{xy} \, s_y}{s_x^2} = \frac{s_y}{s_x} \, r_{xy} \\ \Leftrightarrow \quad r_{xy} &= \frac{s_x}{s_y} \, \hat{\beta}_1 \; . \end{split}\]∎
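The final identity \(r_{xy} = (s_x/s_y) \, \hat{\beta}_1\) is exact up to floating-point rounding, which a short numerical check confirms (again with hypothetical simulated data, not from the source):

```python
import numpy as np

# Hypothetical example data (assumption: any x, y sample works)
rng = np.random.default_rng(1)
x = rng.normal(size=100)
y = 1.0 - 0.8 * x + rng.normal(scale=0.5, size=100)

# Sample standard deviations and OLS slope estimate
s_x = np.std(x, ddof=1)
s_y = np.std(y, ddof=1)
beta1_hat = np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1)

# Sample correlation coefficient vs. the rescaled slope
r_xy = np.corrcoef(x, y)[0, 1]
print(np.isclose(r_xy, (s_x / s_y) * beta1_hat))  # True
```

Note that the identity holds for any sample, not just normally distributed data, since it is a purely algebraic consequence of the OLS slope formula.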
Sources: - Penny, William (2006): "Relation to correlation"; in: Mathematics for Brain Imaging, ch. 1.2.3, p. 18, eq. 1.27; URL: https://ueapsylabs.co.uk/sites/wpenny/mbi/mbi_course.pdf.
- Wikipedia (2021): "Simple linear regression"; in: Wikipedia, the free encyclopedia, retrieved on 2021-10-27; URL: https://en.wikipedia.org/wiki/Simple_linear_regression#Fitting_the_regression_line.