
Definition: Let there be a simple linear regression model with $n$ independent observations, dependent variable $y$ and independent variable $x$:

\[\label{eq:slr} y_i = \beta_0 + \beta_1 x_i + \varepsilon_i, \; \varepsilon_i \sim \mathcal{N}(0, \sigma^2), \; i = 1, \ldots, n \; .\]

Then, given some parameters $\beta_0, \beta_1 \in \mathbb{R}$, the set

\[\label{eq:regline} L(\beta_0, \beta_1) = \left\lbrace (x,y) \in \mathbb{R}^2 \mid y = \beta_0 + \beta_1 x \right\rbrace\]

is called a “regression line” and the set

\[\label{eq:regline-ols} L(\hat{\beta}_0, \hat{\beta}_1)\]

is called the “fitted regression line”, with estimated regression coefficients $\hat{\beta}_0, \hat{\beta}_1$, e.g. obtained via ordinary least squares.
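
For concreteness, the ordinary least squares estimates referenced above have the well-known closed-form solution (a standard result, stated here only for illustration):

\[\label{eq:slr-ols} \hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}, \quad \hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}\]

where $\bar{x} = \frac{1}{n} \sum_{i=1}^{n} x_i$ and $\bar{y} = \frac{1}{n} \sum_{i=1}^{n} y_i$ are the sample means, so that the fitted regression line $L(\hat{\beta}_0, \hat{\beta}_1)$ always passes through the point $(\bar{x}, \bar{y})$.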

 
Sources:

Metadata: ID: D164 | shortcut: regline | author: JoramSoch | date: 2021-10-27, 07:30.