Index: The Book of Statistical Proofs ▷ Statistical Models ▷ Univariate normal data ▷ Multiple linear regression ▷ Residual sum of squares

Definition: Let there be a multiple linear regression model with independent observations, measured data $y$ and design matrix $X$:

\[\label{eq:mlr} y = X\beta + \varepsilon, \; \varepsilon_i \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2) \; .\]

Then, the residual sum of squares (RSS) is defined as the sum of squared deviations of the measured signal from the fitted signal:

\[\label{eq:rss} \mathrm{RSS} = \sum_{i=1}^n (y_i - \hat{y}_i)^2 \quad \text{where} \quad \hat{y} = X \hat{\beta}\]

with estimated regression coefficients $\hat{\beta}$, e.g. obtained via ordinary least squares estimation.
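As a numerical illustration, the following minimal sketch simulates data from the model above, estimates $\hat{\beta}$ via ordinary least squares and computes the RSS; the sample size, coefficients and noise level are arbitrary assumptions for demonstration, not part of the definition:

```python
import numpy as np

rng = np.random.default_rng(0)

# simulate data from y = X*beta + eps, eps_i ~ i.i.d. N(0, sigma^2)
n = 20                                         # number of observations
X = np.column_stack([np.ones(n),               # intercept column
                     rng.normal(size=(n, 2))]) # two regressors (assumed)
beta = np.array([1.0, 2.0, -0.5])              # assumed true coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)   # assumed sigma = 0.5

# OLS estimate: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# fitted signal and residual sum of squares
y_hat = X @ beta_hat
rss = np.sum((y - y_hat) ** 2)
print(rss)
```

Because OLS minimizes the RSS over all candidate coefficient vectors, the value printed here can never exceed the squared-error sum obtained with the true $\beta$.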


Metadata: ID: D39 | shortcut: rss | author: JoramSoch | date: 2020-03-21, 22:03.