Definition: Residual sum of squares
Definition: Let there be a multiple linear regression with independent observations, measured data $y$ and design matrix $X$:
\[\label{eq:mlr} y = X\beta + \varepsilon, \; \varepsilon_i \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2) \; .\]Then, the residual sum of squares (RSS) is defined as the sum of squared deviations of the measured signal from the fitted signal:
\[\label{eq:rss} \mathrm{RSS} = \sum_{i=1}^n (y_i - \hat{y}_i)^2 \quad \text{where} \quad \hat{y} = X \hat{\beta}\]with estimated regression coefficients $\hat{\beta}$, e.g. obtained via ordinary least squares.
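Equivalently, in matrix notation, the RSS is the squared Euclidean norm of the residual vector $\hat{\varepsilon} = y - X\hat{\beta}$:
\[\mathrm{RSS} = (y - X\hat{\beta})^\mathrm{T} (y - X\hat{\beta}) = \hat{\varepsilon}^\mathrm{T} \hat{\varepsilon} \; .\]As a worked illustration, the following is a minimal numerical sketch in Python (not part of the original definition; NumPy and all variable names are assumptions made for this example) that computes the RSS from an ordinary least squares fit:

```python
# Minimal sketch: compute the RSS of a multiple linear regression.
# Assumptions: NumPy is available; n, p, beta and sigma are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n, p = 100, 3                                   # observations, regressors

# Design matrix with an intercept column, true coefficients, simulated data
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([2.0, -1.0, 0.5])
y = X @ beta + rng.normal(scale=0.8, size=n)    # y = X*beta + eps, eps_i ~ N(0, sigma^2)

# Ordinary least squares estimate and fitted signal
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta_hat

# RSS as the sum of squared deviations of measured from fitted signal
rss = np.sum((y - y_hat) ** 2)

# Sanity check against the equivalent matrix form (y - X*beta_hat)'(y - X*beta_hat)
assert np.isclose(rss, (y - X @ beta_hat) @ (y - X @ beta_hat))
print(rss)
```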
Sources:
- Wikipedia (2020): "Residual sum of squares"; in: Wikipedia, the free encyclopedia, retrieved on 2020-03-21; URL: https://en.wikipedia.org/wiki/Residual_sum_of_squares.
Metadata: ID: D39 | shortcut: rss | author: JoramSoch | date: 2020-03-21, 22:03.