Index: The Book of Statistical Proofs ▷ Model Selection ▷ Goodness-of-fit measures ▷ Signal-to-noise ratio ▷ Relationship with R²

Theorem: Let there be a linear regression model with independent observations

$\label{eq:mlr} y = X\beta + \varepsilon, \; \varepsilon_i \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2)$

and parameter estimates obtained with ordinary least squares

$\label{eq:OLS} \hat{\beta} = (X^\mathrm{T} X)^{-1} X^\mathrm{T} y \; .$
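The OLS estimate above can be sketched numerically with NumPy; the sample size, design matrix and coefficient values below are made-up illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# simulate a linear model y = X*beta + eps (illustrative values)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta = np.array([2.0, 0.5])
y = X @ beta + rng.normal(scale=1.0, size=n)

# OLS estimate: beta_hat = (X'X)^{-1} X'y, computed via the normal equations
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

In practice `np.linalg.lstsq` is numerically preferable to forming $X^\mathrm{T} X$ explicitly, but the normal-equations form mirrors the formula in the theorem.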

Then, the signal-to-noise ratio can be expressed in terms of the coefficient of determination

$\label{eq:SNR-R2} \mathrm{SNR} = \frac{R^2}{1-R^2}$

and vice versa

$\label{eq:R2-SNR} R^2 = \frac{\mathrm{SNR}}{1+\mathrm{SNR}} \; ,$

if the predicted signal mean is equal to the actual signal mean.

Proof: The signal-to-noise ratio (SNR) is defined as

$\label{eq:SNR} \mathrm{SNR} = \frac{\mathrm{Var}(X\hat{\beta})}{\hat{\sigma}^2} = \frac{\mathrm{Var}(\hat{y})}{\hat{\sigma}^2} \; .$

Writing out the variances, we have

$\label{eq:SNR-s1} \mathrm{SNR} = \frac{\frac{1}{n} \sum_{i=1}^{n} (\hat{y}_i - \bar{\hat{y}})^2}{\frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2} = \frac{\sum_{i=1}^{n} (\hat{y}_i - \bar{\hat{y}})^2}{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2} \; .$

Note that it is irrelevant whether we use the biased estimator of the variance (dividing by $n$) or the unbiased estimator of the variance (dividing by $n-1$), because the normalization factors cancel out.

If the predicted signal mean is equal to the actual signal mean – which is the case when the variable regressors in $X$ have mean zero, so that they are orthogonal to a constant regressor in $X$ – then $\bar{\hat{y}} = \bar{y}$, such that
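This equality of means can be checked numerically: whenever the design matrix contains a constant regressor, the OLS residuals are orthogonal to that column and therefore sum to zero, so fitted and observed means coincide (the simulated data below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # includes an intercept
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ beta_hat

# with an intercept, residuals are orthogonal to the constant column,
# hence sum(y - y_hat) = 0 and mean(y_hat) = mean(y)
print(np.isclose(y_hat.mean(), y.mean()))  # True
```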

$\label{eq:SNR-s2} \mathrm{SNR} = \frac{\sum_{i=1}^{n} (\hat{y}_i - \bar{y})^2}{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2} \; .$

Then, the SNR can be written in terms of the explained, residual and total sum of squares:

$\label{eq:SNR-s3} \mathrm{SNR} = \frac{\mathrm{ESS}}{\mathrm{RSS}} = \frac{\mathrm{ESS}/\mathrm{TSS}}{\mathrm{RSS}/\mathrm{TSS}} \; .$

Using the coefficient of determination, $R^2 = \mathrm{ESS}/\mathrm{TSS} = 1 - \mathrm{RSS}/\mathrm{TSS}$, this becomes

$\label{eq:SNR-R2-qed} \mathrm{SNR} = \frac{R^2}{1-R^2} \; .$

Rearranging this equation for the coefficient of determination, we have

$\label{eq:R2-SNR-qed} R^2 = \frac{\mathrm{SNR}}{1+\mathrm{SNR}} \; .$
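As a sanity check, both identities can be verified numerically on simulated data (all values below are illustrative; the design matrix includes an intercept, so $\mathrm{ESS} + \mathrm{RSS} = \mathrm{TSS}$ holds exactly):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 regressors
y = X @ np.array([1.0, 0.8, -0.5]) + rng.normal(size=n)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ beta_hat

# explained, residual and total sum of squares
ESS = np.sum((y_hat - y.mean())**2)
RSS = np.sum((y - y_hat)**2)
TSS = np.sum((y - y.mean())**2)

R2 = ESS / TSS
SNR = ESS / RSS  # = Var(y_hat) / sigma_hat^2, normalization factors cancel

print(np.isclose(SNR, R2 / (1 - R2)))  # True
print(np.isclose(R2, SNR / (1 + SNR)))  # True
```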

Metadata: ID: P63 | shortcut: snr-rsq | author: JoramSoch | date: 2020-02-26, 10:37.