Index: The Book of Statistical Proofs ▷ Statistical Models ▷ Multivariate normal data ▷ Inverse general linear model ▷ Derivation of parameters

Theorem: Let there be observations $Y \in \mathbb{R}^{n \times v}$ and $X \in \mathbb{R}^{n \times p}$, and consider a weight matrix $W = f(Y,X) \in \mathbb{R}^{v \times p}$ predicting $X$ from $Y$:

$\label{eq:bda} \hat{X} = Y W \; .$

Then, the parameter matrix of the corresponding forward model is equal to

$\label{eq:cfm-para} A = \Sigma_y W \Sigma_x^{-1}$

with the “sample covariances”

$\label{eq:Sx-Sy} \begin{split} \Sigma_x &= \hat{X}^\mathrm{T} \hat{X} \\ \Sigma_y &= Y^\mathrm{T} Y \; . \end{split}$

Proof: The corresponding forward model is given by

$\label{eq:cfm} Y = \hat{X} A^\mathrm{T} + E \; ,$

subject to the constraint that predicted $X$ and errors $E$ are uncorrelated:

$\label{eq:cfm-con} \hat{X}^\mathrm{T} E = 0 \; .$

With that, we can directly derive the parameter matrix $A$:

$\label{eq:cfm-para-qed} \begin{split} Y &\overset{\eqref{eq:cfm}}{=} \hat{X} A^\mathrm{T} + E \\ \hat{X} A^\mathrm{T} &= Y - E \\ \hat{X}^\mathrm{T} \hat{X} A^\mathrm{T} &= \hat{X}^\mathrm{T} (Y - E) \\ \hat{X}^\mathrm{T} \hat{X} A^\mathrm{T} &= \hat{X}^\mathrm{T} Y - \hat{X}^\mathrm{T} E \\ \hat{X}^\mathrm{T} \hat{X} A^\mathrm{T} &\overset{\eqref{eq:cfm-con}}{=} \hat{X}^\mathrm{T} Y \\ \hat{X}^\mathrm{T} \hat{X} A^\mathrm{T} &\overset{\eqref{eq:bda}}{=} W^\mathrm{T} Y^\mathrm{T} Y \\ \Sigma_x A^\mathrm{T} &\overset{\eqref{eq:Sx-Sy}}{=} W^\mathrm{T} \Sigma_y \\ A^\mathrm{T} &= \Sigma_x^{-1} W^\mathrm{T} \Sigma_y \\ A &= \Sigma_y W \Sigma_x^{-1} \; . \end{split}$
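The derivation can be checked numerically: for any weight matrix $W$ (as long as $\Sigma_x$ is invertible), the parameter matrix $A = \Sigma_y W \Sigma_x^{-1}$ makes the predicted $X$ and the forward-model errors uncorrelated. The sketch below, not part of the original proof, uses NumPy with random data and an ordinary-least-squares $W$ as one illustrative choice of $f(Y,X)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n, v, p = 100, 8, 3  # observations, data dimensions, predictors

Y = rng.standard_normal((n, v))
X = rng.standard_normal((n, p))

# Backward model (eq. bda): any W = f(Y, X) works; here, ordinary
# least squares predicting X from Y (an illustrative choice).
W, *_ = np.linalg.lstsq(Y, X, rcond=None)   # W is v x p
X_hat = Y @ W                               # predicted X, n x p

# "Sample covariances" as defined in the theorem (unnormalized)
Sx = X_hat.T @ X_hat                        # p x p
Sy = Y.T @ Y                                # v x v

# Forward model parameters from the theorem
A = Sy @ W @ np.linalg.inv(Sx)              # A is v x p

# Constraint (eq. cfm-con): predicted X and errors are uncorrelated
E = Y - X_hat @ A.T
print(np.allclose(X_hat.T @ E, 0, atol=1e-8))  # True
```

Note that the check $\hat{X}^\mathrm{T} E = 0$ succeeds regardless of how $W$ was obtained, mirroring the fact that the proof nowhere uses the specific form of $f(Y,X)$.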

Metadata: ID: P269 | shortcut: cfm-para | author: JoramSoch | date: 2021-10-21, 17:20.