Proof: Distribution of the inverse general linear model
Metadata: ID: P267 | shortcut: iglm-dist | author: JoramSoch | date: 2021-10-21, 16:03.
Theorem: Let there be a general linear model of the data matrix $Y \in \mathbb{R}^{n \times v}$, with design matrix $X \in \mathbb{R}^{n \times p}$, regression coefficients $B \in \mathbb{R}^{p \times v}$, covariance across rows $V \in \mathbb{R}^{n \times n}$ and covariance across columns $\Sigma \in \mathbb{R}^{v \times v}$:
\[\label{eq:glm} Y = X B + E, \; E \sim \mathcal{MN}(0, V, \Sigma) \; .\]Then, the inverse general linear model of $X \in \mathbb{R}^{n \times p}$ is given by
\[\label{eq:iglm} X = Y W + N, \; N \sim \mathcal{MN}(0, V, \Sigma_x)\]where $W \in \mathbb{R}^{v \times p}$ is a matrix such that $B \, W = I_p$, and the covariance across columns is $\Sigma_x = W^\mathrm{T} \Sigma W$.
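For example (a minimal illustration whose dimensions and numbers are chosen here for concreteness, not taken from the source): for $p = 1$ and $v = 2$, one may take
\[B = \begin{bmatrix} 1 & 1 \end{bmatrix}, \quad W = \begin{bmatrix} 1/2 \\ 1/2 \end{bmatrix}, \quad \text{so that} \quad B \, W = I_1 \quad \text{and} \quad \Sigma_x = W^\mathrm{T} \Sigma W = \tfrac{1}{4} \left( \Sigma_{11} + 2 \Sigma_{12} + \Sigma_{22} \right) \; .\]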
Proof: The linear transformation theorem for the matrix-normal distribution states that, for a random matrix $Z$ and fixed matrices $A$, $C$ and $D$ of conformable dimensions,
\[\label{eq:matn-ltt} Z \sim \mathcal{MN}(M, U, V) \quad \Rightarrow \quad A Z C + D \sim \mathcal{MN}(A M C + D, A U A^\mathrm{T}, C^\mathrm{T} V C) \; .\]The matrix $W$ exists if the rows of $B \in \mathbb{R}^{p \times v}$ are linearly independent, i.e. if $\mathrm{rk}(B) = p$; one such choice is the Moore–Penrose pseudoinverse of $B$ (see the remark after the proof). Right-multiplying the model \eqref{eq:glm} by $W$ and applying \eqref{eq:matn-ltt} with $A = I_n$, $C = W$ and $D = 0$ yields
\[\label{eq:iglm-s1} Y W = X B W + E W, \; E W \sim \mathcal{MN}(0, V, W^\mathrm{T} \Sigma W) \; .\]Employing $B \, W = I_p$ and rearranging, we have
\[\label{eq:iglm-s2} X = Y W - E W, \; E W \sim \mathcal{MN}(0, V, W^\mathrm{T} \Sigma W) \; .\]Substituting $N = - E W$, which by \eqref{eq:matn-ltt} with $A = -I_n$, $C = W$ and $D = 0$ follows the same matrix-normal distribution as $E W$, we get
\[\label{eq:iglm-s3} X = Y W + N, \; N \sim \mathcal{MN}(0, V, W^\mathrm{T} \Sigma W)\]which is equivalent to \eqref{eq:iglm}.
∎
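Remark: One convenient construction of $W$, a sketch assuming only $\mathrm{rk}(B) = p$ as in the proof (the theorem does not mandate this particular choice; any right inverse of $B$ works), is the Moore–Penrose pseudoinverse of $B$:
\[W = B^+ = B^\mathrm{T} \left( B B^\mathrm{T} \right)^{-1}, \quad \text{so that} \quad B \, W = B B^\mathrm{T} \left( B B^\mathrm{T} \right)^{-1} = I_p \; ,\]in which case the covariance across columns of the inverse model becomes
\[\Sigma_x = W^\mathrm{T} \Sigma W = \left( B B^\mathrm{T} \right)^{-1} B \, \Sigma \, B^\mathrm{T} \left( B B^\mathrm{T} \right)^{-1} \; .\]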
Sources: - Soch J, Allefeld C, Haynes JD (2020): "Inverse transformed encoding models – a solution to the problem of correlated trial-by-trial parameter estimates in fMRI decoding"; in: NeuroImage, vol. 209, art. 116449, Appendix C, Theorem 4; URL: https://www.sciencedirect.com/science/article/pii/S1053811919310407; DOI: 10.1016/j.neuroimage.2019.116449.