Proof: Independence of products of a multivariate normal random vector
Theorem: Let $X$ be an $n \times 1$ random vector following a multivariate normal distribution:
\[\label{eq:mvn} X \sim \mathcal{N}(\mu, \Sigma)\]and consider two matrices $A \in \mathbb{R}^{k \times n}$ and $B \in \mathbb{R}^{l \times n}$. Then, $AX$ and $BX$ are independent if and only if the matrix product of $A$ and $B^\mathrm{T}$, weighted by the covariance matrix $\Sigma$, is equal to the zero matrix:
\[\label{eq:mvn-indprod} AX \quad \text{and} \quad BX \quad \text{ind.} \quad \Leftrightarrow \quad A \Sigma B^\mathrm{T} = 0_{kl} \; .\]Proof: Define a new matrix $C$ as
\[\label{eq:C} C = \left[ \begin{array}{c} A \\ B \end{array} \right] \in \mathbb{R}^{(k+l) \times n} \; .\]Then, due to the linear transformation theorem, we have
\[\label{eq:CX} CX = \left[ \begin{array}{c} AX \\ BX \end{array} \right] \sim \mathcal{N}\left( \left[ \begin{array}{c} A\mu \\ B\mu \end{array} \right], C \Sigma C^\mathrm{T} \right)\]with the combined covariance matrix
\[\label{eq:CSC} C \Sigma C^\mathrm{T} = \left[ \begin{array}{cc} A \Sigma A^\mathrm{T} & A \Sigma B^\mathrm{T} \\ B \Sigma A^\mathrm{T} & B \Sigma B^\mathrm{T} \end{array} \right] \; .\]We know that two components of a multivariate normal random vector are independent if and only if their cross-covariance entries in the joint covariance matrix are zero. Thus, $AX$ and $BX$ are independent if and only if
\[\label{eq:mvn-indprod-qed} A \Sigma B^\mathrm{T} = (B \Sigma A^\mathrm{T})^\mathrm{T} = 0_{kl}\]where $0_{kl}$ is the $k \times l$ zero matrix. This proves the result in \eqref{eq:mvn-indprod}.
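To illustrate the condition in \eqref{eq:mvn-indprod}, the following is a minimal numerical sketch in Python/NumPy (not part of the original proof). The matrices $A$, $B$ and the covariance matrix $\Sigma$ are illustrative choices constructed so that $A \Sigma B^\mathrm{T} = 0_{kl}$, in which case the empirical cross-covariance of $AX$ and $BX$ should be close to zero:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices: Sigma couples the first two coordinates of X,
# A extracts those two coordinates (k = 2), B extracts the third (l = 1).
n = 3
mu = np.zeros(n)
Sigma = np.array([[2.0, 1.0, 0.0],
                  [1.0, 2.0, 0.0],
                  [0.0, 0.0, 1.0]])
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
B = np.array([[0.0, 0.0, 1.0]])

# The condition from the theorem: A Sigma B^T is the k-by-l zero matrix
print(A @ Sigma @ B.T)                   # [[0.], [0.]]

# Draw samples of X (rows) and form AX and BX
X = rng.multivariate_normal(mu, Sigma, size=100_000)
AX, BX = X @ A.T, X @ B.T

# Empirical cross-covariance of AX and BX, which should vanish
cross_cov = (AX - AX.mean(0)).T @ (BX - BX.mean(0)) / (len(X) - 1)
print(cross_cov)                         # approximately the 2-by-1 zero matrix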
- jld (2018): "Understanding t-test for linear regression"; in: StackExchange CrossValidated, retrieved on 2022-12-13; URL: https://stats.stackexchange.com/a/344008.
Metadata: ID: P394 | shortcut: mvn-indprod | author: JoramSoch | date: 2022-12-13, 16:44.