Index: The Book of Statistical Proofs ▷ Probability Distributions ▷ Multivariate continuous distributions ▷ Multivariate t-distribution ▷ Relationship to F-distribution

Theorem: Let $X$ be an $n \times 1$ random vector following a multivariate t-distribution with mean $\mu$, scale matrix $\Sigma$ and degrees of freedom $\nu$:

\[\label{eq:X} X \sim t(\mu, \Sigma, \nu) \; .\]

Then, the centered, weighted and standardized quadratic form of $X$ follows an F-distribution with degrees of freedom $n$ and $\nu$:

\[\label{eq:mvt-f} (X-\mu)^\mathrm{T} \, \Sigma^{-1} (X-\mu)/n \sim F(n, \nu) \; .\]
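Before turning to the proof, the following Monte Carlo sketch (not part of the formal argument; all concrete values are illustrative) draws from the multivariate t-distribution via its normal/chi-squared construction and compares the resulting quadratic form to $F(n, \nu)$ using SciPy:

```python
# Monte Carlo sketch: sample X ~ t(mu, Sigma, nu) and check that
# (X - mu)^T Sigma^{-1} (X - mu) / n is compatible with F(n, nu).
# All concrete values below are arbitrary example values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, nu, ns = 3, 7, 100_000                      # dimension, degrees of freedom, sample size
mu = np.array([1.0, -2.0, 0.5])                # example mean vector
Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 0.5],
                  [0.0, 0.5, 2.0]])            # example positive-definite scale matrix

# draw X via the construction X = mu + L*U / sqrt(V/nu) with L L^T = Sigma
L = np.linalg.cholesky(Sigma)
U = rng.standard_normal((ns, n))
V = rng.chisquare(nu, size=ns)
X = mu + (U @ L.T) / np.sqrt(V / nu)[:, None]

# centered, weighted and standardized quadratic form
D = X - mu
Q = np.einsum('ij,ij->i', D @ np.linalg.inv(Sigma), D) / n

print(stats.kstest(Q, stats.f(n, nu).cdf))     # large p-value => consistent with F(n, nu)
```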

Proof: The linear transformation theorem for the multivariate t-distribution states

\[\label{eq:mvt-ltt} x \sim t(\mu, \Sigma, \nu) \quad \Rightarrow \quad y = Ax + b \sim t(A\mu + b, A \Sigma A^\mathrm{T}, \nu)\]

where $x$ is an $n \times 1$ random vector following a multivariate t-distribution, $A$ is an $m \times n$ matrix and $b$ is an $m \times 1$ vector. Define the following quantities

\[\label{eq:YZ} \begin{split} Y &= \Sigma^{-1/2} (X-\mu) = \Sigma^{-1/2} X - \Sigma^{-1/2} \mu \\ Z &= Y^\mathrm{T} Y / n = (X-\mu)^\mathrm{T} \, \Sigma^{-1} (X-\mu)/n \end{split}\]

where $\Sigma^{-1/2}$ is the symmetric matrix square root of the inverse of $\Sigma$. Then, applying \eqref{eq:mvt-ltt} to \eqref{eq:YZ} with \eqref{eq:X}, one obtains the distribution of $Y$ as

\[\label{eq:Y-dist} \begin{split} Y &\sim t(\Sigma^{-1/2} \mu - \Sigma^{-1/2} \mu, \Sigma^{-1/2} \Sigma \, \Sigma^{-1/2}, \nu) \\ &= t(0_n, \Sigma^{-1/2} \Sigma^{1/2} \Sigma^{1/2} \Sigma^{-1/2}, \nu) \\ &= t(0_n, I_n, \nu) \; , \end{split}\]

i.e. the marginal distributions of the individual entries of $Y$ are univariate t-distributions with $\nu$ degrees of freedom:

\[\label{eq:yi-dist} Y_i \sim t(\nu), \; i = 1,\ldots,n \; .\]
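As a concrete illustration of this standardization step, the following sketch (with an arbitrary example $\Sigma$) computes the symmetric square root of $\Sigma^{-1}$ via an eigendecomposition and confirms that $\Sigma^{-1/2} \Sigma \, \Sigma^{-1/2} = I_n$, as used in the derivation above:

```python
# Sketch of the standardization step: the symmetric square root of Sigma^{-1}
# satisfies Sigma^{-1/2} Sigma Sigma^{-1/2} = I_n (example Sigma only).
import numpy as np

Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 0.5],
                  [0.0, 0.5, 2.0]])            # example positive-definite scale matrix
w, Umat = np.linalg.eigh(Sigma)                # Sigma = Umat diag(w) Umat^T
Sigma_inv_sqrt = Umat @ np.diag(1.0 / np.sqrt(w)) @ Umat.T

print(np.allclose(Sigma_inv_sqrt @ Sigma @ Sigma_inv_sqrt, np.eye(3)))   # True
```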

Note that a random variable $T$ following a t-distribution with $n$ degrees of freedom can equivalently be expressed in terms of a standard normal random variable $U$ and a chi-squared random variable $V$:

\[\label{eq:t} T \sim t(n) \quad \Leftrightarrow \quad T = \frac{U}{\sqrt{V/n}} \quad \text{with independent} \quad U \sim \mathcal{N}(0,1) \quad \text{and} \quad V \sim \chi^2(n) \; .\]
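This representation can be illustrated numerically; the following sketch (example degrees of freedom only) compares simulated values of $U/\sqrt{V/n}$ against SciPy's t-distribution:

```python
# Sketch of the normal/chi-squared representation of the t-distribution:
# U / sqrt(V/n) with independent U ~ N(0,1) and V ~ chi^2(n).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, ns = 5, 100_000                             # example degrees of freedom, sample size
T = rng.standard_normal(ns) / np.sqrt(rng.chisquare(n, size=ns) / n)
print(stats.kstest(T, stats.t(n).cdf))         # large p-value => consistent with t(n)
```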

With that, $Z$ from \eqref{eq:YZ} can be rewritten as follows, noting that, by construction of the multivariate t-distribution, all entries of $Y$ share the same chi-squared random variable $V \sim \chi^2(\nu)$ which is independent of the jointly standard normal variables $U_1, \ldots, U_n$:

\[\label{eq:Z-eq-s1} \begin{split} Z &\overset{\eqref{eq:YZ}}{=} Y^\mathrm{T} Y / n \\ &= \frac{1}{n} \sum_{i=1}^n Y_i^2 \\ &\overset{\eqref{eq:t}}{=} \frac{1}{n} \sum_{i=1}^n \left( \frac{U_i}{\sqrt{V/\nu}} \right)^2 \\ &= \frac{\left( \sum_{i=1}^n U_i^2 \right)/n}{V/\nu} \; . \end{split}\]

Because, by definition, the sum of $n$ squared independent standard normal random variables follows a chi-squared distribution with $n$ degrees of freedom

\[\label{eq:chi2} U_1, \ldots, U_n \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0,1) \quad \Rightarrow \quad \sum_{i=1}^n U_i^2 \sim \chi^2(n) \; ,\]

the quantity $Z$ becomes a ratio of the following form

\[\label{eq:Z-eq-s2} Z = \frac{W/n}{V/\nu} \quad \text{with independent} \quad W = \sum_{i=1}^n U_i^2 \sim \chi^2(n) \quad \text{and} \quad V \sim \chi^2(\nu) \; ,\]

such that $Z$, by definition, follows an F-distribution:

\[\label{eq:Z-dist} Z = \frac{W/n}{V/\nu} \sim F(n, \nu) \; .\]
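The last two steps can also be checked by simulation (example values of $n$ and $\nu$ only): the sum of squared independent standard normals is compatible with $\chi^2(n)$, and the ratio of the two scaled chi-squared variables is compatible with $F(n, \nu)$:

```python
# Sketch of the two building blocks used above (example n and nu only):
# (1) sum of n squared independent standard normals ~ chi^2(n),
# (2) (W/n)/(V/nu) with independent W ~ chi^2(n), V ~ chi^2(nu) ~ F(n, nu).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, nu, ns = 3, 7, 100_000
S = (rng.standard_normal((ns, n)) ** 2).sum(axis=1)
print(stats.kstest(S, stats.chi2(n).cdf))                     # consistent with chi^2(n)

W, V = rng.chisquare(n, size=ns), rng.chisquare(nu, size=ns)
print(stats.kstest((W / n) / (V / nu), stats.f(n, nu).cdf))   # consistent with F(n, nu)
```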

Metadata: ID: P231 | shortcut: mvt-f | author: JoramSoch | date: 2021-05-04, 10:29.