Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Covariance ▷ Positive semi-definiteness

Theorem: Each covariance matrix is positive semi-definite:

\[\label{eq:covmat-symm} a^\mathrm{T} \Sigma_{XX} a \geq 0 \quad \text{for all} \quad a \in \mathbb{R}^n \; .\]

Proof: The covariance matrix of $X$ can be expressed in terms of expected values as follows:

\[\label{eq:covmat} \Sigma_{XX} = \Sigma(X) = \mathrm{E}\left[ (X-\mathrm{E}[X]) (X-\mathrm{E}[X])^\mathrm{T} \right] \; .\]
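In element-wise terms (spelled out here for clarity, not part of the original statement), the $(i,j)$-th entry of this matrix is the covariance of the components $X_i$ and $X_j$:

\[ \left( \Sigma_{XX} \right)_{ij} = \mathrm{Cov}(X_i, X_j) = \mathrm{E}\left[ (X_i - \mathrm{E}[X_i]) (X_j - \mathrm{E}[X_j]) \right] \; .\]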

A positive semi-definite matrix is a symmetric matrix whose eigenvalues are all non-negative or, equivalently,

\[\label{eq:psd} M \; \text{pos. semi-def.} \quad \Leftrightarrow \quad x^\mathrm{T} M x \geq 0 \quad \text{for all} \quad x \in \mathbb{R}^n \; .\]
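For illustration (a concrete example added here, not part of the original proof), consider the symmetric matrix

\[ M = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix} \; .\]

For any $x = (x_1, x_2)^\mathrm{T} \in \mathbb{R}^2$, the quadratic form satisfies

\[ x^\mathrm{T} M x = 2 x_1^2 + 2 x_1 x_2 + 2 x_2^2 = x_1^2 + (x_1 + x_2)^2 + x_2^2 \geq 0 \; ,\]

so $M$ is positive semi-definite; equivalently, its eigenvalues $1$ and $3$ are both non-negative.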

Now, for an arbitrary real column vector $a \in \mathbb{R}^n$, we have:

\[\label{eq:covmat-symm-s1} a^\mathrm{T} \Sigma_{XX} a \overset{\eqref{eq:covmat}}{=} a^\mathrm{T} \mathrm{E}\left[ (X-\mathrm{E}[X]) (X-\mathrm{E}[X])^\mathrm{T} \right] a \; .\]

Because the expected value is a linear operator and $a$ is a constant vector, $a^\mathrm{T}$ and $a$ can be moved inside the expectation:

\[\label{eq:covmat-symm-s2} a^\mathrm{T} \Sigma_{XX} a = \mathrm{E}\left[ a^\mathrm{T} (X-\mathrm{E}[X]) (X-\mathrm{E}[X])^\mathrm{T} a \right] \; .\]

Now define the scalar random variable

\[\label{eq:Y-X} Y = a^\mathrm{T} (X-\mu_X)\]

where $\mu_X = \mathrm{E}[X]$, and note that, since the transpose of a scalar is the scalar itself,

\[\label{eq:YT-Y} a^\mathrm{T} (X-\mu_X) = (X-\mu_X)^\mathrm{T} a \; .\]

Thus, combining \eqref{eq:covmat-symm-s2} with \eqref{eq:Y-X} and \eqref{eq:YT-Y}, we have:

\[\label{eq:covmat-symm-s3} a^\mathrm{T} \Sigma_{XX} a = \mathrm{E}\left[ \left( a^\mathrm{T} (X-\mu_X) \right) \left( (X-\mu_X)^\mathrm{T} a \right) \right] = \mathrm{E}\left[ Y \cdot Y \right] = \mathrm{E}\left[ Y^2 \right] \; .\]

Because $Y^2$ is a random variable that cannot become negative and the expected value of a non-negative random variable is itself non-negative, we finally have

\[\label{eq:covmat-symm-s4} a^\mathrm{T} \Sigma_{XX} a \geq 0\]

for any $a \in \mathbb{R}^n$.
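As a simple sanity check (an observation added here, not part of the original proof), choosing $a = e_i$, the $i$-th standard basis vector, reduces \eqref{eq:covmat-symm-s4} to the statement that every diagonal entry of the covariance matrix, i.e. every variance, is non-negative:

\[ e_i^\mathrm{T} \Sigma_{XX} \, e_i = \left( \Sigma_{XX} \right)_{ii} = \mathrm{Var}(X_i) \geq 0 \; .\]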

Sources:

Metadata: ID: P351 | shortcut: covmat-psd | author: JoramSoch | date: 2022-09-26, 11:26.