Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Probability ▷ Statistical independence

Definition: Generally speaking, random variables are called statistically independent, if their joint probability can be expressed as the product of their marginal probabilities.


1) A set of discrete random variables $X_1, \ldots, X_n$ with sets of possible values $\mathcal{X}_1, \ldots, \mathcal{X}_n$ is called statistically independent, if

\[\label{eq:disc-ind} p(X_1 = x_1, \ldots, X_n = x_n) = \prod_{i=1}^{n} p(X_i = x_i) \quad \text{for all} \; x_i \in \mathcal{X}_i, \; i = 1, \ldots, n\]

where $p(X_1 = x_1, \ldots, X_n = x_n)$ is the joint probability of $X_1, \ldots, X_n$ and $p(X_i = x_i)$ are the marginal probabilities of the individual random variables $X_i$.
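
As a minimal illustration (a sketch, not part of the original definition), the following Python snippet checks the discrete factorization condition above for two random variables with a hypothetical $2 \times 2$ joint probability table; the table values and variable names are assumptions chosen for illustration only.

```python
# Sketch: numerically checking the factorization condition for two
# discrete random variables X1, X2 with a hypothetical joint table.
import numpy as np

# hypothetical joint probabilities p(X1 = x1, X2 = x2);
# rows index the values of X1, columns index the values of X2
p_joint = np.array([[0.10, 0.30],
                    [0.15, 0.45]])

# marginal probabilities p(X1 = x1) and p(X2 = x2)
p_x1 = p_joint.sum(axis=1)   # [0.40, 0.60]
p_x2 = p_joint.sum(axis=0)   # [0.25, 0.75]

# X1 and X2 are statistically independent, if and only if the joint
# probabilities equal the product of the marginals for all (x1, x2)
independent = np.allclose(p_joint, np.outer(p_x1, p_x2))
print(independent)  # True for this particular table
```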


2) A set of continuous random variables $X_1, \ldots, X_n$ defined on the domains $\mathcal{X}_1, \ldots, \mathcal{X}_n$ is called statistically independent, if

\[\label{eq:cont-ind-F} F_{X_1,\ldots,X_n}(x_1,\ldots,x_n) = \prod_{i=1}^{n} F_{X_i}(x_i) \quad \text{for all} \; x_i \in \mathcal{X}_i, \; i = 1, \ldots, n\]

or equivalently, if the probability densities exist, if

\[\label{eq:cont-ind-f} f_{X_1,\ldots,X_n}(x_1,\ldots,x_n) = \prod_{i=1}^{n} f_{X_i}(x_i) \quad \text{for all} \; x_i \in \mathcal{X}_i, \; i = 1, \ldots, n\]

where $F_{X_1,\ldots,X_n}$ and $F_{X_i}$ are the joint and marginal cumulative distribution functions and $f_{X_1,\ldots,X_n}$ and $f_{X_i}$ are the corresponding joint and marginal probability density functions.
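
Analogously, a hedged sketch for the continuous case: the density factorization above can be verified numerically for two hypothetical jointly normal variables with zero correlation; the standard normal marginals, the evaluation grid and the use of scipy's multivariate_normal are assumptions for illustration, not part of the definition.

```python
# Sketch: checking the density factorization for two jointly normal
# variables with zero correlation (hence independent).
import numpy as np
from scipy.stats import norm, multivariate_normal

# grid of evaluation points (x1, x2)
x1, x2 = np.meshgrid(np.linspace(-3, 3, 25), np.linspace(-3, 3, 25))
points = np.dstack((x1, x2))

# joint density with identity covariance (zero correlation)
f_joint = multivariate_normal(mean=[0, 0], cov=np.eye(2)).pdf(points)

# product of the marginal densities f(x1) * f(x2)
f_product = norm.pdf(x1) * norm.pdf(x2)

# the factorization holds on the grid, consistent with independence
print(np.allclose(f_joint, f_product))  # True
```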

 
Sources:

Metadata: ID: D75 | shortcut: ind | author: JoramSoch | date: 2020-06-06, 07:16.