Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Probability ▷ Conditional independence

Definition: Generally speaking, random variables are said to be conditionally independent given another random variable if they are statistically independent under their conditional probability distributions given this random variable.

1) A set of discrete random variables $X_1, \ldots, X_n$ with possible values $\mathcal{X}_1, \ldots, \mathcal{X}_n$ is called conditionally independent given the random variable $Y$ with possible values $\mathcal{Y}$, if

$\label{eq:disc-ind} p(X_1 = x_1, \ldots, X_n = x_n|Y = y) = \prod_{i=1}^{n} p(X_i = x_i|Y = y) \quad \text{for all} \; x_i \in \mathcal{X}_i \quad \text{and all} \; y \in \mathcal{Y}$

where $p(x_1, \ldots, x_n \vert y)$ are the joint (conditional) probabilities of $X_1, \ldots, X_n$ given $Y$ and $p(x_i \vert y)$ are the marginal (conditional) probabilities of $X_i$ given $Y$.
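The discrete definition can be checked numerically. The following sketch (the specific distributions are illustrative, not from the source) uses the classic example of two coin flips $X_1, X_2$ whose common bias is selected by $Y$: given $Y$, the flips satisfy the factorization above, yet marginally (with $Y$ unknown) they are dependent.

```python
import itertools

# Illustrative model: Y selects a coin bias, X1 and X2 are two flips of that coin.
p_y = {0: 0.5, 1: 0.5}          # P(Y = y)
bias = {0: 0.2, 1: 0.8}         # P(X_i = 1 | Y = y)

def p_x_given_y(x, y):
    """Marginal conditional probability p(x_i | y)."""
    return bias[y] if x == 1 else 1.0 - bias[y]

# Full joint distribution p(x1, x2, y) from the generative model
joint = {(x1, x2, y): p_y[y] * p_x_given_y(x1, y) * p_x_given_y(x2, y)
         for x1, x2, y in itertools.product([0, 1], repeat=3)}

def p_x1x2_given_y(x1, x2, y):
    """Joint conditional probability p(x1, x2 | y) = p(x1, x2, y) / p(y)."""
    return joint[(x1, x2, y)] / p_y[y]

# Definition check: p(x1, x2 | y) = p(x1 | y) * p(x2 | y) for all values
cond_ind = all(
    abs(p_x1x2_given_y(x1, x2, y) - p_x_given_y(x1, y) * p_x_given_y(x2, y)) < 1e-12
    for x1, x2, y in itertools.product([0, 1], repeat=3)
)
print(cond_ind)  # True: X1 and X2 are conditionally independent given Y

# Contrast: marginally, p(x1, x2) != p(x1) * p(x2) in general
def p_x1x2(x1, x2):
    return sum(joint[(x1, x2, y)] for y in p_y)

def p_x(x):
    return sum(p_x1x2(x, x2) for x2 in [0, 1])

print(abs(p_x1x2(1, 1) - p_x(1) * p_x(1)) > 1e-6)  # True: marginally dependent
```

This illustrates that conditional independence given $Y$ neither implies nor is implied by unconditional independence of $X_1, \ldots, X_n$.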

2) A set of continuous random variables $X_1, \ldots, X_n$ with possible values $\mathcal{X}_1, \ldots, \mathcal{X}_n$ is called conditionally independent given the random variable $Y$ with possible values $\mathcal{Y}$, if

$\label{eq:cond-ind-F} F_{X_1,\ldots,X_n|Y=y}(x_1,\ldots,x_n) = \prod_{i=1}^{n} F_{X_i|Y=y}(x_i) \quad \text{for all} \; x_i \in \mathcal{X}_i \quad \text{and all} \; y \in \mathcal{Y}$

or equivalently, provided the probability densities exist, if

$\label{eq:cont-ind-f} f_{X_1,\ldots,X_n|Y=y}(x_1,\ldots,x_n) = \prod_{i=1}^{n} f_{X_i|Y=y}(x_i) \quad \text{for all} \; x_i \in \mathcal{X}_i \quad \text{and all} \; y \in \mathcal{Y}$

where $F$ are the joint (conditional) or marginal (conditional) cumulative distribution functions and $f$ are the respective probability density functions.
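A minimal worked example for the continuous case (the specific distributions are illustrative, not from the source): let $X_1, X_2$ given $Y = y$ be independently distributed as $\mathcal{N}(y, 1)$. Then the joint conditional density factorizes as required by the definition:

$f_{X_1,X_2|Y=y}(x_1,x_2) = \prod_{i=1}^{2} \frac{1}{\sqrt{2\pi}} \exp\left( -\frac{(x_i-y)^2}{2} \right) = f_{X_1|Y=y}(x_1) \cdot f_{X_2|Y=y}(x_2) \; ,$

so $X_1$ and $X_2$ are conditionally independent given $Y$. If $Y$ is itself random with $\mathrm{Var}(Y) > 0$, however, the law of total covariance gives

$\mathrm{Cov}(X_1, X_2) = \mathrm{E}\left[ \mathrm{Cov}(X_1, X_2|Y) \right] + \mathrm{Cov}\left( \mathrm{E}[X_1|Y], \mathrm{E}[X_2|Y] \right) = 0 + \mathrm{Var}(Y) > 0 \; ,$

so $X_1$ and $X_2$ are marginally dependent despite being conditionally independent given $Y$.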

Sources:

Metadata: ID: D112 | shortcut: ind-cond | author: JoramSoch | date: 2020-11-19, 05:40.