Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Expected value ▷ Law of total expectation

Theorem: (law of total expectation, also called “law of iterated expectations”) Let $X$ be a random variable with expected value $\mathrm{E}(X)$ and let $Y$ be any random variable defined on the same probability space. Then, the expected value of the conditional expectation of $X$ given $Y$ is the same as the expected value of $X$:

$\label{eq:mean-tot} \mathrm{E}(X) = \mathrm{E}[\mathrm{E}(X \vert Y)] \; .$
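Before the formal proof, the identity can be checked numerically. The following Python sketch uses a small, arbitrary joint probability mass function (an assumption for illustration only, not part of the theorem) and compares $\mathrm{E}(X)$ with $\mathrm{E}[\mathrm{E}(X \vert Y)]$:

```python
# Numerical check of E(X) = E[E(X|Y)] on an illustrative joint pmf.
joint = {(0, 0): 0.1, (0, 1): 0.2,
         (1, 0): 0.2, (1, 1): 0.1,
         (2, 0): 0.1, (2, 1): 0.3}  # Pr(X = x, Y = y), sums to 1

X = sorted({x for x, _ in joint})
Y = sorted({y for _, y in joint})

# marginal pmf of Y and the direct expectation of X
p_Y = {y: sum(joint[(x, y)] for x in X) for y in Y}
E_X = sum(x * joint[(x, y)] for x in X for y in Y)

# conditional expectation E(X | Y = y), then its expectation over Y
E_X_given_Y = {y: sum(x * joint[(x, y)] for x in X) / p_Y[y] for y in Y}
E_E = sum(E_X_given_Y[y] * p_Y[y] for y in Y)

print(round(E_X, 10), round(E_E, 10))  # both sides agree (here: 1.1)
```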

Proof: Let $X$ and $Y$ be discrete random variables with sets of possible outcomes $\mathcal{X}$ and $\mathcal{Y}$; the continuous case is analogous, with sums replaced by integrals. Then, the expectation of the conditional expectation can be rewritten as:

$\label{eq:mean-tot-s1} \begin{split} \mathrm{E}[\mathrm{E}(X \vert Y)] &= \mathrm{E}\left[ \sum_{x \in \mathcal{X}} x \cdot \mathrm{Pr}(X = x \vert Y) \right] \\ &= \sum_{y \in \mathcal{Y}} \left[ \sum_{x \in \mathcal{X}} x \cdot \mathrm{Pr}(X = x \vert Y = y) \right] \cdot \mathrm{Pr}(Y = y) \\ &= \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} x \cdot \mathrm{Pr}(X = x \vert Y = y) \cdot \mathrm{Pr}(Y = y) \; . \end{split}$
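This first step can be traced in code: the outer sum over $y$ of the bracketed inner sum, and the flattened double sum, yield the same value. The joint pmf below is an illustrative assumption, not part of the proof:

```python
# Step 1: nested sum over y of [inner sum over x], versus the double sum.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.2,
         (1, 1): 0.1, (2, 0): 0.1, (2, 1): 0.3}  # illustrative pmf
X = sorted({x for x, _ in joint})
Y = sorted({y for _, y in joint})
p_Y = {y: sum(joint[(x, y)] for x in X) for y in Y}            # Pr(Y = y)
cond = {(x, y): joint[(x, y)] / p_Y[y] for x in X for y in Y}  # Pr(X=x|Y=y)

nested = sum(sum(x * cond[(x, y)] for x in X) * p_Y[y] for y in Y)
double = sum(x * cond[(x, y)] * p_Y[y] for x in X for y in Y)
assert abs(nested - double) < 1e-12  # same terms, same total
```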

Using the law of conditional probability, this becomes:

$\label{eq:mean-tot-s2} \begin{split} \mathrm{E}[\mathrm{E}(X \vert Y)] &= \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} x \cdot \mathrm{Pr}(X = x, Y = y) \\ &= \sum_{x \in \mathcal{X}} x \sum_{y \in \mathcal{Y}} \mathrm{Pr}(X = x, Y = y) \; . \end{split}$
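The law of conditional probability invoked here, $\mathrm{Pr}(X = x \vert Y = y) \cdot \mathrm{Pr}(Y = y) = \mathrm{Pr}(X = x, Y = y)$, can be verified term by term on the same kind of illustrative joint pmf (an assumption, not part of the proof):

```python
# Check Pr(X=x|Y=y) * Pr(Y=y) == Pr(X=x, Y=y) for every cell of the table.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.2,
         (1, 1): 0.1, (2, 0): 0.1, (2, 1): 0.3}  # illustrative pmf
p_Y = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}
for (x, y), p in joint.items():
    cond = p / p_Y[y]                      # Pr(X = x | Y = y)
    assert abs(cond * p_Y[y] - p) < 1e-12  # recovers the joint probability
```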

Using the law of marginal probability, this becomes:

$\label{eq:mean-tot-s3} \begin{split} \mathrm{E}[\mathrm{E}(X \vert Y)] &= \sum_{x \in \mathcal{X}} x \cdot \mathrm{Pr}(X = x) \\ &= \mathrm{E}(X) \; . \end{split}$
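The final step, marginalizing the joint pmf over $y$ to obtain $\mathrm{Pr}(X = x)$ and then taking the weighted sum, looks like this in code (joint pmf again an illustrative assumption):

```python
# Law of marginal probability: sum the joint pmf over y, then form E(X).
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.2,
         (1, 1): 0.1, (2, 0): 0.1, (2, 1): 0.3}  # illustrative pmf
p_X = {x: sum(p for (xx, _), p in joint.items() if xx == x)
       for x in {x for x, _ in joint}}  # Pr(X = x) by marginalization
E_X = sum(x * p for x, p in p_X.items())
print(E_X)  # E(X), here approximately 1.1
```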

Metadata: ID: P291 | shortcut: mean-tot | author: JoramSoch | date: 2021-11-26, 10:57.