Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Cumulative distribution function ▷ Cumulative distribution function of sum of independents

Theorem: Let $X$ and $Y$ be two independent random variables and let $Z = X + Y$. Then, the cumulative distribution function of $Z$ is given by

\[\label{eq:cdf-sumind} \begin{split} F_Z(z) &= \mathrm{E}\left[ F_X(z-Y) \right] \\ \text{or} \quad F_Z(z) &= \mathrm{E}\left[ F_Y(z-X) \right] \end{split}\]

where $F_X(x)$, $F_Y(y)$ and $F_Z(z)$ are the cumulative distribution functions of $X$, $Y$ and $Z$, and $\mathrm{E}\left[ \cdot \right]$ denotes the expected value.
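
In particular, if $Y$ is a continuous random variable with probability density function $f_Y(y)$ (an additional assumption, used here only for illustration), the first expectation can be written out as an integral over that density:

\[F_Z(z) = \int_{-\infty}^{+\infty} F_X(z-y) \, f_Y(y) \, \mathrm{d}y \; .\]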

Proof: Using the definition of the cumulative distribution function, the first equation can be derived as follows:

\[\label{eq:cdf-sumind-qed} \begin{split} F_Z(z) &= \mathrm{Pr}(Z \leq z) \\ &= \mathrm{Pr}(X + Y \leq z) \\ &= \mathrm{Pr}(X \leq z - Y) \\ &= \mathrm{E} \left[ \mathrm{Pr}(X \leq z - y \vert Y = y) \right] \\ &= \mathrm{E} \left[ \mathrm{Pr}(X \leq z - y) \right] \\ &= \mathrm{E} \left[ F_X(z-Y) \right] \; . \end{split}\]

Note that the third-last transition follows from the law of total probability, and the second-last transition is justified by the fact that $X$ and $Y$ are independent, such that conditional probabilities are equal to marginal probabilities. The second equation can be derived by switching the roles of $X$ and $Y$.
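
As an illustrative check (not part of the original proof), here is a minimal Monte Carlo sketch in Python, assuming for concreteness that $X$ and $Y$ are independent standard normal variables, so that $Z = X + Y \sim \mathcal{N}(0, 2)$ and the exact $F_Z(z)$ is available for comparison:

```python
import numpy as np
from scipy.stats import norm

# Assumption (illustration only): X ~ N(0,1) and Y ~ N(0,1) independent,
# so Z = X + Y ~ N(0, 2) and F_Z(z) is known in closed form.
rng = np.random.default_rng(0)
z = 1.5
n = 1_000_000

# Monte Carlo estimate of E[F_X(z - Y)] by averaging over samples of Y
y_samples = rng.standard_normal(n)
estimate = norm.cdf(z - y_samples).mean()

# Exact CDF of Z = X + Y, i.e. N(0, 2) evaluated at z
exact = norm.cdf(z, loc=0, scale=np.sqrt(2))

print(f"E[F_X(z - Y)] ~= {estimate:.4f}")
print(f"F_Z(z)        =  {exact:.4f}")
```

For a sample size of this order, the Monte Carlo average of $F_X(z-Y)$ should agree with the exact $F_Z(z)$ to about three decimal places.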

Sources:

Metadata: ID: P256 | shortcut: cdf-sumind | author: JoramSoch | date: 2021-08-30, 08:53.