Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Probability mass function ▷ Probability mass function of sum of independents

Theorem: Let $X$ and $Y$ be two independent discrete random variables with possible values $\mathcal{X}$ and $\mathcal{Y}$, and let $Z = X + Y$. Then, the probability mass function of $Z$ is given by

$\label{eq:pmf-sumind} \begin{split} f_Z(z) &= \sum_{y \in \mathcal{Y}} f_X(z-y) f_Y(y) \\ \text{or} \quad f_Z(z) &= \sum_{x \in \mathcal{X}} f_Y(z-x) f_X(x) \end{split}$

where $f_X(x)$, $f_Y(y)$ and $f_Z(z)$ are the probability mass functions of $X$, $Y$ and $Z$, respectively.
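
As a quick illustration (not part of the theorem), let $X$ and $Y$ be the outcomes of two independent fair six-sided dice, i.e. $f_X(x) = f_Y(y) = 1/6$ for all $x, y \in \{1, \ldots, 6\}$. The theorem then gives, for example,

$f_Z(7) = \sum_{y=1}^{6} f_X(7-y) \, f_Y(y) = 6 \cdot \frac{1}{6} \cdot \frac{1}{6} = \frac{1}{6} \; ,$

which is the familiar probability of rolling a total of $7$ with two dice.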

Proof: Using the definition of the probability mass function and the law of total probability, expressed as an expectation over $Y$, the first equation can be derived as follows:

$\label{eq:pmf-sumind-s1} \begin{split} f_Z(z) &= \mathrm{Pr}(Z = z) \\ &= \mathrm{Pr}(X + Y = z) \\ &= \mathrm{Pr}(X = z - Y) \\ &= \mathrm{E} \left[ \mathrm{Pr}(X = z - Y \vert Y = y) \right] \; . \end{split}$

Because $X$ and $Y$ are independent by assumption, conditional probabilities are equal to marginal probabilities, i.e. $\mathrm{Pr}(X = z - Y \vert Y = y) = \mathrm{Pr}(X = z - Y)$, and we have:

$\label{eq:pmf-sumind-s2} \begin{split} f_Z(z) &= \mathrm{E} \left[ \mathrm{Pr}(X = z - Y) \right] \\ &= \mathrm{E} \left[ f_X(z-Y) \right] \\ &= \sum_{y \in \mathcal{Y}} f_X(z-y) f_Y(y) \; . \end{split}$
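
Here, the last step applies the expected value of a function of a discrete random variable, $\mathrm{E}[g(Y)] = \sum_{y \in \mathcal{Y}} g(y) \, f_Y(y)$, with $g(y) = f_X(z-y)$.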

The second equation follows analogously, by exchanging the roles of $X$ and $Y$.
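
Because the result is just a discrete convolution, it can also be checked numerically. Below is a minimal sketch in Python; the distributions are an illustrative assumption (two independent fair six-sided dice, as in the example above), not part of the theorem. It computes $f_Z$ via the first equation of the theorem and verifies the result by brute-force enumeration of all $36$ equally likely outcomes:

```python
from itertools import product
from fractions import Fraction

# Illustrative assumption: X and Y are two independent fair
# six-sided dice, so f_X(x) = f_Y(y) = 1/6 for x, y in {1, ..., 6}.
f_X = {x: Fraction(1, 6) for x in range(1, 7)}
f_Y = {y: Fraction(1, 6) for y in range(1, 7)}

# Theorem: f_Z(z) = sum over y in Y of f_X(z - y) * f_Y(y),
# where f_X(z - y) = 0 whenever z - y lies outside the support of X.
support_Z = {x + y for x in f_X for y in f_Y}
f_Z = {z: sum(f_X.get(z - y, Fraction(0)) * f_Y[y] for y in f_Y)
       for z in sorted(support_Z)}

# Independent check: count favorable outcomes among all 36 pairs (x, y).
for z, p in f_Z.items():
    count = sum(1 for x, y in product(f_X, f_Y) if x + y == z)
    assert p == Fraction(count, 36), (z, p, count)
    print(f"f_Z({z}) = {p}")  # e.g. f_Z(7) = 1/6
```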

Sources:

Metadata: ID: P257 | shortcut: pmf-sumind | author: JoramSoch | date: 2021-08-30, 09:14.