Proof: Moment-generating function of a sum of independent random variables
Theorem: Let $X$ and $Y$ be two independent random variables and let $Z = X + Y$. Then, the moment-generating function of $Z$ is given by
\[\label{eq:mgf-sumind} M_Z(t) = M_X(t) \cdot M_Y(t)\]where $M_X(t)$, $M_Y(t)$ and $M_Z(t)$ are the moment-generating functions of $X$, $Y$ and $Z$, respectively.
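For example, if $X$ and $Y$ are independent and Poisson-distributed with rates $\lambda_X$ and $\lambda_Y$, so that their moment-generating functions are $M_X(t) = \exp\left[ \lambda_X (e^t - 1) \right]$ and $M_Y(t) = \exp\left[ \lambda_Y (e^t - 1) \right]$, then the theorem yields
\[M_Z(t) = \exp\left[ \lambda_X (e^t - 1) \right] \cdot \exp\left[ \lambda_Y (e^t - 1) \right] = \exp\left[ (\lambda_X + \lambda_Y) (e^t - 1) \right] \; ,\]which is the moment-generating function of a Poisson random variable with rate $\lambda_X + \lambda_Y$.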
Proof: The moment-generating function of a random variable $X$ is
\[\label{eq:mgf} M_X(t) = \mathrm{E} \left( \exp \left[ t X \right] \right)\]and therefore the moment-generating function of the sum $Z$ is given by
\[\label{eq:mgf-sumind-s1} \begin{split} M_Z(t) &= \mathrm{E} \left( \exp \left[ t Z \right] \right) \\ &= \mathrm{E} \left( \exp \left[ t (X + Y) \right] \right) \\ &= \mathrm{E} \left( \exp \left[ t X \right] \cdot \exp \left[ t Y \right] \right) \; . \end{split}\]Because the expected value is multiplicative for independent random variables, we have
\[\label{eq:mgf-sumind-s2} \begin{split} M_Z(t) &= \mathrm{E} \left( \exp \left[ t X \right] \right) \cdot \mathrm{E} \left( \exp \left[ t Y \right] \right) \\ &= M_X(t) \cdot M_Y(t) \; . \end{split}\]∎
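Applying the theorem repeatedly, the result extends to a sum $Z = \sum_{i=1}^{n} X_i$ of mutually independent random variables $X_1, \ldots, X_n$, whose moment-generating function factorizes into the product of the individual moment-generating functions:
\[M_Z(t) = \prod_{i=1}^{n} M_{X_i}(t) \; .\]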
Sources: - Probability Fact (2021): "If X and Y are independent, the moment generating function (MGF)"; in: X, retrieved on 2024-11-08; URL: https://x.com/ProbFact/status/1468264616706859016.
Metadata: ID: P478 | shortcut: mgf-sumind | author: JoramSoch | date: 2024-11-08, 10:46.