Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Other probability functions ▷ Moment-generating function of sum of independents

Theorem: Let $X$ and $Y$ be two independent random variables and let $Z = X + Y$. Then, the moment-generating function of $Z$ is given by

\[\label{eq:mgf-sumind} M_Z(t) = M_X(t) \cdot M_Y(t)\]

where $M_X(t)$, $M_Y(t)$ and $M_Z(t)$ are the moment-generating functions of $X$, $Y$ and $Z$, respectively.
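For example, if $X \sim \mathrm{Poiss}(\lambda_X)$ and $Y \sim \mathrm{Poiss}(\lambda_Y)$ are independent Poisson random variables, whose moment-generating functions are $M_X(t) = \exp \left[ \lambda_X \left( e^t - 1 \right) \right]$ and $M_Y(t) = \exp \left[ \lambda_Y \left( e^t - 1 \right) \right]$, then the theorem gives

\[M_Z(t) = M_X(t) \cdot M_Y(t) = \exp \left[ \lambda_X \left( e^t - 1 \right) \right] \cdot \exp \left[ \lambda_Y \left( e^t - 1 \right) \right] = \exp \left[ (\lambda_X + \lambda_Y) \left( e^t - 1 \right) \right] \; ,\]

which is the moment-generating function of a Poisson distribution with rate $\lambda_X + \lambda_Y$, i.e. the sum of two independent Poisson random variables is again Poisson.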

Proof: The moment-generating function of a random variable $X$ is

\[\label{eq:mgf} M_X(t) = \mathrm{E} \left( \exp \left[ t X \right] \right)\]

and therefore the moment-generating function of the sum $Z$ is given by

\[\label{eq:mgf-sumind-s1} \begin{split} M_Z(t) &= \mathrm{E} \left( \exp \left[ t Z \right] \right) \\ &= \mathrm{E} \left( \exp \left[ t (X + Y) \right] \right) \\ &= \mathrm{E} \left( \exp \left[ t X \right] \cdot \exp \left[ t Y \right] \right) \; . \end{split}\]

Because $X$ and $Y$ are independent, $\exp \left[ t X \right]$ and $\exp \left[ t Y \right]$ are also independent, and because the expected value is multiplicative for independent random variables, we have

\[\label{eq:mgf-sumind-s2} \begin{split} M_Z(t) &= \mathrm{E} \left( \exp \left[ t X \right] \right) \cdot \mathrm{E} \left( \exp \left[ t Y \right] \right) \\ &= M_X(t) \cdot M_Y(t) \; . \end{split}\]
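As a quick numerical sanity check of the theorem (a minimal sketch, not part of the proof; the distributions, sample size and evaluation point below are arbitrary illustrative choices), both sides of the identity can be approximated by Monte Carlo simulation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000   # number of Monte Carlo samples (arbitrary choice)
t = 0.5         # point at which the moment-generating functions are evaluated

# two independent random variables, e.g. X ~ Exp(rate 2) and Y ~ N(0, 1)
x = rng.exponential(scale=0.5, size=n)    # exponential with rate 2 (mean 0.5)
y = rng.normal(loc=0.0, scale=1.0, size=n)
z = x + y                                 # Z = X + Y

# sample estimates of M_Z(t), M_X(t) and M_Y(t)
mgf_z = np.mean(np.exp(t * z))
mgf_x = np.mean(np.exp(t * x))
mgf_y = np.mean(np.exp(t * y))

# the two printed values should agree up to Monte Carlo error
print(mgf_z, mgf_x * mgf_y)
```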
Sources:

Metadata: ID: P478 | shortcut: mgf-sumind | author: JoramSoch | date: 2024-11-08, 10:46.