Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Expected value ▷ Linearity

Theorem: The expected value is a linear operator, i.e.

\[\label{eq:mean-lin} \begin{split} \mathrm{E}(X + Y) &= \mathrm{E}(X) + \mathrm{E}(Y) \\ \mathrm{E}(a\,X) &= a\,\mathrm{E}(X) \end{split}\]

for random variables $X$ and $Y$ and a constant $a$.
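
As a simple illustration of the theorem (not part of the original statement), let $X$ and $Y$ be the numbers shown by two fair six-sided dice and let $a = 2$; then

\[ \mathrm{E}(X) = \mathrm{E}(Y) = \sum_{k=1}^{6} k \cdot \frac{1}{6} = 3.5 \; , \quad \mathrm{E}(X + Y) = 3.5 + 3.5 = 7 \quad \text{and} \quad \mathrm{E}(2\,X) = 2 \cdot 3.5 = 7 \; . \]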

Proof:

1) If $X$ and $Y$ are discrete random variables, the expected value is

\[\label{eq:mean-disc} \mathrm{E}(X) = \sum_{x \in \mathcal{X}} x \cdot f_X(x)\]

and the law of marginal probability states that

\[\label{eq:lmp-disc} f_X(x) = \sum_{y \in \mathcal{Y}} f_{X,Y}(x,y) \; .\]

Applying this, we have

\[\label{eq:mean-lin-s1-disc} \begin{split} \mathrm{E}(X + Y) &= \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} (x+y) \cdot f_{X,Y}(x,y) \\ &= \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} x \cdot f_{X,Y}(x,y) + \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} y \cdot f_{X,Y}(x,y) \\ &= \sum_{x \in \mathcal{X}} x \sum_{y \in \mathcal{Y}} f_{X,Y}(x,y) + \sum_{y \in \mathcal{Y}} y \sum_{x \in \mathcal{X}} f_{X,Y}(x,y) \\ &\overset{\eqref{eq:lmp-disc}}{=} \sum_{x \in \mathcal{X}} x \cdot f_X(x) + \sum_{y \in \mathcal{Y}} y \cdot f_{Y}(y) \\ &\overset{\eqref{eq:mean-disc}}{=} \mathrm{E}(X) + \mathrm{E}(Y) \end{split}\]

as well as

\[\label{eq:mean-lin-s2-disc} \begin{split} \mathrm{E}(a\,X) &= \sum_{x \in \mathcal{X}} a \, x \cdot f_X(x) \\ &= a \, \sum_{x \in \mathcal{X}} x \cdot f_X(x) \\ &\overset{\eqref{eq:mean-disc}}{=} a \, \mathrm{E}(X) \; . \end{split}\]
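
As a numerical sanity check of the discrete case (not part of the proof), here is a minimal Python sketch that evaluates both identities for a small, hypothetical joint probability mass function; the specific pmf values and the constant `a` are arbitrary choices for illustration.

```python
import numpy as np

# Hypothetical joint pmf of X in {0, 1, 2} and Y in {0, 1}
# (rows indexed by x, columns by y); entries sum to 1.
f_XY = np.array([[0.10, 0.15],
                 [0.20, 0.25],
                 [0.05, 0.25]])
x_vals = np.array([0, 1, 2])
y_vals = np.array([0, 1])
a = 3.0

# Marginal pmfs via the law of marginal probability
f_X = f_XY.sum(axis=1)
f_Y = f_XY.sum(axis=0)

# Expected values from the marginals and from the joint pmf
E_X = np.sum(x_vals * f_X)
E_Y = np.sum(y_vals * f_Y)
E_X_plus_Y = np.sum((x_vals[:, None] + y_vals[None, :]) * f_XY)
E_aX = np.sum(a * x_vals * f_X)

print(np.isclose(E_X_plus_Y, E_X + E_Y))  # True: E(X+Y) = E(X) + E(Y)
print(np.isclose(E_aX, a * E_X))          # True: E(aX) = a E(X)
```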


2) If $X$ and $Y$ are continuous random variables, the expected value is

\[\label{eq:mean-cont} \mathrm{E}(X) = \int_{\mathcal{X}} x \cdot f_X(x) \, \mathrm{d}x\]

and the law of marginal probability states that

\[\label{eq:lmp-cont} f_X(x) = \int_{\mathcal{Y}} f_{X,Y}(x,y) \, \mathrm{d}y \; .\]

Applying this, we have

\[\label{eq:mean-lin-s1-cont} \begin{split} \mathrm{E}(X + Y) &= \int_{\mathcal{X}} \int_{\mathcal{Y}} (x+y) \cdot f_{X,Y}(x,y) \, \mathrm{d}y \, \mathrm{d}x \\ &= \int_{\mathcal{X}} \int_{\mathcal{Y}} x \cdot f_{X,Y}(x,y) \, \mathrm{d}y \, \mathrm{d}x + \int_{\mathcal{X}} \int_{\mathcal{Y}} y \cdot f_{X,Y}(x,y) \, \mathrm{d}y \, \mathrm{d}x \\ &= \int_{\mathcal{X}} x \int_{\mathcal{Y}} f_{X,Y}(x,y) \, \mathrm{d}y \, \mathrm{d}x + \int_{\mathcal{Y}} y \int_{\mathcal{X}} f_{X,Y}(x,y) \, \mathrm{d}x \, \mathrm{d}y \\ &\overset{\eqref{eq:lmp-cont}}{=} \int_{\mathcal{X}} x \cdot f_X(x) \, \mathrm{d}x + \int_{\mathcal{Y}} y \cdot f_Y(y) \, \mathrm{d}y \\ &\overset{\eqref{eq:mean-cont}}{=} \mathrm{E}(X) + \mathrm{E}(Y) \end{split}\]

as well as

\[\label{eq:mean-lin-s2-cont} \begin{split} \mathrm{E}(a\,X) &= \int_{\mathcal{X}} a \, x \cdot f_X(x) \, \mathrm{d}x \\ &= a \int_{\mathcal{X}} x \cdot f_X(x) \, \mathrm{d}x \\ &\overset{\eqref{eq:mean-cont}}{=} a \, \mathrm{E}(X) \; . \end{split}\]
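
Analogously, a minimal Python sketch for the continuous case, assuming the hypothetical joint density $f_{X,Y}(x,y) = x + y$ on the unit square and approximating the double integrals by Riemann sums on a grid:

```python
import numpy as np

# Hypothetical joint density f(x, y) = x + y on the unit square [0, 1] x [0, 1];
# it integrates to 1, and E(X) = E(Y) = 7/12 for this density.
n = 1000
x = np.linspace(0.0, 1.0, n)
y = np.linspace(0.0, 1.0, n)
dx, dy = x[1] - x[0], y[1] - y[0]
X, Y = np.meshgrid(x, y, indexing="ij")
f_XY = X + Y
a = 3.0

# Approximate the double integrals by Riemann sums on the grid
E_X = np.sum(X * f_XY) * dx * dy
E_Y = np.sum(Y * f_XY) * dx * dy
E_X_plus_Y = np.sum((X + Y) * f_XY) * dx * dy
E_aX = np.sum(a * X * f_XY) * dx * dy

print(np.isclose(E_X_plus_Y, E_X + E_Y))  # True: E(X+Y) = E(X) + E(Y)
print(np.isclose(E_aX, a * E_X))          # True: E(aX) = a E(X)
```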


Collectively, this shows that both requirements for linearity (additivity and homogeneity) are fulfilled by the expected value, for discrete as well as continuous random variables. The same derivation extends to the expected value of random vectors and of random matrices, since these expected values are defined element-wise.
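
For the vector-valued case, a minimal Monte Carlo sketch (the distributions of $X$ and $Y$ below are arbitrary choices for illustration) checks that the element-wise sample means, as estimates of the expected values, satisfy the same identities:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 100_000, 3
a = 2.5

# Hypothetical random vectors, drawn element-wise from arbitrary distributions
X = rng.normal(loc=[1.0, -2.0, 0.5], scale=1.0, size=(n, d))
Y = rng.exponential(scale=2.0, size=(n, d))

# Element-wise sample means serve as Monte Carlo estimates of the expected values
print(np.allclose((X + Y).mean(axis=0), X.mean(axis=0) + Y.mean(axis=0)))  # True
print(np.allclose((a * X).mean(axis=0), a * X.mean(axis=0)))               # True
```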

Sources:

Metadata: ID: P53 | shortcut: mean-lin | author: JoramSoch | date: 2020-02-13, 21:08.