Weak law of large numbers

Theorem: Let $X_1, \ldots, X_n$ be independent and identically distributed random variables with expected value $\mathrm{E}(X_i) = \mu$ and finite variance $\mathrm{Var}(X_i) < \infty$ for $i = 1,\ldots,n$. The sample mean is defined as

\[\label{eq:mean-samp} \bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i \; .\]

Then, for any $\epsilon > 0$, the probability that the sample mean deviates from the expected value $\mu$ by less than $\epsilon$ approaches one as the sample size goes to infinity:

\[\label{eq:mean-wlln} \lim_{n \rightarrow \infty} \mathrm{Pr}\left( \left| \bar{X} - \mu \right| < \epsilon \right) = 1 \; .\]
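Before turning to the proof, this convergence can be illustrated numerically. The following Python sketch (not part of the formal proof) estimates $\mathrm{Pr}\left( \left| \bar{X} - \mu \right| < \epsilon \right)$ by Monte Carlo simulation; the exponential distribution, $\epsilon$ and the simulation sizes are arbitrary choices made for demonstration only:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, eps, reps = 1.0, 0.1, 2_000   # Exp(1) has mean mu = 1; eps and reps are arbitrary

for n in [10, 100, 1_000, 5_000]:
    # draw `reps` independent samples of size n and compute their sample means
    xbar = rng.exponential(scale=mu, size=(reps, n)).mean(axis=1)
    # the fraction of sample means within eps of mu estimates Pr(|Xbar - mu| < eps)
    print(n, np.mean(np.abs(xbar - mu) < eps))
```

The printed fractions increase towards one as $n$ grows, in line with the theorem.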

Proof: Since $X_1, \ldots, X_n$ are independent and identically distributed, they have the same mean, denoted as $\mu$, and the same variance, denoted as $\sigma^2$. Using the linearity of the expected value, the expected value of the sample mean becomes:

\[\label{eq:mean-samp-mean} \begin{split} \mathrm{E}\left( \bar{X} \right) &= \mathrm{E}\left( \frac{1}{n} \sum_{i=1}^{n} X_i \right) \\ &= \frac{1}{n} \sum_{i=1}^{n} \mathrm{E}\left( X_i \right) \\ &= \frac{1}{n} n \mathrm{E}\left( X_i \right) \\ &= \mu \; . \end{split}\]

Moreover, with the scaling of the variance upon multiplication and the additivity of the variance under independence, the variance of the sample mean becomes:

\[\label{eq:mean-samp-var} \begin{split} \mathrm{Var}\left( \bar{X} \right) &= \mathrm{Var}\left( \frac{1}{n} \sum_{i=1}^{n} X_i \right) \\ &= \frac{1}{n^2} \mathrm{Var}\left( \sum_{i=1}^{n} X_i \right) \\ &= \frac{1}{n^2} \sum_{i=1}^{n} \mathrm{Var}\left( X_i \right) \\ &= \frac{1}{n^2} n \sigma^2 \\ &= \frac{\sigma^2}{n} \; . \end{split}\]
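As a quick numerical check of these two identities (an illustrative sketch with arbitrarily chosen normal parameters, not drawn from the source), one can simulate many sample means and compare their empirical mean and variance to $\mu$ and $\sigma^2/n$:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma2, n, reps = 3.0, 4.0, 50, 100_000   # assumed example parameters

# simulate `reps` sample means of size n from N(mu, sigma2)
xbar = rng.normal(loc=mu, scale=np.sqrt(sigma2), size=(reps, n)).mean(axis=1)
print(xbar.mean())                  # close to E(Xbar) = mu = 3.0
print(xbar.var(), sigma2 / n)       # both close to Var(Xbar) = sigma2/n = 0.08
```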

Chebyshev's inequality bounds the probability that a random variable $X$ with finite variance deviates from its mean by at least any positive number $x > 0$:

\[\label{eq:cheb-ineq} \mathrm{Pr}\left( \left| X - \mathrm{E}(X) \right| \geq x \right) \leq \frac{\mathrm{Var}(X)}{x^2} \; .\]
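To see the inequality at work numerically (a hedged sketch; the uniform distribution and thresholds below are arbitrary assumptions), one can compare the empirical tail probability with the Chebyshev bound:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0.0, 1.0, size=1_000_000)   # Uniform(0,1): E(X) = 1/2, Var(X) = 1/12

for t in [0.2, 0.3, 0.4]:
    tail = np.mean(np.abs(x - 0.5) >= t)    # empirical Pr(|X - E(X)| >= t)
    bound = (1 / 12) / t ** 2               # Chebyshev bound Var(X)/t^2
    print(t, tail, bound)                   # tail never exceeds bound
```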

Applying this inequality to the random variable $\bar{X}$ with $x = \epsilon$, we have:

\[\label{eq:mean-wlln-s1} \begin{split} \mathrm{Pr}\left( \left| \bar{X} - \mathrm{E}(\bar{X}) \right| \geq \epsilon \right) &\leq \frac{\mathrm{Var}(\bar{X})}{\epsilon^2} \\ \mathrm{Pr}\left( \left| \bar{X} - \mu \right| \geq \epsilon \right) &\leq \frac{\sigma^2}{n \epsilon^2} \; . \end{split}\]

Since the probabilities of complementary events sum to one, i.e. $\mathrm{Pr}\left( \left| \bar{X} - \mu \right| \geq \epsilon \right) = 1 - \mathrm{Pr}\left( \left| \bar{X} - \mu \right| < \epsilon \right)$, we have:

\[\label{eq:mean-wlln-s2} \begin{split} 1 - \mathrm{Pr}\left( \left| \bar{X} - \mu \right| < \epsilon \right) &\leq \frac{\sigma^2}{n \epsilon^2} \\ \mathrm{Pr}\left( \left| \bar{X} - \mu \right| < \epsilon \right) &\geq 1 - \frac{\sigma^2}{n \epsilon^2} \; . \end{split}\]
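This lower bound can likewise be checked by simulation (an illustrative sketch under assumed standard-normal parameters and an arbitrary $\epsilon$):

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma2, eps, reps = 0.0, 1.0, 0.25, 20_000   # assumed example parameters

for n in [20, 100, 500]:
    xbar = rng.normal(mu, np.sqrt(sigma2), size=(reps, n)).mean(axis=1)
    prob = np.mean(np.abs(xbar - mu) < eps)    # empirical Pr(|Xbar - mu| < eps)
    bound = 1.0 - sigma2 / (n * eps ** 2)      # lower bound from the inequality above
    print(n, prob, bound)                      # prob stays above bound; both tend to 1
```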

Since any probability is bounded above by one, combining this with the previous step gives

\[1 - \frac{\sigma^2}{n \epsilon^2} \leq \mathrm{Pr}\left( \left| \bar{X} - \mu \right| < \epsilon \right) \leq 1 \; .\]

Now taking the limit for $n \rightarrow \infty$, while considering that $\epsilon$ is a fixed positive number and $\sigma^2$ is finite, the lower bound converges to one,

\[\label{eq:mean-wlln-s3} \lim_{n \rightarrow \infty} \left( 1 - \frac{\sigma^2}{n \epsilon^2} \right) = 1 - \lim_{n \rightarrow \infty} \frac{\sigma^2 / \epsilon^2}{n} = 1 \; ,\]

such that, by the squeeze theorem,

\[\lim_{n \rightarrow \infty} \mathrm{Pr}\left( \left| \bar{X} - \mu \right| < \epsilon \right) = 1 \; .\]

Metadata: ID: P468 | shortcut: mean-wlln | author: JoramSoch | date: 2024-09-13, 11:02.