Definition: Biased vs. unbiased estimator
Definition: Let $\hat{\theta}: \mathcal{Y} \rightarrow \Theta$ be an estimator of a parameter $\theta \in \Theta$ from data $y \in \mathcal{Y}$. Then,
- $\hat{\theta}$ is called an unbiased estimator, if its expected value is equal to the parameter it is estimating: $\mathrm{E}_{\hat{\theta}}(\hat{\theta}) = \theta$, where the expectation is taken over all possible samples $y$ leading to values of $\hat{\theta}$;
- $\hat{\theta}$ is called a biased estimator otherwise, i.e. when $\mathrm{E}_{\hat{\theta}}(\hat{\theta}) \neq \theta$.
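For illustration (a standard example, not part of the definition itself; the symbols $n$, $\mu$ and $\sigma^2$ are introduced here), assume independent and identically distributed observations $y = \{y_1, \ldots, y_n\}$ with mean $\mu$ and variance $\sigma^2$. Then the sample mean is an unbiased estimator of $\mu$, because

$\mathrm{E}\left( \bar{y} \right) = \mathrm{E}\left( \frac{1}{n} \sum_{i=1}^{n} y_i \right) = \frac{1}{n} \sum_{i=1}^{n} \mathrm{E}(y_i) = \mu \; ,$

whereas the uncorrected sample variance $\hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (y_i - \bar{y})^2$ is a biased estimator of $\sigma^2$, because

$\mathrm{E}\left( \hat{\sigma}^2 \right) = \frac{n-1}{n} \, \sigma^2 \neq \sigma^2 \; .$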
Sources:
- Wikipedia (2024): "Estimator"; in: Wikipedia, the free encyclopedia, retrieved on 2024-11-08; URL: https://en.wikipedia.org/wiki/Estimator#Bias.
Metadata: ID: D209 | shortcut: est-bias | author: JoramSoch | date: 2024-11-08, 10:53.