Index: The Book of Statistical Proofs ▷ General Theorems ▷ Estimation theory ▷ Basic concepts of estimation ▷ Biased vs. unbiased

Definition: Let $\hat{\theta}: \mathcal{Y} \rightarrow \Theta$ be an estimator of a parameter $\theta \in \Theta$, computed from data $y \in \mathcal{Y}$. Then,

  • $\hat{\theta}$ is called an unbiased estimator if its expected value equals the parameter it estimates: $\mathrm{E}(\hat{\theta}) = \theta$, where the expectation is taken over all possible samples $y$ from which $\hat{\theta}$ is computed;

  • $\hat{\theta}$ is called a biased estimator otherwise, i.e. when $\mathrm{E}(\hat{\theta}) \neq \theta$.
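The definition above can be illustrated with a small simulation (not part of the original entry; sample size, distribution, and number of replications are chosen for illustration). The variance estimator that divides by $n$ is biased, with expected value $\frac{n-1}{n}\sigma^2$, whereas dividing by $n-1$ (Bessel's correction) yields an unbiased estimator. Averaging each estimator over many simulated samples approximates its expected value:

```python
import numpy as np

rng = np.random.default_rng(0)
true_var = 4.0           # variance of N(0, 2^2)
n, reps = 10, 200_000    # small n makes the bias clearly visible

# Draw many independent samples of size n
samples = rng.normal(loc=0.0, scale=2.0, size=(reps, n))

# Compute both variance estimators on every sample
biased = samples.var(axis=1, ddof=0)     # divides by n
unbiased = samples.var(axis=1, ddof=1)   # divides by n - 1 (Bessel's correction)

# The average over samples approximates E(theta_hat)
print(np.mean(biased))    # close to (n-1)/n * true_var = 3.6, not 4.0 -> biased
print(np.mean(unbiased))  # close to true_var = 4.0 -> unbiased
```

With $n = 10$, the biased estimator's mean settles near $3.6$ rather than the true variance $4.0$, matching the factor $\frac{n-1}{n} = 0.9$ exactly in expectation.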

Sources:

Metadata: ID: D209 | shortcut: est-bias | author: JoramSoch | date: 2024-11-08, 10:53.