Definition: Bayesian information criterion
Index: The Book of Statistical Proofs ▷ Model Selection ▷ Classical information criteria ▷ Bayesian information criterion ▷ Definition
Definition: Let $m$ be a generative model with likelihood function $p(y \vert \theta, m)$ and maximum likelihood estimates
\[\label{eq:MLE} \hat{\theta} = \operatorname*{arg\,max}_\theta \log p(y \vert \theta, m) \; .\]
Then, the Bayesian information criterion (BIC) of this model is defined as
\[\label{eq:BIC} \mathrm{BIC}(m) = -2 \log p(y \vert \hat{\theta}, m) + k \log n\]
where $n$ is the number of data points and $k$ is the number of free parameters estimated via \eqref{eq:MLE}.
Sources:
- Schwarz G (1978): "Estimating the Dimension of a Model"; in: The Annals of Statistics, vol. 6, no. 2, pp. 461-464; URL: https://www.jstor.org/stable/2958889.
Metadata: ID: D24 | shortcut: bic | author: JoramSoch | date: 2020-02-25, 12:21.
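As a minimal sketch of the definition, the snippet below computes the BIC for a linear-Gaussian model $y = Xb + \varepsilon$, $\varepsilon \sim \mathcal{N}(0, \sigma^2 I_n)$, whose maximum likelihood estimates are the ordinary least squares coefficients and the residual variance with denominator $n$. The function name and the linear-Gaussian setup are illustrative assumptions, not part of the definition above; the definition itself applies to any model with a likelihood function.

```python
import numpy as np

def bic_linear_gaussian(y, X):
    """BIC of a linear-Gaussian model y = X b + e, e ~ N(0, s2 I)
    (illustrative sketch; name and model choice are assumptions)."""
    n, p = X.shape
    # MLE of the coefficients = OLS solution
    b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b_hat
    # MLE of the noise variance (denominator n, not n - p)
    s2_hat = resid @ resid / n
    # maximized log-likelihood log p(y | theta_hat, m)
    log_lik = -n / 2 * (np.log(2 * np.pi * s2_hat) + 1)
    # free parameters estimated via MLE: p coefficients plus s2
    k = p + 1
    return -2 * log_lik + k * np.log(n)
```

With this convention, a lower BIC indicates a better trade-off between fit (the $-2 \log p(y \vert \hat{\theta}, m)$ term) and complexity (the $k \log n$ penalty), so among candidate models one would select the one minimizing BIC.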