
Definition: Let $m$ be a generative model with likelihood function $p(y \vert \theta, m)$ and maximum likelihood estimates

\[\label{eq:MLE} \hat{\theta} = \operatorname*{arg\,max}_\theta \log p(y | \theta, m) \; .\]

Then, the Bayesian information criterion (BIC) of this model is defined as

\[\label{eq:BIC} \mathrm{BIC}(m) = -2 \log p(y | \hat{\theta}, m) + k \log n\]

where $n$ is the number of data points and $k$ is the number of free parameters estimated via \eqref{eq:MLE}.
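As an illustration (not part of the original definition), the following Python sketch computes the BIC for a univariate Gaussian model fitted by maximum likelihood; the simulated data, the function name `bic`, and the choice of model are assumptions made only for this example.

```python
import numpy as np

def bic(log_likelihood_max, k, n):
    """BIC(m) = -2 log p(y | theta_hat, m) + k log n."""
    return -2.0 * log_likelihood_max + k * np.log(n)

# Hypothetical example: univariate Gaussian fitted by maximum likelihood
rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=2.0, size=100)
n = y.size

# ML estimates of mean and variance (k = 2 free parameters)
mu_hat = y.mean()
sigma2_hat = y.var()  # ML variance uses 1/n, not 1/(n-1)

# Maximized log-likelihood of the Gaussian model at the ML estimates
ll_max = np.sum(-0.5 * np.log(2 * np.pi * sigma2_hat)
                - (y - mu_hat) ** 2 / (2 * sigma2_hat))

print(bic(ll_max, k=2, n=n))
```

Lower BIC values indicate a better trade-off between model fit (the first term) and model complexity (the penalty $k \log n$).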

 

Metadata: ID: D24 | shortcut: bic | author: JoramSoch | date: 2020-02-25, 12:21.