Index: The Book of Statistical Proofs ▷ Model Selection ▷ Bayesian model selection ▷ Model evidence ▷ Cross-validated log model evidence

Definition: Let there be a data set $y$ with mutually exclusive and collectively exhaustive subsets $y_1, \ldots, y_S$. Assume a generative model $m$ with model parameters $\theta$ implying a likelihood function $p(y \vert \theta, m)$ and a non-informative prior density $p_{\mathrm{ni}}(\theta \vert m)$.

Then, the cross-validated log model evidence of $m$ is given by

\[\label{eq:cvLME} \mathrm{cvLME}(m) = \sum_{i=1}^{S} \log \int p( y_i \vert \theta, m ) \, p( \theta \vert y_{\neg i}, m ) \, \mathrm{d}\theta\]

where $y_{\neg i} = \bigcup_{j \neq i} y_j$ is the union of all data subsets except $y_i$, and $p( \theta \vert y_{\neg i}, m )$ is the posterior distribution obtained from $y_{\neg i}$ using the non-informative prior distribution $p_{\mathrm{ni}}(\theta \vert m)$:

\[\label{eq:post} p( \theta \vert y_{\neg i}, m ) = \frac{p( y_{\neg i} \vert \theta, m ) \, p_{\mathrm{ni}}(\theta \vert m)}{p( y_{\neg i} \vert m )} \; .\]
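
To illustrate the two equations above, the following is a minimal sketch, assuming a univariate Gaussian model with known variance $\sigma^2$ and a flat non-informative prior over the mean, so that $p(\theta \vert y_{\neg i}, m)$ has the simple form $\mathcal{N}(\bar{y}_{\neg i}, \sigma^2/n_{\neg i})$. Each fold's out-of-sample log model evidence $\log \int p(y_i \vert \theta, m)\, p(\theta \vert y_{\neg i}, m)\, \mathrm{d}\theta$ is approximated by Monte Carlo averaging of the likelihood over posterior samples. The function name `cross_validated_lme` and all parameter defaults are illustrative and not part of the source.

```python
# Minimal sketch: cvLME for a univariate Gaussian with known variance
# and a flat (improper) non-informative prior on the mean (assumed setup).

import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

def cross_validated_lme(y, sigma=1.0, n_folds=4, n_samples=10000, seed=0):
    """Monte Carlo estimate of the cross-validated log model evidence."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(y, n_folds)           # subsets y_1, ..., y_S
    cvlme = 0.0
    for i in range(n_folds):
        y_i = folds[i]                            # held-out subset y_i
        y_not_i = np.concatenate([folds[j] for j in range(n_folds) if j != i])
        # Posterior of the mean from y_{-i} under the flat prior:
        # mu | y_{-i} ~ N(mean(y_{-i}), sigma^2 / n_{-i})
        post_mean = y_not_i.mean()
        post_sd = sigma / np.sqrt(len(y_not_i))
        mu_samples = rng.normal(post_mean, post_sd, size=n_samples)
        # log p(y_i | mu_s, m) for each posterior sample mu_s
        log_lik = norm.logpdf(y_i[:, None], loc=mu_samples, scale=sigma).sum(axis=0)
        # log of the posterior-averaged likelihood = fold's out-of-sample log evidence
        cvlme += logsumexp(log_lik) - np.log(n_samples)
    return cvlme

# Example: data simulated from N(1, 1)
y = np.random.default_rng(1).normal(1.0, 1.0, size=100)
print(cross_validated_lme(y))
```

The `logsumexp` call averages the per-sample likelihoods in log space, which avoids underflow when $y_i$ contains many observations; for this conjugate Gaussian case the integral could also be evaluated in closed form, but the Monte Carlo form mirrors the general definition.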
 
Sources:

Metadata: ID: D111 | shortcut: cvlme | author: JoramSoch | date: 2020-11-19, 04:55.