Index: The Book of Statistical Proofs ▷ Model Selection ▷ Bayesian model selection ▷ Posterior model probability ▷ Calculation from log model evidences

Theorem: Let $m_1, \ldots, m_M$ be $M$ statistical models describing the same measured data $y$, with log model evidences $\mathrm{LME}(m_1), \ldots, \mathrm{LME}(m_M)$. Then, the posterior model probabilities are given by:

\[\label{eq:PMP-LME} p(m_i|y) = \frac{\exp[\mathrm{LME}(m_i)] \, p(m_i)}{\sum_{j=1}^{M} \exp[\mathrm{LME}(m_j)] \, p(m_j)}, \quad i = 1,\ldots,M \; ,\]

where $p(m_i)$ are the prior model probabilities.
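
For illustration (with hypothetical numbers), consider $M = 2$ models with uniform priors $p(m_1) = p(m_2) = 1/2$ and log model evidences $\mathrm{LME}(m_1) = -100$ and $\mathrm{LME}(m_2) = -102$. Then, by \eqref{eq:PMP-LME},

\[p(m_1|y) = \frac{\exp(-100) \cdot \tfrac{1}{2}}{\exp(-100) \cdot \tfrac{1}{2} + \exp(-102) \cdot \tfrac{1}{2}} = \frac{1}{1 + \exp(-2)} \approx 0.881 \; ,\]

so a difference of $2$ in log model evidence already favors $m_1$ by a Bayes factor of $\exp(2) \approx 7.4$.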

Proof: From Bayes' theorem, with the law of total probability over the model space in the denominator, the posterior model probability is given by

\[\label{eq:PMP-s1} p(m_i|y) = \frac{p(y|m_i) \, p(m_i)}{\sum_{j=1}^{M} p(y|m_j) \, p(m_j)} \; .\]

The definition of the log model evidence

\[\label{eq:LME} \mathrm{LME}(m) = \log p(y|m)\]

can be exponentiated to give

\[\label{eq:ME} \exp\left[ \mathrm{LME}(m) \right] = p(y|m)\]

Substituting \eqref{eq:ME} into \eqref{eq:PMP-s1}, we finally have:

\[\label{eq:PMP-s2} p(m_i|y) = \frac{\exp[\mathrm{LME}(m_i)] \, p(m_i)}{\sum_{j=1}^{M} \exp[\mathrm{LME}(m_j)] \, p(m_j)} \; .\]
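
In practice, \eqref{eq:PMP-s2} should not be evaluated literally: log model evidences are typically large negative numbers, so $\exp[\mathrm{LME}(m_i)]$ underflows to zero in floating-point arithmetic. Because subtracting a constant from all LMEs cancels between numerator and denominator, one can shift by $\max_j \mathrm{LME}(m_j)$ before exponentiating. A minimal Python sketch of this (the function name and example values are hypothetical, not from the source):

```python
import numpy as np

def posterior_model_probabilities(lme, prior=None):
    """Compute posterior model probabilities from log model evidences,
    implementing eq. (PMP-LME) with a max-shift for numerical stability."""
    lme = np.asarray(lme, dtype=float)
    if prior is None:
        # assume uniform prior model probabilities p(m_i) = 1/M
        prior = np.full(lme.shape, 1.0 / lme.size)
    # shifting all LMEs by their maximum cancels in the ratio,
    # but keeps exp() from underflowing for large negative LMEs
    w = np.exp(lme - lme.max()) * np.asarray(prior, dtype=float)
    return w / w.sum()

# hypothetical example: two models, uniform priors
print(posterior_model_probabilities([-100.0, -102.0]))  # approx. [0.881, 0.119]
```

The shift leaves \eqref{eq:PMP-s2} unchanged, since the factor $\exp[-\max_j \mathrm{LME}(m_j)]$ appears in both numerator and denominator.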
Sources:

Metadata: ID: P66 | shortcut: pmp-lme | author: JoramSoch | date: 2020-02-27, 21:33.