Proof: Posterior model probabilities in terms of log model evidences
Theorem: Let $m_1, \ldots, m_M$ be $M$ statistical models with log model evidences $\mathrm{LME}(m_1), \ldots, \mathrm{LME}(m_M)$. Then, the posterior model probabilities are given by:
\[\label{eq:PMP-LME} p(m_i|y) = \frac{\exp[\mathrm{LME}(m_i)] \, p(m_i)}{\sum_{j=1}^{M} \exp[\mathrm{LME}(m_j)] \, p(m_j)}, \quad i = 1,\ldots,M \; ,\]where $p(m_i)$ are the prior model probabilities.
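Note that, under uniform prior model probabilities $p(m_i) = 1/M$, the priors cancel from numerator and denominator, so that \eqref{eq:PMP-LME} reduces to a softmax function of the log model evidences:
\[p(m_i|y) = \frac{\exp[\mathrm{LME}(m_i)]}{\sum_{j=1}^{M} \exp[\mathrm{LME}(m_j)]} \; .\]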
Proof: By Bayes' theorem, with the law of total probability applied in the denominator, the posterior model probability is
\[\label{eq:PMP-s1} p(m_i|y) = \frac{p(y|m_i) \, p(m_i)}{\sum_{j=1}^{M} p(y|m_j) \, p(m_j)} \; .\]The definition of the log model evidence
\[\label{eq:LME} \mathrm{LME}(m) = \log p(y|m)\]can be exponentiated to give
\[\label{eq:ME} \exp\left[ \mathrm{LME}(m) \right] = p(y|m)\]and substituting \eqref{eq:ME} into \eqref{eq:PMP-s1}, we finally have:
\[\label{eq:PMP-s2} p(m_i|y) = \frac{\exp[\mathrm{LME}(m_i)] \, p(m_i)}{\sum_{j=1}^{M} \exp[\mathrm{LME}(m_j)] \, p(m_j)} \; .\]∎
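In practice, log model evidences are often large in magnitude, so naive exponentiation in \eqref{eq:PMP-s2} can underflow or overflow in floating-point arithmetic. The following is a minimal Python sketch, assuming NumPy and using the illustrative function name `pmp_from_lme` (not part of the source); it shifts by the maximum log posterior term (the log-sum-exp trick), a constant that cancels in the ratio:

```python
import numpy as np

def pmp_from_lme(lme, prior=None):
    """Posterior model probabilities from log model evidences.

    Shifts by the maximum log term before exponentiating; the shift
    cancels between numerator and denominator of eq. (PMP-s2).
    """
    lme = np.asarray(lme, dtype=float)
    if prior is None:
        prior = np.full(lme.shape, 1.0 / lme.size)  # uniform prior p(m_i) = 1/M
    log_num = lme + np.log(prior)    # log of exp[LME(m_i)] * p(m_i)
    log_num -= log_num.max()         # stabilize; constant cancels in the ratio
    num = np.exp(log_num)
    return num / num.sum()           # normalize so probabilities sum to 1

# Three models, uniform priors, log evidences differing by a few nats:
print(pmp_from_lme([-1000.0, -1002.3, -1005.0]))
# -> approximately [0.903, 0.091, 0.006]
```

Direct evaluation of $\exp(-1000)$ underflows to zero in double precision, so the unshifted ratio would be $0/0$; the shifted computation recovers the correct posterior probabilities.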
Sources: - Soch J, Allefeld C (2018): "MACS – a new SPM toolbox for model assessment, comparison and selection"; in: Journal of Neuroscience Methods, vol. 306, pp. 19-31, eq. 23; URL: https://www.sciencedirect.com/science/article/pii/S0165027018301468; DOI: 10.1016/j.jneumeth.2018.05.017.
Metadata: ID: P66 | shortcut: pmp-lme | author: JoramSoch | date: 2020-02-27, 21:33.