Index: The Book of Statistical Proofs ▷ Model Selection ▷ Bayesian model selection ▷ Posterior model probability ▷ Calculation from log Bayes factor

Theorem: Let $m_1$ and $m_2$ be two statistical models and let $\mathrm{LBF}_{12}$ be the log Bayes factor in favor of model $m_1$ and against model $m_2$. Then, if both models are equally likely a priori, the posterior model probability of $m_1$ is

\[\label{eq:PMP-LBF} p(m_1|y) = \frac{\exp(\mathrm{LBF}_{12})}{\exp(\mathrm{LBF}_{12}) + 1} \; .\]
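For example, $\mathrm{LBF}_{12} = 0$ (no evidence either way) yields $p(m_1|y) = 1/2$, while $\mathrm{LBF}_{12} = \ln 3 \approx 1.10$ yields $p(m_1|y) = 3/(3+1) = 0.75$.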

Proof: From Bayes’ rule, the posterior odds ratio is

\[\label{eq:post-odds-s1} \frac{p(m_1|y)}{p(m_2|y)} = \frac{p(y|m_1)}{p(y|m_2)} \cdot \frac{p(m_1)}{p(m_2)} \; ,\]

because each posterior is given by $p(m_i|y) = p(y|m_i) \, p(m_i) / p(y)$, such that the model evidence $p(y)$ cancels out in the ratio.

When both models are equally likely a priori, i.e. $p(m_1) = p(m_2)$, the prior odds ratio is one, such that

\[\label{eq:post-odds-s2} \frac{p(m_1|y)}{p(m_2|y)} = \frac{p(y|m_1)}{p(y|m_2)} \; .\]

Now the right-hand side corresponds to the Bayes factor, therefore

\[\label{eq:post-odds-s3} \frac{p(m_1|y)}{p(m_2|y)} = \mathrm{BF}_{12} \; .\]

Because the two posterior model probabilities add up to one, we can substitute $p(m_2|y) = 1 - p(m_1|y)$ to obtain

\[\label{eq:post-odds-s4} \frac{p(m_1|y)}{1-p(m_1|y)} = \mathrm{BF}_{12} \; .\]

Rearranging this equation for the posterior model probability gives

\[\label{eq:post-s1} p(m_1|y) = \mathrm{BF}_{12} \left( 1 - p(m_1|y) \right) \quad \Leftrightarrow \quad \left( \mathrm{BF}_{12} + 1 \right) p(m_1|y) = \mathrm{BF}_{12} \quad \Leftrightarrow \quad p(m_1|y) = \frac{\mathrm{BF}_{12}}{\mathrm{BF}_{12} + 1} \; .\]

Finally, because the log Bayes factor is the logarithm of the Bayes factor, such that $\mathrm{BF}_{12} = \exp(\mathrm{LBF}_{12})$, we have

\[\label{eq:post-s2} p(m_1|y) = \frac{\exp(\mathrm{LBF}_{12})}{\exp(\mathrm{LBF}_{12}) + 1} \; .\]
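Note that this expression is the logistic function of the log Bayes factor, $p(m_1|y) = 1/\left(1 + \exp(-\mathrm{LBF}_{12})\right)$, a form that cannot overflow for large positive $\mathrm{LBF}_{12}$. A minimal Python sketch of the computation (the function name `pmp_from_lbf` is illustrative, not part of the source):

```python
import numpy as np

def pmp_from_lbf(lbf_12):
    """Posterior model probability of m1 from the log Bayes factor LBF_12,
    assuming equal prior model probabilities p(m1) = p(m2)."""
    lbf_12 = np.asarray(lbf_12, dtype=float)
    # Logistic form of exp(LBF_12) / (exp(LBF_12) + 1); avoids overflow
    # in exp(LBF_12) when the log Bayes factor is large and positive.
    return 1.0 / (1.0 + np.exp(-lbf_12))

# Tied models give 0.5; LBF_12 = ln(3) gives 0.75; overwhelming
# evidence saturates numerically at 1.0.
print(pmp_from_lbf([0.0, np.log(3.0), 1000.0]))  # [0.5  0.75 1.  ]
```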
Sources:

Metadata: ID: P73 | shortcut: pmp-lbf | author: JoramSoch | date: 2020-03-03, 12:27.