The Book of Statistical Proofs ▷ Model Selection ▷ Bayesian model selection ▷ Bayes factor ▷ Derivation of the log Bayes factor

Theorem: Let there be two generative models $m_1$ and $m_2$ with model evidences $p(y \vert m_1)$ and $p(y \vert m_2)$. Then, the log Bayes factor

\[\label{eq:LBF-term} \mathrm{LBF}_{12} = \log \mathrm{BF}_{12}\]

can be expressed as

\[\label{eq:LBF-ratio} \mathrm{LBF}_{12} = \log \frac{p(y|m_1)}{p(y|m_2)} \; .\]
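Since the logarithm of a ratio is a difference of logarithms, the log Bayes factor equals the difference of the two log model evidences, which is how it is usually evaluated in practice. The following Python sketch illustrates this; the function name and the example values are illustrative and not part of the source.

```python
def log_bayes_factor(log_evidence_1, log_evidence_2):
    """Compute LBF_12 = log[ p(y|m_1) / p(y|m_2) ].

    On the log scale this is simply log p(y|m_1) - log p(y|m_2),
    which avoids numerical underflow when the model evidences
    themselves are extremely small.
    """
    return log_evidence_1 - log_evidence_2

# example: log evidences of two hypothetical models
print(log_bayes_factor(-1234.5, -1237.2))  # LBF_12 of about 2.7 in favor of m_1
```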

Proof: The Bayes factor is defined as the posterior odds ratio when both models are equally likely a priori:

\[\label{eq:BF-s1} \mathrm{BF}_{12} = \frac{p(m_1|y)}{p(m_2|y)}\]

Plugging in the posterior model probabilities according to Bayes' rule, the marginal density $p(y)$ cancels out and we have

\[\label{eq:BF-s2} \mathrm{BF}_{12} = \frac{p(y|m_1) \, p(m_1) / p(y)}{p(y|m_2) \, p(m_2) / p(y)} = \frac{p(y|m_1)}{p(y|m_2)} \cdot \frac{p(m_1)}{p(m_2)} \; .\]

When both models are equally likely a priori, the prior odds ratio is one, such that

\[\label{eq:BF-s3} \mathrm{BF}_{12} = \frac{p(y|m_1)}{p(y|m_2)} \; .\]

Equation \eqref{eq:LBF-ratio} follows by taking the logarithm of both sides of \eqref{eq:BF-s3} and applying the definition in \eqref{eq:LBF-term}.
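As a quick numerical check of the result, using made-up evidence values, the log of the evidence ratio in \eqref{eq:BF-s3} coincides with the difference of the log evidences:

```python
import math

# hypothetical model evidences p(y|m_1) and p(y|m_2)
p_y_m1, p_y_m2 = 2.0e-5, 5.0e-6

lbf_via_ratio = math.log(p_y_m1 / p_y_m2)            # log of the evidence ratio
lbf_via_diff = math.log(p_y_m1) - math.log(p_y_m2)   # difference of log evidences

print(lbf_via_ratio, lbf_via_diff)  # both approximately 1.386 = log(4)
```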

Sources:

Metadata: ID: P137 | shortcut: lbf-der | author: JoramSoch | date: 2020-07-22, 07:27.