Index: The Book of Statistical Proofs ▷ Model Selection ▷ Bayesian model selection ▷ Model evidence ▷ Subtraction of mean from LMEs

Theorem: Subtracting the arithmetic mean from a set of log model evidences is equivalent to dividing the corresponding model evidences by their geometric mean.

Proof: Consider a model space $\mathcal{M} = \left\lbrace m_1, \ldots, m_M \right\rbrace$ consisting of $M$ models. Then, the normalized log model evidence of any model $m_i$, denoted as $\mathrm{LME}^{*}(m_i)$, may be calculated by subtracting the mean across model space:

\[\label{eq:lme-norm} \mathrm{LME}^{*}(m_i) = \log p(y|m_i) - \frac{1}{M} \sum_{j=1}^M \log p(y|m_j) \; .\]
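For illustration, suppose (with hypothetical values) that $M = 3$ and the log model evidences are $\log p(y|m_1) = -100$, $\log p(y|m_2) = -102$ and $\log p(y|m_3) = -107$. Their arithmetic mean is $-103$, so that

\[ \mathrm{LME}^{*}(m_1) = 3, \quad \mathrm{LME}^{*}(m_2) = 1, \quad \mathrm{LME}^{*}(m_3) = -4 \; . \]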

To prove the theorem, we will now rewrite the right-hand side until we arrive at an expression for the normalized model evidence. First, applying $c \log_b a = \log_b a^c$, we obtain

\[\label{eq:lme-mean-s1} \mathrm{LME}^{*}(m_i) = \log p(y|m_i) - \sum_{j=1}^M \log\left[ p(y|m_j)^{1/M} \right] \; .\]

Then, exponentiating both sides, we have:

\[\label{eq:lme-mean-s2} \begin{split} \mathrm{exp}\left[ \mathrm{LME}^{*}(m_i) \right] &= \frac{\mathrm{exp}\left[ \log p(y|m_i) \right]}{\mathrm{exp}\left[ \sum_{j=1}^M \log\left[ p(y|m_j)^{1/M} \right] \right]} \\ &= \frac{p(y|m_i)}{\prod_{j=1}^M \mathrm{exp}\left[ \log\left[ p(y|m_j)^{1/M} \right] \right]} \\ &= \frac{p(y|m_i)}{\prod_{j=1}^M p(y|m_j)^{1/M}} \\ &= \frac{p(y|m_i)}{\left( \prod_{j=1}^M p(y|m_j) \right)^{1/M}} \\ &= \frac{p(y|m_i)}{\sqrt[M]{\prod_{j=1}^M p(y|m_j)}} \; . \end{split}\]

Finally, the right-hand side is equal to the ratio of $m_i$'s model evidence to the geometric mean of all model evidences, which completes the proof: subtracting the arithmetic mean from the log model evidences is equivalent to dividing the model evidences by their geometric mean.
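As a numerical sanity check (not part of the formal proof), the following Python sketch verifies the identity for a small set of hypothetical log model evidences; the values in `lme` are assumptions chosen purely for illustration:

```python
# Check: subtracting the arithmetic mean from log model evidences (LMEs)
# equals dividing the model evidences by their geometric mean.
import numpy as np

lme = np.array([-100.0, -102.0, -107.0])  # hypothetical LMEs: log p(y|m_j)
M = lme.size

# Left-hand side: normalized LMEs, then exponentiated
lhs = np.exp(lme - lme.mean())

# Right-hand side: model evidences divided by their geometric mean
me = np.exp(lme)                   # model evidences p(y|m_j)
geo_mean = np.prod(me) ** (1 / M)  # geometric mean of the evidences
rhs = me / geo_mean

print(np.allclose(lhs, rhs))       # True
```

Note that, in practice, the left-hand side is the numerically preferable computation: for strongly negative LMEs, exponentiating the evidences before taking the geometric mean can underflow, whereas subtracting the mean operates entirely in log space.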

Sources:

Metadata: ID: P414 | shortcut: lme-mean | author: JoramSoch | date: 2023-09-08, 11:56.