Calculation from log model evidences

Theorem: Let $m_1, \ldots, m_M$ be $M$ statistical models with log model evidences $\mathrm{LME}(m_1), \ldots, \mathrm{LME}(m_M)$, partitioned into $F$ mutually exclusive model families $f_1, \ldots, f_F$. Then, the log family evidences are given by:

\[\label{eq:LFE-LME} \mathrm{LFE}(f_j) = \log \sum_{m_i \in f_j} \left[ \exp[\mathrm{LME}(m_i)] \cdot p(m_i|f_j) \right], \quad j = 1, \ldots, F,\]

where $p(m_i \vert f_j)$ are within-family prior model probabilities.
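In practice, \eqref{eq:LFE-LME} should not be evaluated by literally exponentiating the log model evidences, since these are often strongly negative and underflow in floating-point arithmetic. The following is a minimal Python sketch (assuming NumPy and SciPy are available; the function name `log_family_evidence` and the uniform-prior default are illustrative choices, not part of the theorem) that computes the right-hand side stably via `scipy.special.logsumexp`:

```python
import numpy as np
from scipy.special import logsumexp

def log_family_evidence(lme, log_prior=None):
    """Compute LFE(f_j) = log sum_i [ exp(LME(m_i)) * p(m_i|f_j) ]
    stably, by working with LME(m_i) + log p(m_i|f_j) throughout.

    lme       : log model evidences of the models belonging to family f_j
    log_prior : log within-family prior model probabilities log p(m_i|f_j);
                defaults to a uniform prior (an assumed default, not
                mandated by the theorem)
    """
    lme = np.asarray(lme, dtype=float)
    if log_prior is None:
        log_prior = np.full_like(lme, -np.log(lme.size))
    return logsumexp(lme + log_prior)

# Two models whose evidences underflow under naive exponentiation:
lme = np.array([-1000.0, -1002.0])
print(np.log(np.exp(lme).sum() / 2))  # -inf: exp(-1000) underflows to 0
print(log_family_evidence(lme))       # approx. -1000.566
```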

Proof: Consider the family evidence $p(y \vert f_j)$ itself, before taking logarithms. According to the law of marginal probability, this conditional probability is given by

\[\label{eq:FE-ME-s1} p(y|f_j) = \sum_{m_i \in f_j} \left[ p(y|m_i,f_j) \cdot p(m_i|f_j) \right] \; .\]

Because model families are mutually exclusive, each model belongs to exactly one family, so conditioning on $f_j$ is redundant once $m_i$ is given. It thus holds that $p(y \vert m_i,f_j) = p(y \vert m_i)$, such that

\[\label{eq:FE-ME-s2} p(y|f_j) = \sum_{m_i \in f_j} \left[ p(y|m_i) \cdot p(m_i|f_j) \right] \; .\]
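For illustration, consider a family $f_j = \{m_1, m_2\}$ with a uniform within-family prior $p(m_1 \vert f_j) = p(m_2 \vert f_j) = 1/2$ (an example choice, not required by the proof); then \eqref{eq:FE-ME-s2} reads simply

\[ p(y|f_j) = \frac{1}{2} \, p(y|m_1) + \frac{1}{2} \, p(y|m_2) \; . \]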

Taking the logarithm transforms the family evidence $p(y \vert f_j)$ into the log family evidence $\mathrm{LFE}(f_j)$:

\[\label{eq:LFE-LME-s1} \mathrm{LFE}(f_j) = \log \sum_{m_i \in f_j} \left[ p(y|m_i) \cdot p(m_i|f_j) \right] \; .\]

The definition of the log model evidence

\[\label{eq:LME} \mathrm{LME}(m) = \log p(y|m)\]

can be exponentiated to give

\[\label{eq:ME} \exp\left[ \mathrm{LME}(m) \right] = p(y|m)\]

and applying \eqref{eq:ME} to \eqref{eq:LFE-LME-s1}, we finally have:

\[\label{eq:LFE-LME-s2} \mathrm{LFE}(f_j) = \log \sum_{m_i \in f_j} \left[ \exp[\mathrm{LME}(m_i)] \cdot p(m_i|f_j) \right] \; .\]
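As an implementation note (not part of the proof), \eqref{eq:LFE-LME-s2} is best evaluated with the log-sum-exp identity, which avoids exponentiating strongly negative log model evidences: writing $\ell_i = \mathrm{LME}(m_i) + \log p(m_i|f_j)$ and $\ell^* = \max_{m_i \in f_j} \ell_i$, one has

\[ \mathrm{LFE}(f_j) = \ell^* + \log \sum_{m_i \in f_j} \exp\left( \ell_i - \ell^* \right) \; , \]

where each term $\ell_i - \ell^*$ is at most zero, so the exponentials stay in a numerically safe range.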

Metadata: ID: P65 | shortcut: lfe-lme | author: JoramSoch | date: 2020-02-27, 21:16.