Empirical Bayes

Definition: Let $m$ be a generative model with model parameters $\theta$ and hyper-parameters $\lambda$ implying the likelihood function $p(y \vert \theta, \lambda, m)$ and prior distribution $p(\theta \vert \lambda, m)$. Then, an Empirical Bayes treatment of $m$, also referred to as “type II maximum likelihood” or “evidence approximation”, consists in


1) evaluating the marginal likelihood of the model $m$

\[\label{eq:ML} p(y \vert \lambda, m) = \int p(y \vert \theta, \lambda, m) \, p(\theta \vert \lambda, m) \, \mathrm{d}\theta \; ,\]


2) maximizing the log model evidence, i.e. the logarithm of the marginal likelihood from step 1, with respect to $\lambda$

\[\label{eq:EB} \hat{\lambda} = \operatorname*{arg\,max}_{\lambda} \log p(y \vert \lambda, m)\]


3) and using the prior distribution at this maximum

\[\label{eq:prior-eb} p(\theta \vert m) = p(\theta \vert \hat{\lambda}, m)\]

for Bayesian inference, i.e. for obtaining the posterior distribution over $\theta$ and computing the marginal likelihood of $m$.
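
As a concrete illustration of the three steps, here is a minimal numerical sketch in Python, assuming a simple conjugate model that is not part of the definition above: $y_i \vert \theta \sim \mathcal{N}(\theta, \sigma^2)$ with known noise variance $\sigma^2$ and prior $\theta \sim \mathcal{N}(0, \lambda)$ with hyper-parameter $\lambda$. Under these assumptions, the marginal likelihood from step 1 is available in closed form, step 2 reduces to one-dimensional optimization, and step 3 yields the posterior analytically. All function and variable names are illustrative.

```python
# Empirical Bayes ("type II maximum likelihood") sketch for an assumed
# conjugate model: y_i | theta ~ N(theta, s2) with known s2, theta ~ N(0, lam).

import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
n, s2 = 20, 1.0                                       # sample size, known noise variance
y = rng.normal(loc=1.5, scale=np.sqrt(s2), size=n)    # simulated data

def log_evidence(lam):
    """Step 1: log marginal likelihood log p(y | lam, m).
    Integrating theta out gives y ~ N(0, s2*I + lam*11')."""
    cov = s2 * np.eye(n) + lam * np.ones((n, n))
    return multivariate_normal.logpdf(y, mean=np.zeros(n), cov=cov)

# Step 2: maximize the log model evidence over lam (bounded away from zero)
res = minimize_scalar(lambda lam: -log_evidence(lam),
                      bounds=(1e-6, 100.0), method="bounded")
lam_hat = res.x

# Step 3: plug lam_hat back in; conjugacy gives the posterior in closed form,
# theta | y ~ N(mu, tau2) with precision n/s2 + 1/lam_hat
tau2 = 1.0 / (n / s2 + 1.0 / lam_hat)    # posterior variance
mu = tau2 * y.sum() / s2                 # posterior mean

print(f"lambda_hat = {lam_hat:.3f}, posterior: N({mu:.3f}, {tau2:.3f})")
```

In models without a closed-form marginal likelihood, the integral in step 1 is typically approximated, e.g. by Laplace or variational methods, which is why this procedure is also called "evidence approximation".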


Metadata: ID: D149 | shortcut: eb | author: JoramSoch | date: 2021-04-29, 06:46.