Index: The Book of Statistical Proofs ▷ General Theorems ▷ Bayesian statistics ▷ Bayesian inference ▷ Variational Bayes

Definition: Let $m$ be a generative model with model parameters $\theta$, implying the likelihood function $p(y \vert \theta, m)$ and prior distribution $p(\theta \vert m)$. Then, a Variational Bayes treatment of $m$, also referred to as “approximate inference” or “variational inference”, consists of


1) constructing an approximate posterior distribution

\[\label{eq:post-vb} q(\theta) \approx p(\theta \vert y, m) \; ,\]


2) evaluating the variational free energy

\[\label{eq:FE} F_q(m) = \int q(\theta) \log p(y \vert \theta, m) \, \mathrm{d}\theta - \int q(\theta) \log \frac{q(\theta)}{p(\theta \vert m)} \, \mathrm{d}\theta\]


3) and maximizing this function with respect to $q(\theta)$

\[\label{eq:VB} \hat{q}(\theta) = \operatorname*{arg\,max}_{q} F_q(m) \; .\]

This achieves Bayesian inference, i.e. the approximate posterior distribution is obtained from eq. \eqref{eq:VB} and the log marginal likelihood is approximated by plugging eq. \eqref{eq:VB} back into eq. \eqref{eq:FE}.
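Maximizing the free energy with respect to $q(\theta)$ is motivated by the identity

\[F_q(m) = \log p(y \vert m) - \mathrm{KL}\left[ q(\theta) \,\Vert\, p(\theta \vert y, m) \right] \; ,\]

so that, because the KL divergence is non-negative, $F_q(m)$ is a lower bound on the log model evidence which is attained exactly when $q(\theta)$ equals the true posterior.

The procedure can be sketched numerically. The following Python snippet is an illustrative toy example, not part of the source: it assumes a Gaussian likelihood with known variance, a conjugate Gaussian prior, and a variational family $q(\theta) = \mathcal{N}(m, s^2)$, and maximizes $F_q(m)$ over a grid of $(m, s)$ pairs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative assumption): y_i ~ N(theta, sigma^2) with
# known sigma, and a conjugate Gaussian prior theta ~ N(mu0, tau0^2).
sigma, mu0, tau0 = 1.0, 0.0, 2.0
y = rng.normal(1.5, sigma, size=50)
n = y.size

def free_energy(m, s):
    """Variational free energy for q(theta) = N(m, s^2):
    expected log-likelihood under q minus KL[q || prior]."""
    exp_loglik = np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                        - ((y - m)**2 + s**2) / (2 * sigma**2))
    kl = np.log(tau0 / s) + (s**2 + (m - mu0)**2) / (2 * tau0**2) - 0.5
    return exp_loglik - kl

# Maximize F over a grid of candidate (m, s) pairs.
ms = np.linspace(0.5, 2.5, 401)
ss = np.linspace(0.01, 1.0, 400)
F = np.array([[free_energy(m, s) for s in ss] for m in ms])
i, j = np.unravel_index(np.argmax(F), F.shape)
m_hat, s_hat = ms[i], ss[j]

# Exact conjugate posterior, for comparison.
post_prec = 1 / tau0**2 + n / sigma**2
post_mean = (mu0 / tau0**2 + y.sum() / sigma**2) / post_prec
post_sd = post_prec**-0.5

print(m_hat, post_mean)
print(s_hat, post_sd)
```

Because the exact posterior is itself Gaussian in this conjugate case, the variational family contains it, and the free-energy maximum coincides (up to grid resolution) with the exact posterior mean and standard deviation; for non-conjugate models the maximization only yields the best approximation within the chosen family.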

 

Metadata: ID: D150 | shortcut: vb | author: JoramSoch | date: 2021-04-29, 07:15.