The Book of Statistical Proofs ▷ General Theorems ▷ Bayesian statistics ▷ Bayesian inference ▷ Variational Bayes

Definition: Let $m$ be a generative model with model parameters $\theta \in \Theta$ implying the likelihood function $p(y \vert \theta, m)$ and prior distribution $p(\theta \vert m)$. Then, a Variational Bayes treatment of $m$, also referred to as “approximate inference” or “variational inference”, consists in


1) constructing an approximate posterior distribution

\[\label{eq:post-vb} q(\theta) \approx p(\theta \vert y, m) \; ,\]


2) evaluating the variational free energy

\[\label{eq:FE} \mathrm{F}_m[q(\theta)] = \int_{\Theta} q(\theta) \log \frac{p(y \vert \theta, m) \, p(\theta \vert m)}{q(\theta)} \, \mathrm{d}\theta\]


3) and maximizing this function with respect to $q(\theta)$

\[\label{eq:VB} \hat{q}(\theta) = \operatorname*{arg\,max}_{q} \mathrm{F}_m[q(\theta)]\]

for Bayesian inference, i.e. obtaining the approximate posterior distribution (from eq. \eqref{eq:VB}) and approximating the log marginal likelihood (by plugging eq. \eqref{eq:VB} into eq. \eqref{eq:FE}). This works because the free energy decomposes as $\mathrm{F}_m[q(\theta)] = \log p(y \vert m) - \mathrm{KL}[q(\theta) \,\vert\vert\, p(\theta \vert y, m)]$, so that maximizing $\mathrm{F}_m$ drives $q(\theta)$ towards the posterior while the maximal value lower-bounds the log model evidence.
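The three steps above can be sketched numerically. The following is a minimal illustration, not part of the original definition: it assumes a toy model $y_i \sim \mathcal{N}(\theta, 1)$ with prior $\theta \sim \mathcal{N}(0, 1)$, for which the exact posterior is $\mathcal{N}(\sum_i y_i/(n+1),\, 1/(n+1))$ and the log marginal likelihood is available in closed form. Since the approximate posterior $q(\theta)$ is taken as an unrestricted Gaussian, maximizing the free energy should recover the exact posterior and the exact log model evidence. All variable names are hypothetical; the optimizer is SciPy's generic `minimize`.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data (assumed for illustration): y_i ~ N(theta, 1), prior theta ~ N(0, 1)
rng = np.random.default_rng(0)
n = 20
y = rng.normal(1.5, 1.0, size=n)

def elbo(params):
    """Variational free energy F_m[q] for q(theta) = N(mu, sig2), in closed form."""
    mu, log_sig = params
    sig2 = np.exp(2 * log_sig)
    # E_q[log p(y|theta)]: expected log-likelihood under q
    exp_loglik = -0.5 * n * np.log(2 * np.pi) - 0.5 * (np.sum((y - mu) ** 2) + n * sig2)
    # E_q[log p(theta)]: expected log-prior under q
    exp_logprior = -0.5 * np.log(2 * np.pi) - 0.5 * (mu ** 2 + sig2)
    # -E_q[log q(theta)]: differential entropy of the Gaussian q
    entropy = 0.5 * np.log(2 * np.pi * np.e * sig2)
    return exp_loglik + exp_logprior + entropy

# Step 3): maximize F with respect to q(theta), i.e. over (mu, log sigma)
res = minimize(lambda p: -elbo(p), x0=np.array([0.0, 0.0]), method="Nelder-Mead")
mu_hat, sig2_hat = res.x[0], np.exp(2 * res.x[1])

# Exact posterior N(sum(y)/(n+1), 1/(n+1)); VB recovers it since q is unrestricted
mu_exact, sig2_exact = y.sum() / (n + 1), 1.0 / (n + 1)

# Log marginal likelihood in closed form: y ~ N(0, I + 1 1^T),
# with |Sigma| = n+1 and Sigma^{-1} = I - 1 1^T / (n+1)
log_ml = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(n + 1)
          - 0.5 * (y @ y - y.sum() ** 2 / (n + 1)))

print(mu_hat, mu_exact)        # approximate vs. exact posterior mean
print(sig2_hat, sig2_exact)    # approximate vs. exact posterior variance
print(elbo(res.x), log_ml)     # F at the optimum vs. log marginal likelihood
```

Because the true posterior lies inside the variational family here, the maximized free energy attains the log marginal likelihood exactly (up to optimizer tolerance); in general it only provides a lower bound.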

 

Metadata: ID: D150 | shortcut: vb | author: JoramSoch | date: 2021-04-29, 07:15.