Index: The Book of Statistical Proofs ▷ Model Selection ▷ Classical information criteria ▷ Deviance information criterion ▷ Definition

Definition: Let $m$ be a full probability model with likelihood function $p(y \vert \theta, m)$ and prior distribution $p(\theta \vert m)$. Together, the likelihood function and prior distribution imply a posterior distribution $p(\theta \vert y, m)$. Consider the deviance, which is minus two times the log-likelihood function:

\[\label{eq:dev} D(\theta) = -2 \log p(y|\theta,m) \; .\]

Then, the deviance information criterion (DIC) of the model $m$ is defined as

\[\label{eq:DIC} \mathrm{DIC}(m) = -2 \log p(y|\left\langle \theta \right\rangle, m) + 2 \, p_D\]

where $\log p(y \vert \left\langle \theta \right\rangle, m)$ is the log-likelihood function at the posterior expectation and the “effective number of parameters” $p_D$ is the difference between the expectation of the deviance and the deviance at the expectation:

\[\label{eq:DIC-pD} p_D = \left\langle D(\theta) \right\rangle - D(\left\langle \theta \right\rangle) \; .\]

In these equations, $\left\langle \cdot \right\rangle$ denotes expected values across the posterior distribution.
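
To make the definition concrete, here is a minimal numerical sketch (not part of the original definition) that estimates $p_D$ and $\mathrm{DIC}(m)$ from posterior samples. The normal model with known variance, the simulated data and the conjugate posterior are all illustrative assumptions; in practice, the posterior samples would typically come from MCMC.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative setup: y ~ N(mu, sigma^2) with known sigma and prior mu ~ N(0, sigma^2)
y = rng.normal(loc=2.0, scale=1.0, size=50)   # simulated observed data (assumption)
sigma = 1.0                                   # known noise standard deviation
n = y.size

# Conjugate posterior of theta = mu for this model; stands in for MCMC samples
post_mean = y.sum() / (n + 1)
post_std = sigma / np.sqrt(n + 1)
mu_samples = rng.normal(post_mean, post_std, size=10_000)

def log_lik(mu):
    """Log-likelihood log p(y | theta, m) for a given mu."""
    return stats.norm.logpdf(y, loc=mu, scale=sigma).sum()

# Deviance D(theta) = -2 log p(y | theta, m), evaluated at each posterior sample
D_samples = np.array([-2.0 * log_lik(mu) for mu in mu_samples])

# <D(theta)>: posterior expectation of the deviance
D_bar = D_samples.mean()
# D(<theta>): deviance at the posterior expectation of theta
D_at_mean = -2.0 * log_lik(mu_samples.mean())

# Effective number of parameters and DIC, per the equations above
p_D = D_bar - D_at_mean
DIC = D_at_mean + 2.0 * p_D   # = -2 log p(y | <theta>, m) + 2 p_D

print(f"p_D ≈ {p_D:.3f}, DIC ≈ {DIC:.2f}")
```

As a sanity check, this one-parameter model should yield $p_D \approx 1$, consistent with the interpretation of $p_D$ as the effective number of parameters.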

 

Metadata: ID: D25 | shortcut: dic | author: JoramSoch | date: 2020-02-25, 12:46.