Joint likelihood is product of likelihood and prior

Theorem: Let $m$ be a generative model describing measured data $y$ using model parameters $\theta$ and a prior distribution on $\theta$. Then, the joint likelihood is equal to the product of the likelihood function and the prior density:

\[\label{eq:jl} p(y,\theta|m) = p(y|\theta,m) \, p(\theta|m) \; .\]
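For illustration (a sketch not part of the original theorem, assuming a univariate Gaussian model with known variance $\sigma^2$ and hypothetical prior hyperparameters $\mu_0$ and $\sigma_0^2$): if $y|\theta \sim \mathcal{N}(\theta, \sigma^2)$ and $\theta \sim \mathcal{N}(\mu_0, \sigma_0^2)$, then the joint likelihood is simply the product of the two densities:

\[ p(y,\theta|m) = \mathcal{N}(y; \theta, \sigma^2) \cdot \mathcal{N}(\theta; \mu_0, \sigma_0^2) \; . \]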

Proof: The joint likelihood is defined as the joint probability distribution of the data $y$ and the parameters $\theta$, conditional on the model $m$:

\[\label{eq:jl-def} p(y,\theta|m) \; .\]

Applying the law of conditional probability, according to which $p(A|B) = p(A,B)/p(B)$, we have:

\[\label{eq:jl-qed} \begin{split} p(y|\theta,m) &= \frac{p(y,\theta|m)}{p(\theta|m)} \\ \Leftrightarrow \quad p(y,\theta|m) &= p(y|\theta,m) \, p(\theta|m) \; . \end{split}\]
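As a remark beyond the original proof, marginalizing the joint likelihood over the parameters recovers the marginal likelihood of the model, which makes the decomposition in \eqref{eq:jl} the starting point for Bayesian model evidence computations:

\[ p(y|m) = \int p(y,\theta|m) \, \mathrm{d}\theta = \int p(y|\theta,m) \, p(\theta|m) \, \mathrm{d}\theta \; . \]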

Metadata: ID: P89 | shortcut: jl-lfnprior | author: JoramSoch | date: 2020-05-05, 04:21.