Index: The Book of Statistical Proofs ▷ General Theorems ▷ Bayesian statistics ▷ Probabilistic modeling ▷ Joint likelihood

Definition: Let there be a generative model $m$ describing measured data $y$ using model parameters $\theta$, together with a prior distribution on $\theta$. Then, the joint probability distribution of $y$ and $\theta$ is called the joint likelihood:

\[\label{eq:jl} p(y,\theta|m) = p(y|\theta,m) \, p(\theta|m) \; .\]
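
As an illustration, consider a simple univariate normal model $m$ in which the data $y$ have a normal likelihood with unknown mean $\mu$ and known variance $\sigma^2$, and the prior on $\mu$ is normal with hyperparameters $\mu_0$ and $\lambda^2$ (this model and these symbols are chosen here purely for illustration). The joint likelihood is then the product of likelihood and prior density:

\[ p(y,\mu|m) = \mathcal{N}(y; \mu, \sigma^2) \cdot \mathcal{N}(\mu; \mu_0, \lambda^2) \; . \]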
 
Sources:

Metadata: ID: D31 | shortcut: jl | author: JoramSoch | date: 2020-03-03, 16:36.