# Definition: Joint differential entropy

**Index:** The Book of Statistical Proofs ▷ General Theorems ▷ Information theory ▷ Differential entropy ▷ Joint differential entropy

**Definition:** Let $X$ and $Y$ be continuous random variables with possible outcomes $\mathcal{X}$ and $\mathcal{Y}$ and joint probability density function $p(x,y)$. Then, the joint differential entropy of $X$ and $Y$ is defined as

$$\mathrm{h}(X,Y) = - \int_{\mathcal{X}} \int_{\mathcal{Y}} p(x,y) \, \log_b p(x,y) \, \mathrm{d}y \, \mathrm{d}x$$

where $b$ is the base of the logarithm specifying in which unit the differential entropy is determined.
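As an illustration of this definition, the double integral can be approximated numerically on a grid and compared against a known closed form. The sketch below (a minimal example, not part of the original definition) uses an assumed bivariate Gaussian with illustrative parameters `mu` and `Sigma`, for which the joint differential entropy in nats (base $b = e$) is $\frac{1}{2} \ln\!\left[ (2 \pi e)^n \, |\Sigma| \right]$ with $n = 2$:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative bivariate Gaussian (parameters are assumptions for this sketch)
mu = np.array([0.0, 0.0])
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])

# Grid wide enough to cover essentially all probability mass
xs = np.linspace(-10, 10, 801)
ys = np.linspace(-10, 10, 801)
X, Y = np.meshgrid(xs, ys, indexing="ij")
p = multivariate_normal(mu, Sigma).pdf(np.dstack((X, Y)))

# Riemann-sum approximation of -int int p(x,y) log p(x,y) dy dx (base b = e, nats)
dx = xs[1] - xs[0]
dy = ys[1] - ys[0]
integrand = np.where(p > 0, p * np.log(p), 0.0)  # 0 * log(0) treated as 0
h_numeric = -np.sum(integrand) * dx * dy

# Closed form for an n-dimensional Gaussian: 1/2 * ln((2*pi*e)^n * |Sigma|)
h_exact = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(Sigma))

print(h_numeric, h_exact)  # the two values should agree closely
```

Choosing a different base $b$ only rescales the result (e.g. dividing by $\ln 2$ converts nats to bits).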

**Sources:**

**Metadata:** ID: D35 | shortcut: dent-joint | author: JoramSoch | date: 2020-03-21, 12:37.