Definition: Joint differential entropy
Metadata: ID: D35 | shortcut: dent-joint | author: JoramSoch | date: 2020-03-21, 12:37.
Definition: Let $X$ and $Y$ be continuous random variables with possible outcomes $\mathcal{X}$ and $\mathcal{Y}$ and joint probability density function $p(x,y)$. Then, the joint differential entropy of $X$ and $Y$ is defined as
\[\label{eq:dent-joint} \mathrm{h}(X,Y) = - \int_{x \in \mathcal{X}} \int_{y \in \mathcal{Y}} p(x,y) \cdot \log_b p(x,y) \, \mathrm{d}y \, \mathrm{d}x\]
where $b$ is the base of the logarithm specifying in which unit the differential entropy is determined.
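As a numerical illustration of this definition, the following sketch (not part of the original entry; the bivariate Gaussian example, grid size, and covariance matrix are assumptions chosen for illustration) approximates the double integral above by a Riemann sum for a zero-mean bivariate Gaussian with covariance $\Sigma$, using the natural logarithm ($b = e$, i.e. entropy in nats), and compares it against the known closed form $\mathrm{h} = \tfrac{1}{2} \ln\!\left[(2\pi e)^2 \, |\Sigma|\right]$:

```python
import numpy as np

# Hypothetical example: bivariate Gaussian with zero mean and covariance Sigma
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])
Sigma_inv = np.linalg.inv(Sigma)
det = np.linalg.det(Sigma)

def p(x, y):
    """Joint density p(x,y) of the zero-mean bivariate Gaussian."""
    v = np.stack([x, y], axis=-1)
    quad = np.einsum('...i,ij,...j->...', v, Sigma_inv, v)
    return np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(det))

# Riemann-sum approximation of
#   h(X,Y) = -∫∫ p(x,y) · ln p(x,y) dy dx
# over a grid wide enough to capture essentially all probability mass
xs = np.linspace(-8.0, 8.0, 400)
ys = np.linspace(-8.0, 8.0, 400)
dx, dy = xs[1] - xs[0], ys[1] - ys[0]
X, Y = np.meshgrid(xs, ys, indexing='ij')
P = p(X, Y)
h_numeric = -np.sum(P * np.log(P)) * dx * dy

# Closed form for a bivariate Gaussian: h = 1/2 · ln[(2·pi·e)^2 · |Sigma|]
h_exact = 0.5 * np.log((2 * np.pi * np.e) ** 2 * det)
print(h_numeric, h_exact)
```

The two values agree to several decimal places, confirming that the integral in the definition reproduces the analytical differential entropy for this distribution.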