Definition: Differential entropy
Definition: Let $X$ be a continuous random variable with possible outcomes $\mathcal{X}$ and the (estimated or assumed) probability density function $p(x) = f_X(x)$. Then, the differential entropy (also referred to as “continuous entropy”) of $X$ is defined as

\[\label{eq:dent} \mathrm{h}(X) = - \int_{\mathcal{X}} p(x) \log_b p(x) \, \mathrm{d}x\]

where $b$ is the base of the logarithm, which specifies the unit in which the entropy is measured (e.g. $b = 2$ for bits, $b = e$ for nats).
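As a numerical illustration of this definition, the following is a minimal sketch (assuming NumPy and SciPy are available) that evaluates the integral for a standard normal density with $b = e$; the density and base are choices for this example, not part of the definition. For a Gaussian, the closed-form value $\mathrm{h}(X) = \frac{1}{2} \ln(2 \pi e \sigma^2)$ is known and serves as a check.

```python
import numpy as np
from scipy import integrate, stats

# h(X) = -∫ p(x) ln p(x) dx for X ~ N(0, 1), i.e. base b = e (nats)
p = stats.norm(loc=0.0, scale=1.0).pdf

def integrand(x):
    px = p(x)
    # the integrand -p(x) ln p(x) tends to 0 where p(x) underflows to 0
    return -px * np.log(px) if px > 0.0 else 0.0

h_numeric, _ = integrate.quad(integrand, -np.inf, np.inf)

# closed form for a Gaussian: h(X) = 0.5 * ln(2 * pi * e * sigma^2)
h_closed = 0.5 * np.log(2.0 * np.pi * np.e)

print(f"numerical: {h_numeric:.6f} nats, closed form: {h_closed:.6f} nats")
```

Both values come out to approximately $1.4189$ nats, confirming that the numerical integral matches $\frac{1}{2} \ln(2 \pi e)$ for $\sigma^2 = 1$.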
Sources:
- Cover TM, Thomas JA (1991): "Differential Entropy"; in: Elements of Information Theory, ch. 8.1, p. 243; URL: https://www.wiley.com/en-us/Elements+of+Information+Theory%2C+2nd+Edition-p-9780471241959.
Metadata: ID: D16 | shortcut: dent | author: JoramSoch | date: 2020-02-19, 17:53.