
Definition: Let $X$ be a continuous random variable with possible outcomes $\mathcal{X}$ and the (estimated or assumed) probability density function $p(x) = f_X(x)$. Then, the differential entropy (also referred to as “continuous entropy”) of $X$ is defined as

\[\label{eq:dent} \mathrm{h}(X) = - \int_{\mathcal{X}} p(x) \log_b p(x) \, \mathrm{d}x\]

where $b$ is the base of the logarithm, which specifies the unit in which the entropy is measured (e.g. $b = 2$ yields bits, $b = e$ yields nats).
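
To make the definition concrete, the integral can be evaluated numerically and checked against a known closed form. The following is a minimal Python sketch (not part of the source, assuming NumPy and SciPy are available), using the standard normal distribution, for which the closed form is $\mathrm{h}(X) = \frac{1}{2} \ln(2 \pi e)$ nats:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def differential_entropy(pdf, support, base=np.e):
    """Numerically approximate h(X) = -int p(x) log_b p(x) dx over the support."""
    def integrand(x):
        p = pdf(x)
        return -p * np.log(p) if p > 0 else 0.0  # treat p(x) = 0 as contributing 0
    value, _ = quad(integrand, *support)
    return value / np.log(base)  # convert from nats to the requested base b

# Standard normal: closed form is h(X) = 1/2 * ln(2*pi*e) ≈ 1.4189 nats
h_numerical = differential_entropy(norm.pdf, support=(-10.0, 10.0))
h_analytical = 0.5 * np.log(2 * np.pi * np.e)
print(h_numerical, h_analytical)  # both ≈ 1.4189
```

The integration bounds $(-10, 10)$ stand in for the support $\mathcal{X} = \mathbb{R}$; they truncate the tails at a point where the Gaussian density is negligible, so the numerical value agrees with the analytical one to high precision.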


Metadata: ID: D16 | shortcut: dent | author: JoramSoch | date: 2020-02-19, 17:53.