Definition: Joint entropy
Definition: Let $X$ and $Y$ be discrete random variables with possible outcomes $\mathcal{X}$ and $\mathcal{Y}$ and joint probability mass function $p(x,y)$. Then, the joint entropy of $X$ and $Y$ is defined as
$$ \label{eq:ent-joint}
\mathrm{H}(X,Y) = - \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x,y) \cdot \log_b p(x,y)
$$

where $b$ is the base of the logarithm specifying in which unit the entropy is determined.
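As a numerical illustration of this definition, the following is a minimal sketch (not part of the original definition) that evaluates $\mathrm{H}(X,Y)$ from a joint probability mass function stored as a 2-D array; the function name `joint_entropy` and the example pmf are assumptions chosen for illustration only.

```python
# Minimal sketch: joint entropy H(X,Y) of a discrete joint pmf p(x,y),
# computed in units determined by the logarithm base b.
import numpy as np

def joint_entropy(p_xy, b=2):
    """Return H(X,Y) = -sum_x sum_y p(x,y) * log_b p(x,y)."""
    p = np.asarray(p_xy, dtype=float)
    p = p[p > 0]                      # 0 * log(0) is taken as 0 by convention
    return -np.sum(p * (np.log(p) / np.log(b)))

# Example: two independent fair coin flips -> H(X,Y) = 2 bits
p_xy = np.array([[0.25, 0.25],
                 [0.25, 0.25]])
print(joint_entropy(p_xy, b=2))       # 2.0
```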
Sources:
- Cover TM, Thomas JA (1991): "Joint Entropy and Conditional Entropy"; in: Elements of Information Theory, ch. 2.2, p. 16; URL: https://www.wiley.com/en-us/Elements+of+Information+Theory%2C+2nd+Edition-p-9780471241959.
Metadata: ID: D18 | shortcut: ent-joint | author: JoramSoch | date: 2020-02-19, 18:18.