Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Random variables ▷ Discrete vs. continuous

Definition: Let $X$ be a random variable with possible outcomes $\mathcal{X}$. Then,

  • $X$ is called a discrete random variable if $\mathcal{X}$ is either a finite set or a countably infinite set; in this case, $X$ can be described by a probability mass function;

  • $X$ is called a continuous random variable if $\mathcal{X}$ is an uncountably infinite set; if its distribution is additionally absolutely continuous, $X$ can be described by a probability density function.
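
For illustration (an example not stated in the source, only sketching the definition above): if $X$ is the number of heads in two independent fair coin tosses, then $\mathcal{X} = \left\{ 0, 1, 2 \right\}$ is finite, so $X$ is discrete and is described by the probability mass function $f_X(0) = 1/4$, $f_X(1) = 1/2$, $f_X(2) = 1/4$. If instead $X$ is uniformly distributed over the interval $[0,1]$, then $\mathcal{X} = [0,1]$ is uncountably infinite, so $X$ is continuous and is described by the probability density function $f_X(x) = 1$ for $0 \le x \le 1$ and $f_X(x) = 0$ otherwise.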

 
Sources:

Metadata: ID: D105 | shortcut: rvar-disc | author: JoramSoch | date: 2020-10-29, 04:44.