Proof: Probability mass function of an invertible function of a random vector
Theorem: Let $X$ be an $n \times 1$ random vector of discrete random variables with possible outcomes $\mathcal{X}$ and let $g: \; \mathbb{R}^n \rightarrow \mathbb{R}^n$ be an invertible function on the support of $X$. Then, the probability mass function of $Y = g(X)$ is given by
\[\label{eq:pmf-invfct} f_Y(y) = \left\{ \begin{array}{rl} f_X(g^{-1}(y)) \; , & \text{if} \; y \in \mathcal{Y} \\ 0 \; , & \text{if} \; y \notin \mathcal{Y} \end{array} \right.\]

where $g^{-1}(y)$ is the inverse function of $g(x)$ and $\mathcal{Y}$ is the set of possible outcomes of $Y$:

\[\label{eq:Y-range} \mathcal{Y} = \left\lbrace y = g(x): x \in \mathcal{X} \right\rbrace \; .\]
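As a brief illustration (a hypothetical example, not taken from the source), consider the univariate case $n = 1$: if $X$ is discrete uniform on $\mathcal{X} = \{1, 2, 3\}$ and $g(x) = 2x$, then $g$ is invertible with $g^{-1}(y) = y/2$, so that

\[ f_Y(y) = f_X(y/2) = \frac{1}{3} \quad \text{for} \; y \in \mathcal{Y} = \{2, 4, 6\} \]

and $f_Y(y) = 0$ for all other $y$.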
Proof: Because an invertible function is a one-to-one mapping, every $y \in \mathcal{Y}$ corresponds to exactly one $x = g^{-1}(y) \in \mathcal{X}$, so the probability mass function of $Y$ can be derived as follows:

\[\label{eq:pmf-invfct-qed} \begin{split} f_Y(y) &= \mathrm{Pr}(Y = y) \\ &= \mathrm{Pr}(g(X) = y) \\ &= \mathrm{Pr}(X = g^{-1}(y)) \\ &= f_X(g^{-1}(y)) \; . \end{split}\]

For $y \notin \mathcal{Y}$, there exists no $x \in \mathcal{X}$ with $g(x) = y$, such that $\mathrm{Pr}(Y = y) = 0$. ∎
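The identity $f_Y(y) = f_X(g^{-1}(y))$ can also be checked numerically by enumerating a finite support. The following is a minimal sketch, assuming (purely for illustration, not from the source) a bivariate discrete uniform $X$ on $\{0,1,2\}^2$ and the invertible map $g(x) = (x_1 + x_2, x_1 - x_2)$:

```python
from itertools import product
from fractions import Fraction

# Hypothetical example: X is uniform on {0,1,2}^2;
# g(x) = (x1 + x2, x1 - x2) is invertible with g^{-1}(y) = ((y1 + y2)/2, (y1 - y2)/2).
support_X = list(product(range(3), repeat=2))
f_X = {x: Fraction(1, len(support_X)) for x in support_X}

def g(x):
    return (x[0] + x[1], x[0] - x[1])

def g_inv(y):
    return ((y[0] + y[1]) // 2, (y[0] - y[1]) // 2)

# PMF of Y obtained directly by pushing the mass of each x forward through g
f_Y_direct = {}
for x, p in f_X.items():
    y = g(x)
    f_Y_direct[y] = f_Y_direct.get(y, Fraction(0)) + p

# PMF of Y obtained via the theorem: f_Y(y) = f_X(g^{-1}(y)) for y in the range of g
f_Y_theorem = {g(x): f_X[g_inv(g(x))] for x in support_X}

# Because g is one-to-one on the support, both constructions must coincide
assert f_Y_direct == f_Y_theorem
```

Any other finite support and invertible map could be substituted; the agreement of the two constructions is exactly the statement of the theorem restricted to $\mathcal{Y}$.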
Sources:
- Taboga, Marco (2017): "Functions of random vectors and their distribution"; in: Lectures on probability and mathematical statistics, retrieved on 2021-08-30; URL: https://www.statlect.com/fundamentals-of-probability/functions-of-random-vectors.
Metadata: ID: P253 | shortcut: pmf-invfct | author: JoramSoch | date: 2021-08-30, 05:13.