
Theorem: Let $X$ be a random vector following a multinomial distribution:

\[\label{eq:mult} X \sim \mathrm{Mult}(n,\left[p_1, \ldots, p_k \right]) \; .\]

Then, the marginal distribution of any entry $X_i$ is a binomial distribution:

\[\label{eq:mult-marg} X_i \sim \mathrm{Bin}(n, p_i) \quad \text{for all} \quad i = 1, \ldots, k \; .\]

Proof: The entries of a multinomial random vector $X$ are the numbers of observations belonging to $k$ distinct categories in $n$ independent trials, where $p_1, \ldots, p_k$ are the category probabilities which are identical across all trials.

Let us define, for each category $i = 1, \ldots, k$ and each trial $j = 1, \ldots, n$, the indicator variable

\[\label{eq:Y-ij} Y_{ij} = \left\{ \begin{array}{rl} 0 \; , & \text{if trial} \; j \; \text{does not result in category} \; i \\ 1 \; , & \text{if trial} \; j \; \text{results in category} \; i \; . \end{array} \right.\]

Since category $i$ occurs in any single trial with probability $p_i$, we have $\mathrm{Pr}(Y_{ij} = 1) = p_i$ and $\mathrm{Pr}(Y_{ij} = 0) = 1-p_i$, i.e. $Y_{ij}$ is a Bernoulli random variable with success probability $p_i$:

\[\label{eq:Y-ij-dist} Y_{ij} \sim \mathrm{Bern}(p_i) \; .\]

Moreover, each $X_i$ is the sum of the corresponding $Y_{ij}$ over all trials:

\[\label{eq:X-i-sum} X_i = \sum_{j=1}^n Y_{ij} \; ,\]

i.e. $X_i$ is a sum of $n$ independent Bernoulli random variables with identical success probability $p_i$ (the $Y_{ij}$ are independent across $j$, because the trials are independent) which, by definition, follows a binomial distribution:

\[\label{eq:X-i-dist} X_i \sim \mathrm{Bin}(n, p_i) \; .\]
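
As a complementary check, not required by the proof above, the same result can be obtained by marginalizing the multinomial probability mass function directly, applying the multinomial theorem in the last step:

\[\begin{split} \mathrm{Pr}(X_i = x_i) &= \sum_{\substack{x_1 + \ldots + x_k = n \\ x_i \; \text{fixed}}} \frac{n!}{x_1! \cdots x_k!} \, p_1^{x_1} \cdots p_k^{x_k} \\ &= \binom{n}{x_i} p_i^{x_i} \sum_{\sum_{j \neq i} x_j \,=\, n - x_i} \frac{(n-x_i)!}{\prod_{j \neq i} x_j!} \prod_{j \neq i} p_j^{x_j} \\ &= \binom{n}{x_i} p_i^{x_i} \left( \sum_{j \neq i} p_j \right)^{n-x_i} = \binom{n}{x_i} p_i^{x_i} \left( 1 - p_i \right)^{n-x_i} \; , \end{split}\]

which is the probability mass function of $\mathrm{Bin}(n, p_i)$.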
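
For a numerical sanity check, the following minimal sketch, assuming NumPy and SciPy are available and using arbitrary illustrative values for $n$, $p$ and the number of samples, draws multinomial random vectors and compares the empirical distribution of one entry with the claimed binomial marginal:

```python
import numpy as np
from scipy import stats

# Illustrative (assumed) parameters: n trials, k = 3 categories
n = 10
p = np.array([0.2, 0.3, 0.5])

rng = np.random.default_rng(seed=1)

# Draw many multinomial random vectors X ~ Mult(n, [p_1, ..., p_k])
X = rng.multinomial(n, p, size=200_000)

# Empirical pmf of the first entry X_1
counts = np.bincount(X[:, 0], minlength=n + 1)
empirical_pmf = counts / counts.sum()

# Theoretical pmf of the claimed marginal Bin(n, p_1)
binomial_pmf = stats.binom.pmf(np.arange(n + 1), n, p[0])

# Maximum absolute difference should be close to zero
print(np.abs(empirical_pmf - binomial_pmf).max())
```

The same comparison can be repeated for any other entry $X_i$ by exchanging the column index and the corresponding $p_i$.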
Sources:

Metadata: ID: P485 | shortcut: mult-marg | author: JoramSoch | date: 2025-02-06, 10:12.