Index: The Book of Statistical Proofs ▷ General Theorems ▷ Probability theory ▷ Probability ▷ Probability under independence

Theorem: Let $A$ and $B$ be two statements about random variables. Then, if $A$ and $B$ are independent, marginal and conditional probabilities are equal:

\[\label{eq:prob-ind} \begin{split} p(A) &= p(A|B) \\ p(B) &= p(B|A) \; . \end{split}\]

Proof: If $A$ and $B$ are independent, then the joint probability is equal to the product of the marginal probabilities:

\[\label{eq:ind} p(A,B) = p(A) \cdot p(B) \; .\]

The law of conditional probability states that

\[\label{eq:prob-cond} p(A|B) = \frac{p(A,B)}{p(B)} \; .\]

Combining \eqref{eq:ind} and \eqref{eq:prob-cond}, we have:

\[\label{eq:prob-ind-qed-A} p(A|B) = \frac{p(A) \cdot p(B)}{p(B)} = p(A) \; .\]

Equivalently, we can write:

\[\label{eq:prob-ind-qed-B} p(B|A) \overset{\eqref{eq:prob-cond}}{=} \frac{p(A,B)}{p(A)} \overset{\eqref{eq:ind}}{=} \frac{p(A) \cdot p(B)}{p(A)} = p(B) \; .\]
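The derivation above can be checked numerically. The following sketch (a hypothetical example, not part of the original proof) uses two independent fair dice: $A$ is "the first die is even" and $B$ is "the second die shows a six". Exact rational arithmetic via `fractions.Fraction` confirms that $p(A,B) = p(A) \cdot p(B)$ and hence $p(A|B) = p(A)$ and $p(B|A) = p(B)$.

```python
from fractions import Fraction

# Sample space: all ordered outcomes of two independent fair six-sided dice.
outcomes = [(i, j) for i in range(1, 7) for j in range(1, 7)]
total = Fraction(len(outcomes))

# Event A: first die shows an even number; event B: second die shows a six.
A = {(i, j) for (i, j) in outcomes if i % 2 == 0}
B = {(i, j) for (i, j) in outcomes if j == 6}

# Marginal and joint probabilities from counting outcomes.
p_A = Fraction(len(A)) / total            # 1/2
p_B = Fraction(len(B)) / total            # 1/6
p_AB = Fraction(len(A & B)) / total       # 1/12

# Independence, eq. (ind): p(A,B) = p(A) * p(B).
assert p_AB == p_A * p_B

# Conditional probabilities via eq. (prob-cond): p(A|B) = p(A,B) / p(B).
p_A_given_B = p_AB / p_B
p_B_given_A = p_AB / p_A

# The theorem: conditional probabilities equal the marginals.
assert p_A_given_B == p_A
assert p_B_given_A == p_B
```

Because the dice are independent by construction, the conditional probabilities collapse to the marginals exactly, with no floating-point approximation involved.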

Metadata: ID: P241 | shortcut: prob-ind | author: JoramSoch | date: 2021-07-23, 16:05.