\(\DeclarePairedDelimiterX{\Set}[2]{\{}{\}}{#1 \nonscript\;\delimsize\vert\nonscript\; #2}\) \( \DeclarePairedDelimiter{\set}{\{}{\}}\) \( \DeclarePairedDelimiter{\parens}{\left(}{\right)}\) \(\DeclarePairedDelimiterX{\innerproduct}[1]{\langle}{\rangle}{#1}\) \(\newcommand{\ip}[1]{\innerproduct{#1}}\) \(\newcommand{\bmat}[1]{\left[\hspace{2.0pt}\begin{matrix}#1\end{matrix}\hspace{2.0pt}\right]}\) \(\newcommand{\barray}[1]{\left[\hspace{2.0pt}\begin{matrix}#1\end{matrix}\hspace{2.0pt}\right]}\) \(\newcommand{\mat}[1]{\begin{matrix}#1\end{matrix}}\) \(\newcommand{\pmat}[1]{\begin{pmatrix}#1\end{pmatrix}}\) \(\newcommand{\mathword}[1]{\mathop{\textup{#1}}}\)
Needs:
Mutually Independent Events
Needed by:
None.
Links:
Sheet PDF
Graph PDF

Pairwise Independent Events

Why

Is it sufficient, for the mutual independence of a collection of events, that every pair of events in the collection is independent? The surprising answer is no.

Definition

Suppose $P$ is an event probability function on a finite sample space $\Omega$. The events $A_1, \dots , A_n$ are pairwise independent if for all indices $i \neq j$,

\[ P(A_{i} \cap A_{j}) = P(A_{i}) P(A_{j}). \]

Clearly, if $A_1, \dots , A_n$ are mutually independent, then they are pairwise independent. The converse, however, is not true.
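The pairwise condition can be checked mechanically on a finite sample space; here is a minimal sketch, assuming events are represented as sets of outcomes and the distribution as a dict (the function name `pairwise_independent` is ours, not from the source):

```python
from itertools import combinations

def pairwise_independent(events, p):
    """Check P(A_i ∩ A_j) = P(A_i) P(A_j) for every pair i ≠ j.

    `events` is a list of sets of outcomes; `p` maps each outcome of the
    finite sample space to its probability (assumed illustration).
    """
    P = lambda E: sum(p[w] for w in E)  # probability of an event
    return all(P(A & B) == P(A) * P(B) for A, B in combinations(events, 2))
```

Note that this checks only pairs; as the counterexample below shows, passing this check does not imply mutual independence.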

Counterexample: two tosses

As usual, model tossing a coin twice with the sample space $\Omega = \set{0,1}^2$. Put a distribution $p: \Omega \to [0,1]$ so that $p(\omega ) = 1/4$ for all $\omega \in \Omega $. Define $A = \set{(1,0),(1,1)}$ (the first toss is heads), $B = \set{(0,1),(1,1)}$ (the second toss is heads), and $C = \set{(0,0),(1,1)}$ (the two tosses agree). Then $P(A) = P(B) = P(C) = 1/2$, and

\[ P(A \cap B) = P(B \cap C) = P(A \cap C) = 1/4. \]

Hence $A, B, C$ are pairwise independent. However,

\[ P(A \cap B \cap C) = P(\set{(1,1)}) = 1/4 \neq 1/8 = P(A)P(B)P(C). \]

So $A, B, C$ are not mutually independent.
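The counterexample is small enough to verify by direct enumeration. A minimal sketch, using exact rational arithmetic from Python's standard library (variable names are ours):

```python
from fractions import Fraction
from itertools import product

# Uniform distribution on Ω = {0,1}^2, each outcome with probability 1/4.
Omega = set(product((0, 1), repeat=2))
p = {w: Fraction(1, 4) for w in Omega}
P = lambda E: sum(p[w] for w in E)  # probability of an event

A = {(1, 0), (1, 1)}  # first toss is heads
B = {(0, 1), (1, 1)}  # second toss is heads
C = {(0, 0), (1, 1)}  # the two tosses agree

# Each pair satisfies the product rule...
assert P(A & B) == P(A) * P(B) == Fraction(1, 4)
assert P(B & C) == P(B) * P(C) == Fraction(1, 4)
assert P(A & C) == P(A) * P(C) == Fraction(1, 4)

# ...but the triple intersection {(1,1)} does not.
assert P(A & B & C) == Fraction(1, 4)
assert P(A) * P(B) * P(C) == Fraction(1, 8)
```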

Copyright © 2023 The Bourbaki Authors — All rights reserved — Version 13a6779cc