Independence of events

A condition ensuring that knowledge of one event does not change the probability of another

Two events $A$ and $B$ on a probability space $(\Omega,\mathcal F,\mathbb P)$ are independent if they satisfy

$$\mathbb P(A\cap B) = \mathbb P(A)\,\mathbb P(B).$$

A finite or countable family of events $(A_i)_{i\in I}$ is independent if for every finite subset $\{i_1,\dots,i_n\}\subseteq I$,

$$\mathbb P\!\left(\bigcap_{k=1}^n A_{i_k}\right) = \prod_{k=1}^n \mathbb P(A_{i_k}).$$
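To make the "every finite subset" requirement concrete, here is a minimal Python sketch (the function name and the agreement event $C$ are my own choices) that checks the product condition on a uniform finite sample space. The triple $A, B, C$ below is the classic example that is pairwise independent but not mutually independent, which is why checking only pairs is not enough.

```python
from itertools import combinations, product
from fractions import Fraction

def is_mutually_independent(events, omega):
    """Check the product condition for every subset of events of size >= 2,
    under the uniform measure on the finite sample space `omega`."""
    def prob(e):
        return Fraction(len(e & omega), len(omega))
    for n in range(2, len(events) + 1):
        for subset in combinations(events, n):
            inter = omega
            for e in subset:
                inter = inter & e
            rhs = Fraction(1)
            for e in subset:
                rhs *= prob(e)
            if prob(inter) != rhs:
                return False
    return True

# Two fair coin flips: outcomes are pairs (first, second).
omega = set(product("HT", repeat=2))
A = {w for w in omega if w[0] == "H"}   # first flip heads
B = {w for w in omega if w[1] == "H"}   # second flip heads
C = {w for w in omega if w[0] == w[1]}  # the two flips agree

print(is_mutually_independent([A, B], omega))     # True
print(is_mutually_independent([A, B, C], omega))  # False: each pair passes, the triple fails
```

The triple fails because $\mathbb P(A\cap B\cap C)=\mathbb P(\{HH\})=1/4$, while the product of the three probabilities is $1/8$.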

Independence can be expressed in terms of conditional probability: if $\mathbb P(B)>0$, then $A$ and $B$ are independent exactly when $\mathbb P(A\mid B)=\mathbb P(A)$. This notion extends from events to random variables and to σ-algebras.
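The conditional-probability criterion can be checked directly on the two-coin sample space; a small sketch (event names are my own) computing $\mathbb P(A\mid B)=\mathbb P(A\cap B)/\mathbb P(B)$ with exact rationals:

```python
from fractions import Fraction
from itertools import product

# Two fair coin flips, uniform measure on 4 outcomes.
omega = set(product("HT", repeat=2))
A = {w for w in omega if w[0] == "H"}   # first flip heads
B = {w for w in omega if w[1] == "H"}   # second flip heads
P = lambda e: Fraction(len(e), len(omega))

# P(A | B) = P(A & B) / P(B); independence means this equals P(A).
print(P(A & B) / P(B) == P(A))  # True: conditioning on B leaves P(A) unchanged
```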

Examples:

  • In two independent coin flips, let $A=\{\text{first flip is H}\}$ and $B=\{\text{second flip is H}\}$. Then $\mathbb P(A)=\mathbb P(B)=1/2$ and $\mathbb P(A\cap B)=1/4$, so $A$ and $B$ are independent.
  • For one fair die roll, let $A=\{\text{even}\}$ and $B=\{\text{roll}\le 3\}$. Then $\mathbb P(A)=1/2$, $\mathbb P(B)=1/2$, but $\mathbb P(A\cap B)=\mathbb P(\{2\})=1/6\ne 1/4=\mathbb P(A)\,\mathbb P(B)$, so $A$ and $B$ are not independent.
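The die-roll example can be verified by direct enumeration; a minimal sketch using exact rational arithmetic (the sets below encode the two events):

```python
from fractions import Fraction

# One fair die roll: A = even outcome, B = roll at most 3.
omega = set(range(1, 7))
A = {2, 4, 6}
B = {1, 2, 3}
P = lambda e: Fraction(len(e), len(omega))

print(P(A & B))     # 1/6  (only outcome 2 is in both events)
print(P(A) * P(B))  # 1/4  -> differs from P(A & B), so A and B are not independent
```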