Two events are independent if, on a probability space
(Ω,F,P), they satisfy
P(A∩B) = P(A)P(B).

A finite or countable family of events (Ai)i∈I is independent if for every finite subset {i1,…,in} ⊆ I,

P(Ai1 ∩ … ∩ Ain) = P(Ai1)⋯P(Ain).

Independence can be expressed in terms of conditional probability: if P(B) > 0, then A and B are independent exactly when P(A∣B) = P(A). This notion extends from events to independence of sigma-algebras and to independence of random variables.
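
To make the product condition concrete, here is a minimal sketch in Python (the text itself gives no code) that checks the defining identity for every subfamily of events under the uniform measure on a finite sample space; the helper names `prob` and `is_independent_family` are hypothetical.

```python
from fractions import Fraction
from itertools import combinations

def prob(event, omega):
    """P(E) under the uniform measure on the finite sample space omega."""
    return Fraction(len(event & omega), len(omega))

def is_independent_family(events, omega):
    """Check P(Ai1 ∩ … ∩ Ain) = P(Ai1)⋯P(Ain) for every subfamily of size >= 2."""
    for n in range(2, len(events) + 1):
        for subfamily in combinations(events, n):
            intersection = set(omega)
            product = Fraction(1)
            for a in subfamily:
                intersection &= a
                product *= prob(a, omega)
            if prob(intersection, omega) != product:
                return False
    return True

# Example: two fair coin flips, Ω = {HH, HT, TH, TT}.
omega = {"HH", "HT", "TH", "TT"}
first_heads = {w for w in omega if w[0] == "H"}
second_heads = {w for w in omega if w[1] == "H"}
print(is_independent_family([first_heads, second_heads], omega))  # True
```

Using exact fractions avoids false negatives from floating-point rounding when comparing the two sides of the product formula.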
Examples:
- In two independent coin flips, let A={first flip is H} and B={second flip is H}. Then P(A)=P(B)=1/2 and P(A∩B)=1/4, so A and B are independent.
- For one fair die roll, let A={even} and B={roll≤3}. Then P(A)=1/2, P(B)=1/2, but P(A∩B)=P({2})=1/6 ≠ 1/4 = P(A)P(B), so A and B are not independent.
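
As a quick check of the two examples above, the following sketch recomputes the probabilities with exact fractions; it again assumes the uniform measure on each finite sample space, and `prob` is a hypothetical helper, not something from the text.

```python
from fractions import Fraction

def prob(event, omega):
    """P(E) under the uniform measure on the finite sample space omega."""
    return Fraction(len(event & omega), len(omega))

# Two independent coin flips: Ω = {HH, HT, TH, TT}.
omega = {"HH", "HT", "TH", "TT"}
A = {w for w in omega if w[0] == "H"}   # first flip is H
B = {w for w in omega if w[1] == "H"}   # second flip is H
assert prob(A & B, omega) == prob(A, omega) * prob(B, omega)   # 1/4 = 1/2 · 1/2

# One fair die roll: Ω = {1, …, 6}.
omega = set(range(1, 7))
A = {2, 4, 6}   # even
B = {1, 2, 3}   # roll ≤ 3
assert prob(A & B, omega) != prob(A, omega) * prob(B, omega)   # 1/6 ≠ 1/4
```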