Independence of random variables

The definition of independence for random variables: joint probabilities factorize into products of marginals.

A family of random variables $(X_i)_{i\in I}$ on a probability space $(\Omega,\mathcal F,\mathbb P)$ is independent if for every finite choice of indices $i_1,\dots,i_k\in I$ and every choice of Borel sets $A_1,\dots,A_k\subseteq\mathbb R$,

$$\mathbb P\big(X_{i_1}\in A_1,\dots,X_{i_k}\in A_k\big)=\prod_{j=1}^k \mathbb P\big(X_{i_j}\in A_j\big).$$

This says that all events of the form $\{X_i\in A\}$ behave like independent events under $\mathbb P$. Equivalently, the $\sigma$-algebras $\sigma(X_i)$ generated by the variables are independent.
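
In particular, for a pair of random variables the definition unwinds to a single condition: $X$ and $Y$ are independent exactly when

$$\mathbb P(X\in A,\,Y\in B)=\mathbb P(X\in A)\,\mathbb P(Y\in B)$$

for all Borel sets $A,B\subseteq\mathbb R$.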

Examples:

  • Let $\Omega=\{0,1\}^2$ with $\mathbb P$ uniform, and define $X(\omega_1,\omega_2)=\omega_1$, $Y(\omega_1,\omega_2)=\omega_2$. Then $X$ and $Y$ are independent, and each is a fair coin flip: every outcome has probability $1/4=\tfrac12\cdot\tfrac12$ (see the check after this list).
  • Let $\Omega=[0,1]^2$ with the product Lebesgue measure (already a probability measure, since the square has unit area), and set $X(u,v)=u$, $Y(u,v)=v$. Then $X$ and $Y$ are independent and each has the uniform distribution on $[0,1]$.
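
Because the first example has a finite sample space, the defining factorization can be verified by exhaustive enumeration. The following is a minimal Python sketch of that check (the helper `P` and the event encoding are illustrative, not from the text):

```python
from itertools import product

# Sample space of the first example: Omega = {0,1}^2 with the uniform measure.
omega = list(product([0, 1], repeat=2))
prob = {w: 1 / len(omega) for w in omega}  # each outcome has mass 1/4

X = lambda w: w[0]  # X(omega_1, omega_2) = omega_1
Y = lambda w: w[1]  # Y(omega_1, omega_2) = omega_2

def P(event):
    """Probability of the set of outcomes w with event(w) == True."""
    return sum(p for w, p in prob.items() if event(w))

# Check P(X in A, Y in B) = P(X in A) * P(Y in B) for all subsets A, B of {0,1};
# for {0,1}-valued variables these subsets exhaust the relevant Borel events.
subsets = [set(), {0}, {1}, {0, 1}]
for A in subsets:
    for B in subsets:
        joint = P(lambda w: X(w) in A and Y(w) in B)
        assert abs(joint - P(lambda w: X(w) in A) * P(lambda w: Y(w) in B)) < 1e-12

print("factorization holds for every pair of events")
```

The second example has an uncountable sample space, so it cannot be checked by enumeration; one could only probe it approximately, e.g. by Monte Carlo sampling of $[0,1]^2$.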