Events A and B are _ #~{independent} _ if _ _ _ P( A &intersect. B ) _ = _ P( A ) P( B )
A collection of events ( A_~i ) _{~{i}&in. I} is said to be #~{independent} if
P( intersect{A_~i,~{i}&in. J, _ } ) _ = _ prod{P( A_~i ),~{i}&in. J, _ } _ _ _ _ any finite J &subseteq. I
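As a quick numerical check (not part of the original notes), the definition can be verified on a small finite sample space. The two-dice sample space, the events A and B, and the helper function P below are chosen purely for illustration.

```python
from fractions import Fraction
from itertools import product

# Two fair dice: every ordered pair of rolls has probability 1/36.
omega = list(product(range(1, 7), repeat=2))
prob = {w: Fraction(1, 36) for w in omega}

def P(event):
    """Probability of an event, given as a set of outcomes."""
    return sum(prob[w] for w in event)

A = {w for w in omega if w[0] % 2 == 0}    # first roll is even
B = {w for w in omega if w[1] > 4}         # second roll is a five or six

# Independence: P(A n B) = P(A) P(B).
print(P(A & B) == P(A) * P(B))             # True
```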
Events ( A_~i ) _{~{i}&in. I} are said to be #~{pairwise independent} if
P( A_~i &intersect. A_~j ) _ = _ P( A_~i ) P( A_~j ) _ _ _ _ any ~i ≠ ~j
Independence &imply. pairwise independence, but the converse is not true.
i.e. P( A_~i &intersect. A_~j ) = P( A_~i ) P( A_~j ) &forall. ~i ≠ ~j does not imply that ( A_~i ) _{~{i}&in. I} are independent.
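The standard counterexample is two fair coin tosses with the three events "first toss is heads", "second toss is heads", and "both tosses agree". The short Python sketch below (an illustration, not part of the notes) checks that these events are pairwise independent but not independent.

```python
from fractions import Fraction
from itertools import combinations

# Two fair coin tosses, each outcome with probability 1/4.
omega = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]
prob = {w: Fraction(1, 4) for w in omega}

def P(event):
    return sum(prob[w] for w in event)

A1 = {w for w in omega if w[0] == "H"}     # first toss is heads
A2 = {w for w in omega if w[1] == "H"}     # second toss is heads
A3 = {w for w in omega if w[0] == w[1]}    # the two tosses agree

# Pairwise independence holds for every pair.
for Ai, Aj in combinations([A1, A2, A3], 2):
    assert P(Ai & Aj) == P(Ai) * P(Aj)

# But the collection is not independent.
print(P(A1 & A2 & A3))          # 1/4
print(P(A1) * P(A2) * P(A3))    # 1/8
```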
If A and B are events, and P( B ) ≠ 0, then we define the #~{conditional probability} of A given B as:
P( A | B ) _ = _ P( A &intersect. B ) ./ P( B )
Note that if A and B are independent then P( A | B ) = P( A ) .
The map Q: @A &rightarrow. [0, 1] given by Q( A ) = P( A | B ) is a probability measure, called the conditional probability given B.
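A small numerical illustration (assumed, not from the notes): on the two-dice sample space, conditioning on a large total changes the probability that the first roll is a six, while conditioning on an independent event does not. The helper name P_given is hypothetical; it just implements the defining formula above.

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))     # two fair dice
prob = {w: Fraction(1, 36) for w in omega}

def P(event):
    return sum(prob[w] for w in event)

def P_given(A, B):
    """P(A | B) = P(A n B) / P(B), assuming P(B) != 0."""
    return P(A & B) / P(B)

A = {w for w in omega if w[0] == 6}          # first roll is a six
B = {w for w in omega if sum(w) >= 10}       # total is at least ten
C = {w for w in omega if w[1] % 2 == 0}      # second roll is even (independent of A)

print(P(A))                    # 1/6
print(P_given(A, B))           # 1/2: conditioning on B changes the probability
print(P_given(A, C) == P(A))   # True: conditioning on an independent event does not
```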
If ( E_~i )_{~i &in. I} ( I not necessarily finite, but at most countable ) is a partition of &Omega. [i.e. E_~i &intersect. E_~j = &empty. for ~i ≠ ~j, and &union._{~i} E_~i = &Omega.], then for any A &in. @A
P( A ) _ = _ sum{ P( A | E_~i ) P( E_~i ), ~i &in. I, _ }
Proof
&sum._~i P( A | E_~i ) P( E_~i ) = &sum._~i P( A &intersect. E_~i ) = P( &union._~i ( A &intersect. E_~i ) ) _ [E_~i disjoint so ( A &intersect. E_~i ) disjoint] _ = P( A &intersect. &union._~i E_~i ) = P( A &intersect. &Omega. ) = P( A )
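A numerical check of the partition result, again on the illustrative two-dice sample space: partitioning by the value of the first roll and summing P( A | E_i ) P( E_i ) recovers P( A ).

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))     # two fair dice
prob = {w: Fraction(1, 36) for w in omega}

def P(event):
    return sum(prob[w] for w in event)

def P_given(A, B):
    return P(A & B) / P(B)

# Partition of Omega by the value of the first roll.
E = {i: {w for w in omega if w[0] == i} for i in range(1, 7)}

A = {w for w in omega if sum(w) >= 10}           # total is at least ten

total = sum(P_given(A, E[i]) * P(E[i]) for i in E)
print(total == P(A))                             # True
```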
If ( E_~i )_{~i &in. I} is a partition of &Omega., then for any A &in. @A with P( A ) ≠ 0
P( E_~j | A ) _ = _ fract{ P( A | E_~j ) P( E_~j ) ,sum{ P( A | E_~i ) P( E_~i ), ~i &in. I, } }
Proof
P( E_~j | A ) _ = _ P( E_~j &intersect. A ) ./ P( A ) _ = _ P( A | E_~j ) P( E_~j ) ./ P( A ) _ = _ P( A | E_~j ) P( E_~j ) ./ &sum._~i P( A | E_~i ) P( E_~i ) _ [the last step expands P( A ) using the partition result above]
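To illustrate Bayes' theorem on the same (assumed) dice setup: given that the two dice total at least ten, the posterior probability that the first roll was a six can be computed either from the formula above or directly from the definition of conditional probability; the two agree.

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))     # two fair dice
prob = {w: Fraction(1, 36) for w in omega}

def P(event):
    return sum(prob[w] for w in event)

def P_given(A, B):
    return P(A & B) / P(B)

E = {i: {w for w in omega if w[0] == i} for i in range(1, 7)}   # partition by first roll
A = {w for w in omega if sum(w) >= 10}                          # total is at least ten

# Bayes' theorem: posterior probability that the first roll was a six, given A.
numerator = P_given(A, E[6]) * P(E[6])
denominator = sum(P_given(A, E[i]) * P(E[i]) for i in E)
print(numerator / denominator)     # 1/2
print(P_given(E[6], A))            # 1/2, the same value computed directly
```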