~X, ~Y random variables, their joint distribution
F_{~X,~Y} ( ~x , ~y ) _ = _ P ( ~X =< ~x , ~Y =< ~y )
Generalizing: If _ ~#X = ( ~X_1 , ... , ~X_~n) , _ where ~X_~i are random variables,
F_{~#X} ( ~#x ) _ = _ P ( ~X_~i =< ~x_~i ; ~i = 1, ... ~n )
~#X is called a #~{random vector}. The component random variables can be a mix of discrete and continuous, though such mixed vectors are rarely treated. ~#X is called a #~{continuous random vector} if all the components are continuous, or a #~{discrete random vector} if all the components are discrete.
If ~#X is a discrete random vector, the #~{joint probability function} _ ~p ( ~#x ) _ is defined
~p ( ~#x ) _ _ #:= _ _ P ( ~X_1 = ~x_1 , ~X_2 = ~x_2 , ... , ~X_~n = ~x_~n ) , _ _ _ ~x_~i &in. &Omega._{~X_~i} , _ &forall. ~i
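The joint probability function and the joint distribution it determines can be sketched concretely. Below, a hypothetical joint pmf for a discrete random vector (X, Y) on {0,1}&times.{0,1} (chosen purely for illustration) is stored as a dictionary, and the joint distribution F(x, y) = P(X =< x, Y =< y) is recovered by summation:

```python
from fractions import Fraction

# Hypothetical joint pmf p(x, y) = P(X = x, Y = y) on a small finite support.
joint_pmf = {
    (0, 0): Fraction(1, 4),
    (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 4),
    (1, 1): Fraction(1, 4),
}

# A valid joint pmf is non-negative and sums to 1 over the support.
assert all(p >= 0 for p in joint_pmf.values())
assert sum(joint_pmf.values()) == 1

# Joint distribution F(x, y) = P(X <= x, Y <= y), obtained by summing the pmf.
def joint_cdf(x, y):
    return sum(p for (u, v), p in joint_pmf.items() if u <= x and v <= y)

print(joint_cdf(0, 1))  # P(X <= 0, Y <= 1) = 1/2
```

Exact rational arithmetic (`Fraction`) keeps the sanity checks free of floating-point noise.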
If _ _ _ F_{~#X} ( ~x_1 , ... , ~x_~n ) _ = _ int{ ... ,-&infty.,~x_1,}int{,-&infty.,~x_~n,} ~f ( ~u_1 , ... , ~u_~n ) d~u_~n ... d~u_1
then _ ~f ( ~u_1 , ... , ~u_~n ) is called the #~{joint density function} (with respect to integration).
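The defining integral can be checked numerically for a density with a known closed-form distribution. As an illustrative assumption, take ~f(u, v) = e^{-u-v} on [0, &infty.)^2 (two independent Exp(1) components), for which F(x, y) = (1 - e^{-x})(1 - e^{-y}); the double integral is approximated by a midpoint Riemann sum:

```python
import math

# Assumed joint density for illustration: f(u, v) = exp(-u - v) on [0, inf)^2.
def f(u, v):
    return math.exp(-u - v) if u >= 0 and v >= 0 else 0.0

# F(x, y) = int_{-inf}^{x} int_{-inf}^{y} f(u, v) dv du,
# approximated by a midpoint rule on an n-by-n grid over [0, x] x [0, y].
def joint_cdf(x, y, n=400):
    hx, hy = x / n, y / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += f((i + 0.5) * hx, (j + 0.5) * hy)
    return total * hx * hy

# Closed form for this particular density: F(x, y) = (1 - e^{-x})(1 - e^{-y}).
approx = joint_cdf(1.0, 2.0)
exact = (1 - math.exp(-1.0)) * (1 - math.exp(-2.0))
print(abs(approx - exact) < 1e-4)  # True
```

Any density integrating to 1 would do; the exponential example is chosen only because its distribution factors into elementary functions.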
If ~#X = ( ~X_1 , ... , ~X_~n ) is a random vector, the #~{marginal distribution} of ~X_~i is defined:
F_{~X_~i} ( ~x_~i ) _ #:= _ F_{~#X} ( &infty. , &infty. , ... , ~x_~i , ... , &infty. )
For a discrete random vector
F_{~i} ( ~a ) _ = _ sum{ ... ,~x_1 = -&infty.,&infty.}sum{ ... ,~x_~i = -&infty.,~a}sum{,~x_~n = -&infty.,&infty.} rndb{ {P ( ~X_1 = ~x_1 , ~X_2 = ~x_2 , ... , ~X_~n = ~x_~n )} }
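In the discrete case, the ~i^{th} marginal is computed by summing the joint probability function over all the other coordinates. A minimal sketch, using a hypothetical joint pmf chosen for illustration:

```python
from collections import defaultdict
from fractions import Fraction

# Hypothetical joint pmf of (X1, X2); any finite support would do.
joint_pmf = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(3, 8), (1, 1): Fraction(1, 8),
}

# Marginal pmf of component i: sum the joint pmf over the other coordinates.
def marginal_pmf(pmf, i):
    out = defaultdict(Fraction)
    for outcome, p in pmf.items():
        out[outcome[i]] += p
    return dict(out)

# Marginal distribution F_i(a) = P(X_i <= a).
def marginal_cdf(pmf, i, a):
    return sum(p for x, p in marginal_pmf(pmf, i).items() if x <= a)

print(marginal_pmf(joint_pmf, 0))   # {0: 1/2, 1: 1/2}
print(marginal_cdf(joint_pmf, 1, 0))  # P(X2 <= 0) = 1/2
```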
For a continuous random vector the ~i^{th} marginal distribution is:
F_{~i} ( ~a ) _ = _ int{ ... ,-&infty.,&infty.,}int{ ... ,~u_~i = -&infty.,~a,}int{,-&infty.,&infty.,} ~f ( ~u_1 , ... , ~u_~n ) _ d~u_~n ... d~u_1
The ~i^{th} marginal density is given by
f_{~i} ( ~x ) _ = _ fract{dF_{~i},d ~x} ( ~x )
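Equivalently, the ~i^{th} marginal density is obtained by integrating the joint density over the remaining variables. A numeric sketch, again assuming the illustrative density ~f(u, v) = e^{-u-v} on [0, &infty.)^2, whose first marginal is the Exp(1) density e^{-x}:

```python
import math

# Assumed joint density for illustration: f(u, v) = exp(-u - v) on [0, inf)^2.
def f(u, v):
    return math.exp(-u - v) if u >= 0 and v >= 0 else 0.0

# f_1(x) = integral of f(x, v) dv over v, truncated at a large upper limit
# (the tail beyond it is negligible for this density).
def marginal_density(x, upper=30.0, n=30000):
    h = upper / n
    return sum(f(x, (j + 0.5) * h) for j in range(n)) * h

# For this particular density the marginal is Exp(1): f_1(x) = exp(-x).
print(abs(marginal_density(1.0) - math.exp(-1.0)) < 1e-6)  # True
```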
Random variables ~X and ~Y are #~{independent} if
F_{~X,~Y} ( ~x , ~y ) _ = _ F_~X ( ~x ) F_~Y ( ~y ) , _ _ &all. ~x , ~y
If ~X and ~Y are discrete, then
~X and ~Y independent _ <=> _ P ( ~X = ~x , ~Y = ~y ) _ = _ P ( ~X = ~x ) P ( ~Y = ~y ) , _ &all. ~x , ~y
If ~X and ~Y are continuous, then
~X and ~Y independent _ <=> _ f_{~X,~Y} ( ~x , ~y ) _ = _ f_~X ( ~x ) f_~Y ( ~y ) , _ _ &all. ~x , ~y
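The discrete criterion is directly checkable: compute both marginals from the joint pmf and test the factorization at every point. Below, two hypothetical joint pmfs (two fair coins, jointly uniform, versus two perfectly correlated coins) illustrate both outcomes:

```python
from fractions import Fraction

def is_independent(joint_pmf):
    # Marginal pmfs of X and Y, obtained by summing out the other variable.
    xs = sorted({x for x, _ in joint_pmf})
    ys = sorted({y for _, y in joint_pmf})
    px = {x: sum(joint_pmf.get((x, y), 0) for y in ys) for x in xs}
    py = {y: sum(joint_pmf.get((x, y), 0) for x in xs) for y in ys}
    # Independence <=> the joint pmf factors at every (x, y).
    return all(joint_pmf.get((x, y), 0) == px[x] * py[y]
               for x in xs for y in ys)

# Two fair coins, jointly uniform: independent.
indep = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}
# Perfectly correlated coins: P(X = Y) = 1, so not independent.
dep = {(0, 0): Fraction(1, 2), (1, 1): Fraction(1, 2)}

print(is_independent(indep))  # True
print(is_independent(dep))    # False
```

Note that the check runs over the full product of the supports: a missing pair like (0, 1) in the dependent example contributes probability 0, which must still equal the product of the marginals for independence to hold.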
_
Generalizing: random variables ~X_1 , ... , ~X_~n are #~{independent} if
F_{~#X} ( ~x_1 , ... , ~x_~n ) _ = _ F_1 ( ~x_1 ) F_2 ( ~x_2 ) ... F_~n ( ~x_~n ) , _ _ &all. ~x_1 , ... , ~x_~n
The results for the probability functions and densities extend to the general case.