Variance & Covariance

Expectation as a Linear Operator

For any random variables ~X and ~Y (not only discrete), provided they have finite expectations, we have:

E ( ~a~X + ~b~Y ) _ _ = _ _ ~a E~X + ~b E~Y _

In particular

E ( ~c ) _ _ = _ _ ~c _ _ _ when ~c is a constant.
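Linearity holds exactly for sample averages too, which gives a quick numerical check. A minimal sketch (the choice of distributions for ~X and ~Y, and the constants ~a and ~b, are illustrative, not from the notes):

```python
import random

random.seed(0)

# X uniform on {1,...,6} (a die roll), Y uniform on [0, 1) --
# both are arbitrary illustrative choices.
N = 100_000
xs = [random.randint(1, 6) for _ in range(N)]
ys = [random.random() for _ in range(N)]

a, b = 2.0, -3.0
mean = lambda v: sum(v) / len(v)

lhs = mean([a * x + b * y for x, y in zip(xs, ys)])   # E(aX + bY)
rhs = a * mean(xs) + b * mean(ys)                     # a EX + b EY
print(abs(lhs - rhs))  # essentially zero (exact up to float rounding)
```

The two sides agree exactly (up to floating-point rounding) because averaging is itself a linear operation.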

Covariance and Variance

The #~{variance} of a random variable ~X is defined as:

var ~X _ _ #:= _ _ E ( ~X - E~X )^2 _ _ >= _ _ 0


If ~X and ~Y are joint random variables, their #~{covariance} is defined as:

cov ( ~X , ~Y ) _ _ #:= _ _ E ( ~X - E~X )( ~Y - E~Y )

Note that:

var ~X _ _ = _ _ cov ( ~X , ~X ) _ _ = _ _ E ( ~X - E~X )^2
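Both definitions can be computed directly from a small joint probability mass function. A sketch (the joint pmf below, two independent fair bits, is an illustrative choice):

```python
# Variance and covariance straight from the definitions, for a small
# discrete joint distribution (values chosen for illustration).
pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}

EX = sum(p * x for (x, y), p in pmf.items())
EY = sum(p * y for (x, y), p in pmf.items())

var_X  = sum(p * (x - EX) ** 2 for (x, y), p in pmf.items())
cov_XY = sum(p * (x - EX) * (y - EY) for (x, y), p in pmf.items())

# cov(X, X) recovers var X, as the identity above states
cov_XX = sum(p * (x - EX) * (x - EX) for (x, y), p in pmf.items())
print(var_X, cov_XY, cov_XX)  # 0.25 0.0 0.25
```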

We have

cov ( ~X , ~Y ) _ = _ E ( ~X~Y - ~Y E~X - ~X E~Y + E~X E~Y ) _ = _ E ( ~X~Y ) - E~Y E~X - E~X E~Y + E~X E~Y _ = _ E ( ~X~Y ) - E~X E~Y

cov ( ~X , ~Y ) _ = _ cov ( ~Y , ~X )

cov ( ~a~X , ~Y ) _ = _ ~a cov ( ~X , ~Y )

var ~X _ _ = _ _ E ( ~X ^2 ) - ( E~X )^2

var ~X _ >= _ 0 _ _ => _ _ E ( ~X ^2 ) _ >= _ ( E~X )^2

var ( ~a~X ) _ = _ E ( ( ~a~X ) ^2 ) - ( E ( ~a~X ) )^2 _ = _ ~a^2 ( E ( ~X ^2 ) - ( E~X )^2 ) _ = _ ~a^2 var ~X
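The shortcut formulas above are easy to verify numerically. A sketch using a die roll for ~X (the distribution and the constant ~a are illustrative):

```python
# Checking var X = E(X^2) - (EX)^2 and var(aX) = a^2 var X
# on X uniform over {1,...,6}.
pmf = {x: 1 / 6 for x in range(1, 7)}

EX  = sum(p * x for x, p in pmf.items())
EX2 = sum(p * x * x for x, p in pmf.items())
var_X = sum(p * (x - EX) ** 2 for x, p in pmf.items())

assert abs(var_X - (EX2 - EX ** 2)) < 1e-12   # var X = E(X^2) - (EX)^2

a = 3.0
var_aX = sum(p * (a * x - a * EX) ** 2 for x, p in pmf.items())
assert abs(var_aX - a ** 2 * var_X) < 1e-9    # var(aX) = a^2 var X
print(var_X)  # 35/12, about 2.9167
```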

var ( -~X ) _ _ = _ _ var ~X

var ( ~X + ~Y ) _ = _ E ( ( ~X + ~Y ) ^2 ) - ( E( ~X + ~Y ) )^2 _ = _ E ( ~X ^2 ) + E ( ~Y ^2 ) + 2 E ( ~X~Y ) - ( E~X )^2 - ( E~Y )^2 - 2 E~X E~Y

_ _ _ _ _ _ _ = _ var ~X + var ~Y + 2 cov ( ~X , ~Y )

var ( ~X - ~Y ) _ = _ var ~X + var ~Y - 2 cov ( ~X , ~Y )
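When ~X and ~Y are dependent, the covariance term matters. A sketch on a joint pmf with positively correlated bits (the particular probabilities are an illustrative choice):

```python
# var(X +/- Y) = var X + var Y +/- 2 cov(X, Y), checked on a joint pmf
# where Y depends on X so cov(X, Y) is nonzero (values are illustrative).
pmf = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}

E = lambda f: sum(p * f(x, y) for (x, y), p in pmf.items())
EX, EY = E(lambda x, y: x), E(lambda x, y: y)
var = lambda f, m: E(lambda x, y: (f(x, y) - m) ** 2)

var_X = var(lambda x, y: x, EX)
var_Y = var(lambda x, y: y, EY)
cov_XY = E(lambda x, y: (x - EX) * (y - EY))

var_sum  = var(lambda x, y: x + y, EX + EY)
var_diff = var(lambda x, y: x - y, EX - EY)
assert abs(var_sum  - (var_X + var_Y + 2 * cov_XY)) < 1e-12
assert abs(var_diff - (var_X + var_Y - 2 * cov_XY)) < 1e-12
```

Here cov(~X, ~Y) = 0.15, so var(~X + ~Y) = 0.8 while var(~X - ~Y) = 0.2: adding positively correlated variables inflates the variance, subtracting them cancels some of it.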

Covariance of Independent Random Variables

If ~X and ~Y are independent, then

E( ~X~Y ) _ _ = _ _ E~X E~Y

_ => _ _ cov ( ~X , ~Y ) _ _ = _ _ 0

[ Note _ _ cov ( ~X , ~Y ) _ = _ 0 _ _ does not necessarily imply that ~X and ~Y are independent. ]

=> _ _ var ( ~X + ~Y ) _ _ = _ _ var ~X + var ~Y _ _ _ and _ _ _ var ( ~X - ~Y ) _ _ = _ _ var ~X + var ~Y
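The note above that zero covariance does not imply independence can be made concrete with a standard counterexample (~X uniform on {-1, 0, 1} with ~Y = ~X^2; this example is not from the notes' text):

```python
# Uncorrelated does not imply independent: X uniform on {-1, 0, 1}
# and Y = X^2 (a standard counterexample).
pmf = {-1: 1 / 3, 0: 1 / 3, 1: 1 / 3}

EX  = sum(p * x for x, p in pmf.items())            # 0
EY  = sum(p * x * x for x, p in pmf.items())        # E(X^2) = 2/3
EXY = sum(p * x * x * x for x, p in pmf.items())    # E(X * X^2) = E(X^3) = 0

cov_XY = EXY - EX * EY
print(cov_XY)  # 0.0 -- yet Y is a function of X, so they are dependent:
# P(X=1, Y=0) = 0, but P(X=1) * P(Y=0) = (1/3)(1/3) != 0.
```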

Correlation

Define the #~{correlation coefficient}, _ &rho. , _ between two random variables ~X and ~Y as:

&rho. ( ~X , ~Y ) _ _ #:= _ _ fract{cov ( ~X {,} ~Y ) , &sqrt.${ var ~X var ~Y _ }}

_ _ _ _ _ _ _ _ = _ _ fract{E ( ( ~X - E~X )( ~Y - E~Y ) ), &sqrt.${ E ( ~X - E~X )&powtwo. _ E ( ~Y - E~Y )&powtwo.}}
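The definition can be evaluated directly on a small joint pmf. A sketch (the joint distribution below is an illustrative choice):

```python
import math

# rho(X, Y) = cov(X, Y) / sqrt(var X * var Y), computed on a small
# joint pmf of two correlated bits (values are illustrative).
pmf = {(0, 0): 0.4, (1, 1): 0.4, (0, 1): 0.1, (1, 0): 0.1}

E = lambda f: sum(p * f(x, y) for (x, y), p in pmf.items())
EX, EY = E(lambda x, y: x), E(lambda x, y: y)

cov_XY = E(lambda x, y: (x - EX) * (y - EY))
var_X  = E(lambda x, y: (x - EX) ** 2)
var_Y  = E(lambda x, y: (y - EY) ** 2)

rho = cov_XY / math.sqrt(var_X * var_Y)
print(rho)  # 0.6 for this pmf; rho always lies in [-1, 1]
```

Dividing by &sqrt.${ var ~X var ~Y } makes &rho. scale-free: replacing ~X by ~a~X (for ~a > 0) leaves it unchanged.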