If ~X is a discrete random variable, the #~{probability generating function} ( #~{p.g.f.} ) of ~X is a function , _ &Pi._~X #: [ -1 , 1 ] -> &reals. , _ defined as
&Pi._~X (~t) _ _ #:= _ _ E( ~t ^~X )
If the values of ~X are non-negative integers, then the p.g.f. is given by the power series:
&Pi._~X (~t) _ = _ sum{ _ ~p_~x ~t ^~x,~x = 0 , &infty.} _ = _ _ ~p_0 _ + _ ~p_1 ~t _ + _ ~p_2 ~t ^2 _ + _ ...
which converges for _ | ~t | <= 1 . _ Note that for ~t = 1 _ we have _ &Pi._~X (1) _ = _ &sum._~x ~p_~x _ = _ 1.
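As a quick numerical illustration, here is a minimal Python sketch (the Poisson distribution with rate &lambda. = 2 is an assumed example, and the series is truncated at 60 terms); it checks that &Pi._~X (1) = 1 and that the series agrees with the known closed form _ e ^{&lambda. ( ~t - 1 )} _ of the Poisson p.g.f.

    import math

    lam = 2.0                                    # assumed example rate

    def pmf(x):
        # Poisson probabilities  p_x = e^(-lam) lam^x / x!
        return math.exp(-lam) * lam**x / math.factorial(x)

    def pgf(t, terms=60):
        # Pi_X(t) = sum over x of p_x t^x, truncated at `terms` terms
        return sum(pmf(x) * t**x for x in range(terms))

    print(pgf(1.0))                              # ~ 1.0, since sum of p_x is 1
    print(pgf(0.5), math.exp(lam * (0.5 - 1)))   # matches e^(lam (t - 1))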
Conversely, if &exist. a sequence \{ ~a_~x \}_{~x &in. Z^+} _ such that _ ~a_~x >= 0 , &forall. ~x , _ and _ &sum._~x ~a_~x _ = _ 1 , _ then this defines an integer-valued distribution, and _ &Pi._~X #: [ -1 , 1 ] -> &reals. , _ defined by _ &Pi._~X (~t) = &sum._~x _ ~a_~x ~t ^~x , _ is its p.g.f.
So a p.g.f. is uniquely determined by, and uniquely determines, the distribution of a random variable.
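The "determines" direction can be made concrete: since &Pi._~X is a power series, the probabilities are its Taylor coefficients at 0, _ ~p_~x = &Pi._~X^{(~x)} (0) / ~x ! . _ A sketch recovering them symbolically (assuming sympy is available; the Poisson p.g.f. serves purely as an example):

    import sympy as sp

    t = sp.symbols('t')
    Pi = sp.exp(2 * (t - 1))        # example p.g.f. (here: Poisson with rate 2)

    # p_x = (d/dt)^x Pi(t) evaluated at t = 0, divided by x!
    for x in range(5):
        p_x = sp.diff(Pi, t, x).subs(t, 0) / sp.factorial(x)
        print(x, sp.simplify(p_x))  # prints e^(-2) 2^x / x!, the Poisson pmf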
Differentiating with respect to ~t
&Pi.'_~X (~t) _ = _ sum{~x ~p_~x ~t ^{~x - 1},~x = 1 , &infty.}
and
&Pi.''_~X (~t) _ = _ sum{~x ( ~x - 1 ) ~p_~x ~t ^{~x - 2},~x = 2 , &infty.}
In particular
&Pi.'_~X (1) _ = _ sum{~x ~p_~x ,~x = 1 , &infty.} _ = _ E ~X _ _ _ _ _ [ the term for ~x = 0 being zero ]
and
&Pi.''_~X (1) _ = _ sum{( ~x^2 - ~x ) ~p_~x,~x = 2 , &infty.}
_ _ _ = _ sum{( ~x^2 - ~x ) ~p_~x,~x = 0 , &infty.} _ _ _ _ [ as ( ~x^2 - ~x ) = 0 _ for _ ~x = 0 _ and _ ~x = 1 ]
_ _ _ = _ E ( ~X ^2 ) _ - _ E ~X
Now _ _ var ~X _ _ = _ _ E ( ~X ^2 ) _ - _ ( E ~X )^2
So we have:
&mu. _ _ = _ _ E ~X _ _ = _ _ &Pi.'_~X (1)
var ~X _ _ = _ _ &Pi.''_~X (1) _ + _ &mu. _ - _ &mu.^2
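These two formulas are easy to check numerically. A minimal Python sketch (assumed example: the binomial distribution with ~n = 5 , ~p = 0.3 , whose mean ~n~p = 1.5 and variance ~n~p ( 1 - ~p ) = 1.05 are known):

    import math

    n, p = 5, 0.3                   # assumed example parameters
    pmf = [math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1)]

    d1 = sum(x * pmf[x] for x in range(n + 1))            # Pi'_X(1)  = E X
    d2 = sum(x * (x - 1) * pmf[x] for x in range(n + 1))  # Pi''_X(1) = E X^2 - E X

    mu = d1
    var = d2 + mu - mu**2
    print(mu, var)                  # 1.5 and 1.05, i.e. np and np(1 - p)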
If ~X and ~Y are independent discrete random variables, with p.g.f.s &Pi._~X and &Pi._~Y respectively, and &alpha. , &beta. &in. Z^+, then
_ _ _ _ &Pi._{&alpha.~X + &beta.~Y} ( ~t ) _ = _ &Pi._~X ( ~t ^&alpha. ) _ &Pi._~Y ( ~t ^&beta. )
Proof:
_ _ _ _ &Pi._{&alpha.~X + &beta.~Y} ( ~t ) _ = _ E( ~t ^{&alpha.~X + &beta.~Y} ) _ = _ E( ( ~t ^&alpha. )^~X _ ( ~t ^&beta. )^~Y ) _ = _ E( ( ~t ^&alpha. )^~X ) _ E( ( ~t ^&beta. )^~Y ) _ = _ &Pi._~X ( ~t ^&alpha. ) _ &Pi._~Y ( ~t ^&beta. )
where the factorisation of the expectation uses the independence of ~X and ~Y.
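A numerical spot-check of the theorem (assumed example: independent Bernoulli variables with parameters 0.3 and 0.6, and &alpha. = 2 , &beta. = 3):

    p, q = 0.3, 0.6                 # assumed Bernoulli parameters
    alpha, beta = 2, 3

    def pgf_bern(r, t):
        # p.g.f. of Bernoulli(r):  (1 - r) + r t
        return (1 - r) + r * t

    # pmf of alpha X + beta Y by direct enumeration of the four outcomes
    pmf = {}
    for x, px in [(0, 1 - p), (1, p)]:
        for y, py in [(0, 1 - q), (1, q)]:
            z = alpha * x + beta * y
            pmf[z] = pmf.get(z, 0.0) + px * py

    t = 0.7
    lhs = sum(pz * t**z for z, pz in pmf.items())       # Pi_{aX+bY}(t)
    rhs = pgf_bern(p, t**alpha) * pgf_bern(q, t**beta)  # Pi_X(t^a) Pi_Y(t^b)
    print(lhs, rhs)                 # equal (approx. 0.51311)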
Corollary:
If _ ~X_~i , ~i = 1 , ... , ~n , is a sequence of independent, identically distributed, discrete random variables, each with p.g.f. &Pi._~X , _ and _ ~Z = &sum._~i ~X_~i , _ then
_ _ _ _ &Pi._~Z ( ~t ) _ = _ ( &Pi._~X ( ~t ) )^~n _ _ _ _ [ applying the theorem ~n - 1 times, with &alpha. = &beta. = 1 ]
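A spot-check of the corollary (assumed example: ~n = 4 i.i.d. Bernoulli( 0.25 ) summands, so ~Z is binomial( 4 , 0.25 ) and both sides have closed forms):

    import math

    n, p, t = 4, 0.25, 0.8          # assumed example: Z ~ Binomial(4, 0.25)

    lhs = sum(math.comb(n, z) * p**z * (1 - p)**(n - z) * t**z
              for z in range(n + 1))      # Pi_Z(t) from Z's pmf directly
    rhs = ((1 - p) + p * t)**n            # (Pi_X(t))^n for Bernoulli X
    print(lhs, rhs)                       # equal, by the binomial theorem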