Taylor and Maclaurin Theorems

Taylor's Theorem

Consider _ p ( ~x ) _ = _ ~x ( ~x + 1 ) ( ~x - 2 ) _ = _ ~x^3 - ~x^2 - 2~x _ = _ ( ~x - 1 )^3 + 2 ( ~x - 1 )^2 - ( ~x - 1 ) - 2

In fact any polynomial of degree ~n can be expressed in powers of _ ( ~x - ~a ) , _ for any choice of ~a :

p ( ~x ) _ = _ sum{~c_~r ( ~x - ~a )^~r ,~r = 0,~n}
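A quick numerical check of the two forms of _ p ( ~x ) _ above; a minimal Python sketch:

```python
# Check that x^3 - x^2 - 2x equals (x - 1)^3 + 2(x - 1)^2 - (x - 1) - 2
# by comparing the two forms at a few sample points.
def p(x):
    return x**3 - x**2 - 2*x

def p_about_1(x):
    u = x - 1
    return u**3 + 2*u**2 - u - 2

for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    assert abs(p(x) - p_about_1(x)) < 1e-12
```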

In general, for any function _ f ( ~x ) _ suppose

f ( ~x ) _ = _ sum{~c_~r ( ~x - ~a )^~r ,~r = 0,&infty.}

then _ f ( ~a ) = ~c_0 _ and

f #~' ( ~x ) _ = _ sum{~r~c_~r ( ~x - ~a )^{~r - 1} ,~r = 1,&infty.}

so _ f #~' ( ~a ) _ = _ ~c_1 , _ and continuing to differentiate :

f^{ (~i)}( ~x ) _ = _ sum{fract{~r#!,( ~r - ~i )#!} ~c_~r ( ~x - ~a )^{~r - ~i} ,~r = ~i,&infty.}

f^{ (~i)}( ~a ) _ = _ ~i #! ~c_~i _ _ => _ _ ~c_~i _ = _ f^{ (~i)}( ~a ) / ~i #!.

This suggests that we can express the function in the form

f ( ~x ) _ = _ sum{fract{ ( ~x - ~a )^~r, ~r#!}f ^{(~r)}( ~a ) ,~r = 0,&infty.}
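The coefficient formula _ ~c_~i = f^{ (~i)}( ~a ) / ~i #! _ can be checked on the polynomial example above; a minimal sketch, assuming the sympy library is available:

```python
# Recover c_i = f^(i)(a) / i! for f(x) = x^3 - x^2 - 2x about a = 1.
import sympy as sp

x = sp.symbols('x')
f = x**3 - x**2 - 2*x
a = 1
coeffs = [sp.diff(f, x, i).subs(x, a) / sp.factorial(i) for i in range(4)]
print(coeffs)   # [-2, -1, 2, 1], i.e. -2 - (x-1) + 2(x-1)^2 + (x-1)^3
```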

This is formalised in Taylor's theorem:

#{Taylor's Theorem}: _ Let _ f#: [ ~a , ~b ] --> &reals. _ be such that _ f, f #~', ... f ^{(~n)} _ are continuous on [ ~a , ~b ] and f ^{(~n + 1)} exists in ( ~a , ~b ) . _ Then for any ~x &in. [ ~a , ~b ]:

f ( ~x ) _ = _ sum{fract{( ~x - ~a )^~r, ( ~r )#!}f ^{( ~r )}( ~a ),~r = 0,~n} + R_~n( ~x )

where _ R_~n( ~x ) _ = _ fract{( ~x - ~a )^{~n + 1}, ( {~n + 1} )#!}f ^{( {~n + 1} )}( ~c ) _ _ _ _ some ~c &in. ( ~a , ~x )

The term R_~n( ~x ) is known as the #~{remainder term} for the expansion truncated at the term of degree ~n . _ Note that by convention 0^0/0#! = 1 , so the first term in the expression is just _ f ( ~a ).
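For illustration, a minimal Python sketch for _ f = sin _ about _ ~a = 0.5 _ ( an arbitrary choice ): every derivative of sin is bounded by 1, so _ | R_~n ( ~x ) | <= | ~x - ~a |^{~n + 1} #/ ( ~n + 1 )#! .

```python
# Taylor polynomial of sin about a, plus the Lagrange remainder bound.
import math

def taylor_sin(x, a, n):
    # derivatives of sin at a cycle through sin, cos, -sin, -cos
    derivs = [math.sin(a), math.cos(a), -math.sin(a), -math.cos(a)]
    return sum(derivs[r % 4] * (x - a)**r / math.factorial(r) for r in range(n + 1))

a, x, n = 0.5, 1.2, 4
error = abs(math.sin(x) - taylor_sin(x, a, n))
bound = abs(x - a)**(n + 1) / math.factorial(n + 1)
print(error <= bound)   # True
```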

Proof:

For brevity write

P_~n ( ~t ) _ = _ sum{fract{( ~t - ~a )^~r, ( ~r )#!}f ^{( ~r )}( ~a ),~r = 0,~n}

so the claim is that _ f ( ~x ) _ = _ P_~n ( ~x ) + R_~n ( ~x ) . _ Fix _ ~x &in. ( ~a , ~b ] _ ( the case ~x = ~a is trivial ) and define the constant:

~k _ _ #:= _ _ fract{ ( {~n + 1} )#!, ( ~x - ~a )^{~n + 1}} ( f ( ~x ) - P_~n( ~x ) )

and define the function _ g _ on [ ~a , ~b ]

g ( ~t ) _ = _ f ( ~t ) _ - _ P_~n ( ~t ) _ - _ ~k fract{ ( ~t - ~a )^{~n + 1}, ( {~n + 1} )#!}

We have
_ _ _ _ g ( ~x ) _ = _ f ( ~x ) - P_~n ( ~x ) - ( f ( ~x ) - P_~n ( ~x ) ) _ = _ 0 , _ _ [ by the definition of ~k ]
_ _ _ _ g ( ~a ) _ = _ f ( ~a ) - P_~n ( ~a ) - ~k . 0 _ = _ 0 , _ _ [ as _ P_~n ( ~a ) = f ( ~a ) ]
so by _ Rolle's Theorem _ ( applied on [ ~a , ~x ] ) _ &exist. ~c_1 &in. ( ~a , ~x ) such that g#~' ( ~c_1 ) = 0.

Now

g#~' ( ~t ) _ _ _ = _ _ _ f #~' ( ~t ) _ - _ sum{fract{( ~t - ~a )^{~r - 1}, ( ~r - 1 )#!}f ^{( ~r )}( ~a ),~r = 1,~n} _ - _ ~k fract{ ( ~t - ~a )^~n, ( ~n )#!}

We have _ g#~' ( ~c_1 ) _ = _ 0 , _ and it can be seen that _ g#~' ( ~a ) _ = _ f #~' ( ~a ) - f #~' ( ~a ) - ~k . 0 _ = _ 0 ,
so by Rolle's Theorem on [ ~a , ~c_1 ] _ &exist. ~c_2 &in. ( ~a , ~c_1 ) _ such that _ g^{(2)}( ~c_2 ) = 0. _ We continue inductively:

g^{(~i)}( ~t ) _ _ _ = _ _ _ f^{(~i)}( ~t ) _ - _ sum{fract{( ~t - ~a )^{~r - ~i}, ( ~r - ~i )#!}f ^{( ~r )}( ~a ),~r = ~i,~n} _ - _ ~k fract{ ( ~t - ~a )^{~n + 1 - ~i}, ( {~n + 1 - ~i} )#!}

&exist. ~c_~i _ such that _ g^{(~i)}( ~c_~i ) _ = _ 0 , _ and _ g^{(~i)}( ~a ) _ = _ 0 , _ so by Rolle's Theorem on [ ~a , ~c_~i ] _ &exist. ~c_{~i + 1} &in. ( ~a , ~c_~i ) _ such that _ g^{(~i + 1)}( ~c_{~i + 1} ) _ = _ 0 , _ etc.

So finally &exist. ~c_{~n + 1} &in. ( ~a , ~x ) _ such that _ g^{(~n + 1)}( ~c_{~n + 1} ) _ = _ f^{(~n + 1)}( ~c_{~n + 1} ) - ~k _ = _ 0 _ => _ f^{(~n + 1)}( ~c_{~n + 1} ) _ = _ ~k

I.e.

f ( ~x ) _ _ _ = _ _ _ P_~n ( ~x ) _ + _ fract{( ~x - ~a )^{~n + 1}, ( {~n + 1} )#!}f ^{( {~n + 1} )}( ~c_{~n + 1} ) , _ _ with _ ~c_{~n + 1} &in. ( ~a , ~x ) , _ as required.

Taylor Series Expansion

If the remainder term, _ R_~n ( ~x ) -> 0 _ as _ ~n -> &infty. _ &forall. ~x &in. ( ~a , ~b ) _ then the function can be expressed as an infinite power series in the interval:

f ( ~x ) _ = _ sum{fract{( ~x - ~a )^~r, ( ~r )#!}f ^{( ~r )}( ~a ),~r = 0,&infty.}

This is known as the #{Taylor series expansion} of _ f ( ~x ) _ about ~a.

Maclaurin's Series Expansion

This is a special case of the Taylor expansion when ~a = 0.

f ( ~x ) _ = _ sum{fract{( ~x )^~r, ( ~r )#!}f ^{( ~r )}( 0 ),~r = 0,&infty.}

#{Example}: _ f ( ~x ) _ = _ exp ~x .
We know that exp (0) = 1 , _ and _ f #~' ( ~x ) = exp ~x , _ f ^{(~i)}( ~x ) = exp ~x . _ &therefore. _ f ^{(~i)}(0) = 1 _ &forall. ~i , _ so

exp ~x _ = _ sum{fract{~x^~r,~r#!},~r = 0,~n} + fract{exp &theta.~x,( ~n + 1 )#!}~x^{~n + 1} , _ _ some _ 0 < &theta. < 1

The remainder term _ ( exp &theta.~x #/ ( ~n + 1 )#! ) ~x^{~n + 1} _ -> _ 0 _ as _ ~n _ -> _ &infty. _ for each fixed ~x , _ since ( ~n + 1 )#! eventually grows faster than | ~x |^{~n + 1} . _ So Taylor's theorem gives a convergent infinite series for the exponential function.
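A minimal Python sketch of this convergence, comparing partial sums with the library exponential:

```python
# Partial sums of the Maclaurin series for exp x versus math.exp(x).
import math

def exp_partial(x, n):
    return sum(x**r / math.factorial(r) for r in range(n + 1))

x = 2.5
for n in (5, 10, 15, 20):
    print(n, abs(exp_partial(x, n) - math.exp(x)))
# the error shrinks rapidly, since (n + 1)! eventually dominates |x|^(n + 1)
```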

Differentiating Infinite Series

  1. If _ &sum._0^{&infty.} ~c_~n ~x^~n _ is absolutely convergent for | ~x | < ~K , _ then _ &sum._1^{&infty.} ~n ~c_~n ~x^{~n - 1} _ is also absolutely convergent for | ~x | < ~K
  2. If _ f ( ~x ) _ = _ &sum._0^{&infty.} ~c_~n ~x^~n _ for | ~x | < ~R , _ then _ f #~' ( ~x ) _ = _ &sum._1^{&infty.} ~n ~c_~n ~x^{~n - 1} , _ for | ~x | < ~R

Proof:
To be provided
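Result 2 can at least be illustrated numerically on the geometric series _ &sum._0^{&infty.} ~x^~n _ = _ 1 / ( 1 - ~x ) , _ | ~x | < 1 ; _ a minimal Python sketch:

```python
# Term-by-term differentiation of the geometric series: sum n x^(n-1) -> 1/(1-x)^2.
x, N = 0.4, 100
deriv_series = sum(n * x**(n - 1) for n in range(1, N + 1))
print(abs(deriv_series - 1 / (1 - x)**2))   # very small for |x| < 1
```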

#{Example}: _ Let

f ( ~x ) _ = _ sum{( -1 )^{~n - 1} fract{~x^~n,~n},~n = 1,&infty.} _ = _ ~x - fract{~x^2,2} + fract{~x^3,3} ... _ _ | ~x | < 1

f #~' ( ~x ) _ = _ sum{( -1 )^{~n - 1} ~x^{~n - 1},~n = 1,&infty.} _ = _ sum{( -~x )^~n,~n = 0,&infty.} _ = _ fract{1, ( 1 + ~x )} _ _ | ~x | < 1

( the last equality is the sum of a geometric series ).

But we know that D ( log ( 1 + ~x ) ) _ = _ 1 / ( 1 + ~x ) _ &forall. ~x > -1
so _ D ( log ( 1 + ~x ) - f ( ~x ) ) _ = _ 0 _ for | ~x | < 1 , _ and therefore _ log ( 1 + ~x ) - f ( ~x ) _ = _ constant _ = _ 0 , _ (substituting ~x = 0), _ so

log ( 1 + ~x ) _ = _ sum{( -1 )^{~n - 1} fract{~x^~n,~n},~n = 1,&infty.} , _ _ | ~x | < 1
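A minimal Python check of the partial sums against the library logarithm:

```python
# Partial sums of sum (-1)^(n-1) x^n / n versus math.log(1 + x), |x| < 1.
import math

def log1p_series(x, terms):
    return sum((-1)**(n - 1) * x**n / n for n in range(1, terms + 1))

for x in (-0.5, 0.3, 0.9):
    print(x, abs(log1p_series(x, 200) - math.log(1 + x)))
# differences are tiny; convergence slows as |x| approaches 1
```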

Binomial Theorem

For any real number _ ~m

( 1 + ~x )^~m _ = _ sum{comb{~m,~n} ~x^~n,~n = 0, &infty.} _ _ _ _ | ~x | < 1

where

comb{~m,~n} _ = _ fract{~m ( ~m - 1 ) ... ( ~m - ~n + 1 ), ~n#!} _ _ _ _ ~n >= 1

_ = _ 1 _ _ _ _ ~n = 0

( for a non-negative integer ~m >= ~n this is the familiar _ fract{~m#! , ~n#! ( ~m - ~n )#!} , _ using the convention 0#! = 1 )
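A minimal Python sketch of the generalised coefficient and the series, with the arbitrary choices _ ~m = -1/2 , _ ~x = 0.4 :

```python
# Generalised binomial coefficient m(m-1)...(m-n+1)/n! and a check of the series.
def binom(m, n):
    c = 1.0
    for k in range(n):        # empty loop gives c = 1 when n = 0
        c *= (m - k) / (k + 1)
    return c

m, x = -0.5, 0.4
partial = sum(binom(m, n) * x**n for n in range(40))
print(abs(partial - (1 + x)**m))   # very small for |x| < 1
```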

Proof:
To be provided

#{Example}: _ cos#: [ 0 , &pi. ] --> &reals. , _ with _ cos [ 0 , &pi. ] _ = _ [ -1 , 1 ] _ and _ D cos ~x _ = _ -sin ~x _ < _ 0 _ for _ ~x &in. ( 0 , &pi. ) .
By the inverse function theorem &exist. _ cos^{-1}#: [ -1 , 1 ] --> &reals. _ given by _ ~x = cos^{-1}~y _ <=> _ ~y = cos ~x , _ ~y &in. [ -1 , 1 ] , _ ~x &in. [ 0 , &pi. ] .
D cos^{-1}~y _ = _ 1 / D cos ~x _ = _ 1/ -sin ~x = 1 / - &sqrt.( 1 - ~y^2 ).
Similarly _ sin^{-1}#: [ -1 , 1 ] --> &reals. _ given by _ ~x = sin^{-1}~y _ <=> _ ~y = sin ~x , _ ~y &in. [ -1 , 1 ] , _ ~x &in. [ -&pi./2 , &pi./2 ].
D sin^{-1}~y _ = _ 1 / D sin ~x _ = _ 1/ cos ~x = 1 / &sqrt.( 1 - ~y^2 ).
By binomial theorem

fract{1,&sqrt.( 1 - ~y^2 )} _ = _ ( 1 - ~y^2 )^{- 1/2 } _ = _ sum{comb{- 1/2 , ~n} ( {-~y^2} )^~n,~n = 0,&infty.} , _ _ _ _ | ~y | < 1

_ = _ 1 + rndb{fract{1,2}} ~y^2 + rndb{fract{fract{1,2}.fract{3,2},1 . 2}} ~y^4 + rndb{fract{fract{1,2}.fract{3,2}.fract{5,2},1 . 2 . 3}} ~y^6 ...

Let

f ( ~y ) _ = _ ~y + rndb{fract{1,2}} fract{~y^3,3} + rndb{fract{1 . 3,2 . 4}} fract{~y^5,5} + ...


The series for f ( ~y ) is absolutely convergent for | ~y | < 1 by the ratio test, so it can be differentiated term by term:
f #~' ( ~y ) _ = _ 1 / &sqrt. ( 1 - ~y^2 ) _ = _ D sin^{-1} ~y , _ _ for | ~y | < 1

So _ sin^{-1} ~y - f ( ~y ) _ = _ constant _ = _ 0 _ ( by substituting _ ~y = 0 ), _ therefore:

sin^{-1} ~y _ = _ ~y + rndb{fract{1,2}} fract{~y^3,3} + rndb{fract{1 . 3,2 . 4}} fract{~y^5,5} + ... , _ _ | ~y | < 1
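A minimal Python check of the partial sums against the library inverse sine:

```python
# Partial sums of sin^(-1) y = y + (1/2) y^3/3 + (1.3)/(2.4) y^5/5 + ...
import math

def asin_series(y, terms):
    total, term = y, y     # term holds (1.3...(2n-1))/(2.4...(2n)) * y^(2n+1)
    for n in range(1, terms):
        term *= (2 * n - 1) / (2 * n) * y * y
        total += term / (2 * n + 1)
    return total

y = 0.6
print(abs(asin_series(y, 60) - math.asin(y)))   # very small
```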

Cauchy Mean Value Theorem

If _ f _ and _ g _ are continuous functions on [ ~a , ~b ], differentiable on ( ~a , ~b ), and g#~' ( ~x ) != 0 _ &forall. ~x &in. ( ~a , ~b ), _ then &exist. _ ~c &in. ( ~a , ~b ) _ such that

fract{f ( ~b ) - f ( ~a ),g ( ~b ) - g ( ~a )} _ = _ fract{f #~' ( ~c ),g#~' ( ~c )}

Proof:
Since _ g#~' ( ~x ) != 0 _ anywhere on ( ~a , ~b ) , _ Rolle's Theorem gives _ g ( ~a ) != g ( ~b ) _ ( otherwise g#~' would vanish somewhere in ( ~a , ~b ) ).

Let _ ~k = ( f ( ~b ) - f ( ~a ) ) &fslash. ( g ( ~b ) - g ( ~a ) ) , _ and define F ( ~x ) = f ( ~x ) - ~k g ( ~x ) . _ Then _ F ( ~a ) = F ( ~b ) . _ So by Rolle's Theorem &exist. ~c &in. ( ~a , ~b ) _ such that _ F #~' ( ~c ) = 0 _ => _ f #~' ( ~c ) = ~k g#~' ( ~c ) .
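A concrete illustration: taking _ f = sin , _ g = cos _ on _ [ 0 , &pi./2 ] , _ the quotient _ ( f ( ~b ) - f ( ~a ) ) #/ ( g ( ~b ) - g ( ~a ) ) _ = _ -1 , _ and _ ~c = &pi./4 _ satisfies _ f #~' ( ~c ) #/ g#~' ( ~c ) _ = _ -1 . _ A minimal Python check:

```python
# Cauchy MVT for f = sin, g = cos on [0, pi/2]: c = pi/4 works.
import math

a, b = 0.0, math.pi / 2
lhs = (math.sin(b) - math.sin(a)) / (math.cos(b) - math.cos(a))   # = -1
c = math.pi / 4
rhs = math.cos(c) / (-math.sin(c))                                # f'(c)/g'(c) = -cot(c)
print(abs(lhs - rhs) < 1e-12)   # True
```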

L'Hopital's Rule

If _ f ( ~x ) -> 0 , _ and _ g ( ~x ) -> 0 , as ~x -> ~a , _ and _ f #~' ( ~x ) ./ g#~' ( ~x ) -> ~l , _ as ~x -> ~a , _ then _ f ( ~x ) ./ g ( ~x ) _ -> _ ~l , _ as ~x -> ~a .

Proof:
To be provided
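The rule can nevertheless be illustrated numerically, e.g. on _ ( 1 - cos ~x ) #/ ~x^2 _ as _ ~x -> 0 , _ where _ f #~' ( ~x ) #/ g#~' ( ~x ) _ = _ sin ~x #/ 2~x _ -> _ 1/2 ; _ a minimal Python sketch:

```python
# (1 - cos x)/x^2 and its L'Hopital quotient sin(x)/(2x) both approach 0.5 as x -> 0.
import math

for x in (0.1, 0.01, 0.001):
    print(x, (1 - math.cos(x)) / x**2, math.sin(x) / (2 * x))
```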