How do you find the interval of convergence of a Taylor series?
The general form of the Taylor series of a function f(x) centered at x = a is:
f(x) = f(a) + f'(a)(x − a) + (f''(a)/2!)(x − a)² + (f'''(a)/3!)(x − a)³ + ...
so the nth term is (f⁽ⁿ⁾(a)/n!)(x − a)ⁿ.
Consider, for example, the series Σ xⁿ (n = 0, 1, 2, ...), which is the Taylor series of 1/(1 − x) centered at 0.
The terms of a Taylor series involve factorials and powers, so in cases like this the ratio test is the most convenient way to find the interval of convergence.
By the ratio test, let P = lim (n→∞) |a(n+1)/a(n)|, where a(n) is the nth term of the series.
If P < 1, then the series converges (absolutely).
If P > 1, then the series diverges.
If P = 1, then the test is inconclusive.
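The ratio-test limit can be sanity-checked numerically. The sketch below (a Python illustration; the helper name ratio_test_estimate is ours) approximates P by evaluating |a(n+1)/a(n)| at a large n for the terms a(n) = xⁿ, for which the ratio is exactly |x|:

```python
def ratio_test_estimate(term, n=200):
    """Estimate P = lim |a(n+1) / a(n)| by evaluating the ratio at a large n."""
    return abs(term(n + 1) / term(n))

# Terms of the geometric series sum x^n for a few sample values of x.
for x in (0.5, -0.9, 1.5):
    p = ratio_test_estimate(lambda n, x=x: x ** n)
    verdict = "converges" if p < 1 else ("diverges" if p > 1 else "inconclusive")
    print(f"x = {x}: P = {p:.3f} -> {verdict}")
```

Evaluating at a single large n is only a heuristic stand-in for the limit, but for xⁿ the ratio is the same for every n, so it reproduces P = |x| exactly.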
Here a(n) = xⁿ, so P = lim (n→∞) |x^(n+1) / xⁿ| = |x|.
The ratio test now tells us that the series converges as long as |x| < 1, and that it diverges for |x| > 1.
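This behavior is visible in the partial sums themselves. A minimal Python sketch (the helper name partial_sums_geometric is ours): inside the interval the partial sums of Σ xⁿ settle toward 1/(1 − x), while outside it they grow without bound:

```python
def partial_sums_geometric(x, terms=50):
    """Partial sums S_N = sum of x^n for n = 0..N of the geometric series."""
    s, power = 0.0, 1.0
    sums = []
    for _ in range(terms):
        s += power
        sums.append(s)
        power *= x
    return sums

inside = partial_sums_geometric(0.5)   # |x| < 1: sums settle near 1/(1 - x) = 2
outside = partial_sums_geometric(2.0)  # |x| > 1: sums grow without bound
print(inside[-1], outside[-1])
```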
The largest interval (it is always an interval) on which a Taylor series converges is called the interval of convergence of the Taylor series. The interval of convergence is always centered at the center of the series.
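To illustrate the centering, take a series not centered at 0. The example Σ ((x − 3)/2)ⁿ is our own (not from the question above): its term ratio is |x − 3|/2, so the ratio test gives |x − 3| < 2, i.e. the interval (1, 5), centered at 3 with radius 2. A Python sketch:

```python
def ratio_limit(x, center=3.0, scale=2.0, n=200):
    """Term ratio for the sample series sum ((x - center)/scale)^n.

    For this series |a(n+1)/a(n)| = |x - center| / scale for every n,
    so the ratio test gives the interval (center - scale, center + scale).
    """
    term = lambda k: ((x - center) / scale) ** k
    return abs(term(n + 1) / term(n))

# The series converges on (1, 5), which is centered at 3.
for x in (2.0, 4.0, 0.0, 6.0):
    p = ratio_limit(x)
    print(f"x = {x}: P = {p:.3f} -> {'converges' if p < 1 else 'diverges'}")
```

Note that x = 2 and x = 4 (inside the interval) give P < 1, while x = 0 and x = 6 (outside it) give P > 1, symmetrically about the center x = 3.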