The interval of convergence of a power series is the set of all values of the variable, usually x, for which the series converges. To determine this interval, apply the ratio test and use its result to compute the radius of convergence.
For a given power series, finding the interval of convergence starts with the ratio test, which involves calculating a value L. Take the general term of the series, replace every n with n + 1, and multiply the result by the reciprocal of the original term; L is the limit of the absolute value of this quotient as n approaches infinity, that is, L = lim |a(n+1)/a(n)|. The power series converges if L is less than one and diverges if L is greater than one. Solving the inequality L < 1 for x gives the radius of convergence R, the distance from the center of the series within which the series is guaranteed to converge.
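As a sketch of this step, the ratio-test limit can be computed symbolically with sympy. The series used here, the sum of x^n / n for n from 1 onward, is a hypothetical example chosen for illustration; x is declared positive so the absolute value can be dropped (the general result replaces x with |x|).

```python
from sympy import symbols, limit, oo, simplify

n = symbols('n', positive=True, integer=True)
x = symbols('x', positive=True)  # treat x > 0 for simplicity; in general L = |x|

# General term of the example series sum_{n>=1} x**n / n
a_n = x**n / n

# Ratio test: replace n with n + 1, divide by the original term
ratio = simplify(a_n.subs(n, n + 1) / a_n)  # simplifies to n*x/(n + 1)

# L is the limit of the ratio as n -> infinity
L = limit(ratio, n, oo)
print(L)  # x, i.e. L = |x|; the test gives |x| < 1, so the radius is R = 1
```

Because L = |x|, the condition L < 1 becomes |x| < 1, giving a radius of convergence of 1 centered at x = 0.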
Finally, to find the interval of convergence, rewrite the condition L < 1 as an inequality in x; its two endpoints are the values the ratio test leaves undecided, since L equals one there. Substitute each endpoint into the original power series and test whether the series converges or diverges at that point, then include or exclude the endpoint accordingly. The resulting interval of convergence is reported as an inequality (or interval) describing all x for which the power series converges.
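The endpoint check can also be sketched with sympy. Continuing the hypothetical example series sum of x^n / n, whose radius is 1, the endpoints x = 1 and x = -1 must be tested individually:

```python
from sympy import Sum, symbols, oo

n = symbols('n', positive=True, integer=True)

# Endpoint x = 1: the harmonic series sum 1/n, which diverges
right = Sum(1 / n, (n, 1, oo)).is_convergent()

# Endpoint x = -1: the alternating harmonic series sum (-1)**n / n, which converges
left = Sum((-1)**n / n, (n, 1, oo)).is_convergent()

print(right, left)
```

Since the series diverges at x = 1 and converges at x = -1, the interval of convergence for this example is -1 <= x < 1, written [-1, 1).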