Convergence Tests


Recall that the sum of an infinite series \( \sum\limits_{n=1}^\infty a_n \) is defined to be the limit \( \lim\limits_{k\to\infty} s_k \), where \( s_k = \sum\limits_{n=1}^k a_n \). If the limit exists, the series converges; otherwise it diverges.
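To make the definition concrete, here is a quick numerical sketch (in Python; all names are ours, not from the text) computing partial sums of the geometric series \( \sum_{n=1}^\infty (1/2)^n \), whose sum is \( 1 \):

```python
# Sketch: partial sums s_k of the geometric series sum_{n=1}^inf (1/2)^n,
# which converges to 1.
def partial_sum(terms, k):
    """k-th partial sum s_k = a_1 + ... + a_k."""
    return sum(terms(n) for n in range(1, k + 1))

geometric = lambda n: (1 / 2) ** n

s_10 = partial_sum(geometric, 10)   # 1 - 2^(-10)
s_30 = partial_sum(geometric, 30)   # already extremely close to 1
```

The partial sums visibly stabilize near \( 1 \), which is exactly the limit \( \lim\limits_{k\to\infty} s_k \) in the definition.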

Many important series do not admit an easy closed-form formula for \( s_k \). In this situation, one can often determine whether a given series converges or diverges without explicitly calculating \( \lim\limits_{k\to\infty} s_k \), via one of the following tests for convergence.

The divergence test states that if \( \lim\limits_{n\to\infty} a_n \) does not exist, or exists but is nonzero, then \( \sum\limits_{n=1}^\infty a_n \) diverges. It does not apply to the harmonic series \( \sum\limits_{n=1}^\infty \frac1{n} \), because \( \lim\limits_{n\to\infty} \frac1{n} = 0 \). In this case, the divergence test gives no information.

It is a common misconception that the "converse" of the divergence test holds, i.e. if the terms go to \( 0 \) then the sum converges. In fact, this is false, and the harmonic series is a counterexample: it diverges (as will be shown in a later section).
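A short numerical sketch (Python; the helper names are ours) illustrating the point: the terms of the harmonic series go to \( 0 \), yet the partial sums keep growing, roughly like \( \ln k \):

```python
# Sketch: harmonic series -- terms tend to 0, but partial sums grow
# without bound, tracking ln k + gamma (Euler-Mascheroni, ~0.5772).
import math

def harmonic_partial(k):
    return sum(1 / n for n in range(1, k + 1))

s = {k: harmonic_partial(k) for k in (10, 100, 1000, 10000)}
gap = s[10000] - math.log(10000)   # approaches gamma as k grows
```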

Ratio Test

The intuition for the next two tests is the geometric series \( \sum ar^n\), which converges if and only if \( |r|<1 \). The precise statement of the test requires a concept that is used quite often in the study of infinite series.

A series \( \sum\limits_{n=1}^\infty a_n \) is absolutely convergent if \( \sum\limits_{n=1}^\infty |a_n|\) converges. If a series is convergent but not absolutely convergent, it is called conditionally convergent. \(_\square\)

So if \( 4|x|<1 \), the series converges absolutely, and if \( 4|x|>1 \) the series diverges. For \( x = \pm 1/4 \), the question is more delicate. It turns out that the series converges for \( x=-1/4 \) but not \( x=1/4 \). Hence the answer is \( x \in [-1/4,1/4) \). \(_\square\)

The ratio test is quite useful for determining the interval of convergence of power series, along the lines of the above example. Note that at the endpoints of the interval, the ratio test is inconclusive.
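As a numerical sanity check (a Python sketch with illustrative names, not from the text), one can estimate the ratio-test limit for the sample series \( \sum n/3^n \); the ratios approach \( 1/3 < 1 \), consistent with convergence:

```python
# Sketch: estimating L = lim |a_{n+1}/a_n| for a_n = n / 3^n.
# Here a_{n+1}/a_n = (n+1)/(3n) -> 1/3 < 1, so the series converges.
def a(n):
    return n / 3**n

ratios = [a(n + 1) / a(n) for n in range(1, 60)]
approx_L = ratios[-1]   # close to 1/3
```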

Root Test

The root test works whenever the ratio test does, but the limit involved in the root test is often more difficult to compute in practice.

Here, \( \limsup \) denotes the limit of the supremum of a sequence, \( \lim\limits_{n\to\infty} \sup\limits_{m\ge n} \sqrt[m]{|a_m|} \) in this case. If we allow \( r \) to be \( \infty\) (which is taken to be \( >1 \) for purposes of the test), the \( \limsup \) always exists (while the limit might not); if the limit exists then it equals the \( \limsup \). In practice, using the root test usually involves computing the limit.

A fact that is often useful in applications of the root test is that \( \lim\limits_{n\to\infty} n^{1/n} = 1. \) (This follows because \( \ln\big(n^{1/n}\big) = \frac{\ln n}{n}, \) which tends to \( 0 \) by L'Hopital's rule.)
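A quick numerical check of this fact (a Python sketch, not part of the text):

```python
# Sketch: n^(1/n) -> 1, a limit that underlies many root-test computations.
# The values decrease monotonically toward 1 for n >= 3.
values = [n ** (1 / n) for n in (10, 100, 10_000, 1_000_000)]
```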

Integral Test

Often the terms \( a_n \) of a series can be extended to a nice function \( f(x) \) with \( f(n) = a_n \), and the integral of \( f(x) \) is "close" to the sum.

Integral test:

If \( f(x) \) is a nonnegative, continuous, decreasing function on \( [1,\infty) \), then the series \( \sum\limits_{n=1}^\infty f(n) \) converges if and only if the improper integral \( \int_1^\infty f(x) \, dx \) converges.

Note that it is important that \( f(x) \) is decreasing and continuous, as otherwise it is conceivable that the values of \( f \) at integers might be unrelated to its values everywhere else (e.g. imagine an \( f\) that is 0 except very near integers, where it spikes to \( 1 \); such an \( f \) might have a convergent integral, but the series will diverge).

The \( p \)-series \( \sum\limits_{n=1}^\infty \frac1{n^p} \) are defined for any real number \( p \). For which \( p \) does the associated \( p\)-series converge?

The case \( p = 1 \) is the harmonic series, which diverges because the associated integral
\[
\int_1^\infty \frac1{x} \, dx = \ln x\biggr\rvert_1^\infty
\]
diverges. For \( 0 < p \ne 1 \), the integral test applies as well, and
\[
\int_1^\infty \frac1{x^p} \, dx = \frac{x^{1-p}}{1-p}\biggr\rvert_1^\infty,
\]
which converges if and only if \( p > 1 \). (For \( p \le 0 \), the terms \( \frac1{n^p} \) do not tend to \( 0 \), so the series diverges by the divergence test.) So the answer is that the \( p \)-series converges if and only if \( p>1 \). \(_\square\)
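The dichotomy can be seen numerically (a Python sketch; names are ours): partial sums for \( p=2 \) stabilize near \( \pi^2/6 \), while for \( p=1 \) they keep growing like \( \ln k \):

```python
# Sketch: p-series partial sums for p = 2 (converges, to pi^2/6)
# versus p = 1 (the harmonic series, which diverges).
import math

def p_series_partial(p, k):
    return sum(1 / n**p for n in range(1, k + 1))

s2 = p_series_partial(2, 100_000)        # near pi^2/6 ~ 1.6449
s1_small = p_series_partial(1, 1_000)
s1_large = p_series_partial(1, 100_000)  # keeps growing, ~ ln k
```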

Comparison Test

This test can determine that a series converges by comparing it to a (simpler) convergent series.

Since \( \frac1{n^2+n+1} < \frac1{n^2} \), and \( \sum \frac1{n^2} \) converges by the integral test (it is a \( p\)-series with \(p>1 \)), the series converges by the comparison test. \(_\square\)
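Numerically (a Python sketch, not from the text), every partial sum of the smaller series indeed stays below the corresponding partial sum of \( \sum \frac1{n^2} \), and hence below \( \pi^2/6 \):

```python
# Sketch: termwise comparison 1/(n^2+n+1) < 1/n^2 forces the partial
# sums of the smaller series below those of sum 1/n^2, hence below pi^2/6.
import math

k = 50_000
s_small = sum(1 / (n * n + n + 1) for n in range(1, k + 1))
s_big = sum(1 / (n * n) for n in range(1, k + 1))
```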

The comparison test can also determine that a series diverges:

Does \( \sum\limits_{n=1}^\infty \frac1{2n-1} \) converge or diverge?

Since \( \frac1{2n} \le \frac1{2n-1} \), if the series converges then so does \( \sum\limits_{n=1}^\infty \frac1{2n} = \frac12 \sum\limits_{n=1}^\infty\frac1n \). But the harmonic series diverges, so the original series must diverge as well. \(_\square\)

The comparison test is useful, but intuitively it feels limited. For instance, \( \frac1{n^2-n+1} \) is not \( \le \frac1{n^2} \), and yet the series \( \sum \frac1{n^2-n+1}\) ought to converge because the terms "behave like" \( \frac1{n^2} \) for large \( n \). A refinement of the comparison test, described in the next section, will handle series like this.

Limit Comparison Test

Instead of comparing to a convergent series using an inequality, it is more flexible to compare to a convergent series using behavior of the terms in the limit.
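For the motivating example above, a quick numerical sketch (Python; names are ours) shows the comparison ratio tending to \( 1 \), which is exactly the behavior the limit comparison test exploits:

```python
# Sketch: a_n / b_n for a_n = 1/(n^2 - n + 1) and b_n = 1/n^2 tends
# to 1, so the limit comparison test says both series behave the same way.
def ratio(n):
    return (1 / (n * n - n + 1)) / (1 / (n * n))

r_small = ratio(100)
r_large = ratio(10**6)   # very close to 1
```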

Alternating Series Test

Alternating series arise naturally in many common situations, including evaluations of Taylor series at negative arguments. They furnish simple examples of conditionally convergent series as well. There is a special test for alternating series that detects conditional convergence:

This follows directly from the alternating series test: the terms \( \frac{n}{n^2+25} \) tend to \( 0 \), so it suffices to show that they are eventually decreasing. The easiest way to do this is to consider the function \( f(x) = \frac{x}{x^2+25} \) and take its derivative:
\[
f'(x) = \frac{(x^2+25)-x(2x)}{(x^2+25)^2} = \frac{25-x^2}{(x^2+25)^2}.
\]
So \( f'(x) < 0 \) for \( x>5 \), which implies the sequence \( f(n) \) is decreasing for \( n > 5 \). \(_\square\)
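A numerical sketch of this example (Python; names are ours): once the terms are decreasing, consecutive partial sums bracket the limit, so a far-out partial sum lands between two earlier consecutive ones:

```python
# Sketch: partial sums of sum (-1)^n n/(n^2 + 25).  The magnitudes
# n/(n^2+25) decrease for n > 5, so consecutive partial sums beyond
# that point bracket the limit of the series.
def term(n):
    return (-1) ** n * n / (n * n + 25)

def s(k):
    return sum(term(n) for n in range(1, k + 1))

lo, hi = sorted([s(100), s(101)])
far = s(1000)   # much closer to the limit, so it lies between lo and hi
```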

One interesting fact about the alternating series test is that it gives an effective error bound as well:

Let \( \sum (-1)^n a_n \) be a series that satisfies the conditions of the alternating series test, and suppose that \( a_n \) is decreasing for \( n \ge 1 \) (not just eventually decreasing). If the sum of the series is \( L \) and the \( k^\text{th}\) partial sum is denoted \( s_k \), then
\[
|L-s_k| \le a_{k+1}.
\]

Give an upper bound for the error in the estimate \( \pi \approx 4-\frac43+\frac45-\cdots -\frac4{399} \).

Assuming that \( \pi \) is the sum of the series \( \sum\limits_{n=0}^\infty (-1)^n \frac4{2n+1} \), the alternating series test says that this error is at most \( 4/401\), which is roughly \( 0.01\). In fact, the sum is \( 3.13659\ldots\), so the error is almost exactly half that, or \( 0.005 \). \(_\square\)
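The numbers quoted above are easy to verify with a short Python sketch:

```python
# Sketch: the alternating-series error bound for
# pi = 4 - 4/3 + 4/5 - ... - 4/399  (200 terms, last term -4/399).
import math

s = sum((-1) ** n * 4 / (2 * n + 1) for n in range(200))
error = abs(math.pi - s)
bound = 4 / 401      # magnitude of the first omitted term
```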

(To show that \( \pi \) is in fact the sum of the series, one possibility is to derive the Taylor series \( \arctan x = x-\frac{x^3}3+\frac{x^5}5-\cdots\) which is valid on \( (-1,1)\), and then to use a theorem of Abel which shows that the identity can be extended to the endpoint \( 1 \) of the interval.)

The alternating series test is actually a special case of the Dirichlet test for convergence, presented in the next section.

The alternating series test is the special case where \( b_n = (-1)^n \) (and \( M = 1 \)).

Let \(a_n\) be a decreasing sequence of real numbers such that \( \lim\limits_{n\to\infty} a_n = 0 \). Show that
\[
\sum_{n=1}^\infty a_n \sin nx
\]
converges for all real numbers \( x \) which are not integer multiples of \( 2\pi\). (This is useful in the theory of Fourier series.)

By a standard sum-to-product computation (multiply by \( 2\sin\frac12 x \) and telescope), the partial sums satisfy
\[
\sum_{n=1}^N \sin nx = \frac{\sin\frac{Nx}2 \, \sin\frac{(N+1)x}2}{\sin\frac12 x},
\]
and the absolute value of the right side is \( \le \frac1{|\sin \frac12 x|} \), which is a constant real number as long as the denominator is not \( 0 \). (This is why we had to assume that \( x \) was not an integer multiple of \( 2\pi\).) So the partial sums of \( \sin nx \) are bounded, and the Dirichlet test applies. \(_\square\)
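The boundedness of these partial sums is easy to check numerically (a Python sketch, with \( x = 1 \) as an arbitrary sample value):

```python
# Sketch: partial sums of sin(nx) stay bounded by 1/|sin(x/2)|,
# the bound needed to apply the Dirichlet test.
import math

x = 1.0
bound = 1 / abs(math.sin(x / 2))

worst = 0.0   # largest |partial sum| seen so far
total = 0.0
for n in range(1, 5001):
    total += math.sin(n * x)
    worst = max(worst, abs(total))
```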

Abel Test

Abel's test is similar to Dirichlet's test, and is most useful for conditionally convergent series.

Note that if \( a_n \) is positive (or \( \sum a_n \) is absolutely convergent), this follows immediately from the comparison test (without assumption (2)). So the interesting series to which this applies are conditionally convergent.