
Section 2.5 Series

Note: 2 lectures

A fundamental object in mathematics is that of a series. In fact, when the foundations of analysis were being developed, the motivation was to understand series. Understanding series is important in applications of analysis. For example, solutions to differential equations are often given as series, and differential equations are the basis for understanding almost all of modern science.

Subsection 2.5.1 Definition

Definition 2.5.1.

Given a sequence \(\{ x_n \}\text{,}\) we write the formal object

\begin{equation*} \sum_{n=1}^\infty x_n \qquad \text{or sometimes just} \qquad \sum x_n \end{equation*}

and call it a series. A series converges if the sequence \(\{ s_k \}\) defined by

\begin{equation*} s_k := \sum_{n=1}^k x_n = x_1 + x_2 + \cdots + x_k , \end{equation*}

converges. The numbers \(s_k\) are called partial sums. If \(x := \lim\, s_k\text{,}\) we write

\begin{equation*} \sum_{n=1}^\infty x_n = x . \end{equation*}

In this case, we cheat a little and treat \(\sum_{n=1}^\infty x_n\) as a number.

If the sequence \(\{ s_k \}\) diverges, we say the series is divergent. In this case, \(\sum x_n\) is simply a formal object and not a number.

In other words, for a convergent series, we have

\begin{equation*} \sum_{n=1}^\infty x_n = \lim_{k\to\infty} \sum_{n=1}^k x_n . \end{equation*}

We only have this equality if the limit on the right actually exists. If the series does not converge, the right-hand side does not make sense (the limit does not exist). Therefore, be careful as \(\sum x_n\) means two different things (a notation for the series itself or the limit of the partial sums), and you must use context to distinguish.
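To make the definition concrete, here is a small numerical sketch in Python (the helper name `partial_sums` is ours, purely for illustration and not from the text) that computes the partial sums \(s_k\) of \(\sum 1/2^n\) and watches them approach 1:

```python
# Illustrative sketch: compute partial sums s_k = x_1 + ... + x_k.
# The helper name `partial_sums` is not from the text.

def partial_sums(x, k_max):
    """Return the list [s_1, s_2, ..., s_{k_max}]."""
    sums, s = [], 0.0
    for n in range(1, k_max + 1):
        s += x(n)
        sums.append(s)
    return sums

# Partial sums of sum_{n=1}^infty 1/2^n approach the limit 1.
s = partial_sums(lambda n: 1 / 2**n, 30)
print(s[0], s[-1])  # s_1 = 0.5; s_30 is within 2^{-30} of 1
```

Of course, no finite computation proves convergence; the sketch only illustrates the definition of the limit of partial sums.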

Remark 2.5.2.

It is sometimes convenient to start the series at an index different from 1. For instance, we can write

\begin{equation*} \sum_{n=0}^\infty r^n = \sum_{n=1}^\infty r^{n-1} . \end{equation*}

The left-hand side is more convenient to write.

Remark 2.5.3.

It is common to write the series \(\sum x_n\) as

\begin{equation*} x_1 + x_2 + x_3 + \cdots \end{equation*}

with the understanding that the ellipsis indicates a series and not a simple sum. We do not use this notation as it is the sort of informal notation that leads to mistakes in proofs.

Example 2.5.4.

The series

\begin{equation*} \sum_{n=1}^\infty \frac{1}{2^n} \end{equation*}

converges and the limit is 1. That is,

\begin{equation*} \sum_{n=1}^\infty \frac{1}{2^n} = \lim_{k\to\infty} \sum_{n=1}^k \frac{1}{2^n} = 1 . \end{equation*}

Proof: First we prove the following equality

\begin{equation*} \left( \sum_{n=1}^k \frac{1}{2^n} \right) + \frac{1}{2^k} = 1 . \end{equation*}

The equality is immediate when \(k=1\text{.}\) The proof for general \(k\) follows by induction, which we leave to the reader. See Figure 2.7 for an illustration.

Figure 2.7. The equality \(\left( \sum_{n=1}^k \frac{1}{2^n} \right) + \frac{1}{2^k} = 1\) illustrated for \(k=3\text{.}\)

Let \(s_k\) be the partial sum. We write

\begin{equation*} \abs{ 1 - s_k } = \abs{ 1 - \sum_{n=1}^k \frac{1}{2^n} } = \abs{\frac{1}{2^k}} = \frac{1}{2^k} . \end{equation*}

The sequence \(\bigl\{ \frac{1}{2^k} \bigr\}\text{,}\) and therefore \(\bigl\{ \abs{1-s_k} \bigr\}\text{,}\) converges to zero. So, \(\{ s_k \}\) converges to 1.

Proposition 2.5.5. (Geometric series)

Suppose \(-1 < r < 1\text{.}\) Then the geometric series \(\sum_{n=0}^\infty r^n\) converges, and

\begin{equation*} \sum_{n=0}^\infty r^n = \frac{1}{1-r} . \end{equation*}

Details of the proof are left as an exercise. The proof consists of showing

\begin{equation*} \sum_{n=0}^{k-1} r^n = \frac{1-r^k}{1-r} , \end{equation*}

and then taking the limit as \(k\) goes to \(\infty\text{.}\) The geometric series is one of the most important series, and in fact it is one of the few series for which we can explicitly find the limit.
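As a quick numerical sanity check (an illustration only, not part of the proof), the finite geometric sum formula can be verified for a sample value of \(r\):

```python
# Check the finite geometric sum: sum_{n=0}^{k-1} r^n == (1 - r^k) / (1 - r).
r, k = 0.5, 20
lhs = sum(r**n for n in range(k))
rhs = (1 - r**k) / (1 - r)
print(abs(lhs - rhs))  # agreement up to floating-point rounding
```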

As for sequences, we can talk about a tail of a series.

Proposition 2.5.6.

For \(M \in \N\text{,}\) the series \(\sum_{n=1}^\infty x_n\) converges if and only if \(\sum_{n=M}^\infty x_n\) converges.

Proof: We look at partial sums of the two series (for \(k \geq M\))

\begin{equation*} \sum_{n=1}^{k} x_n = \left( \sum_{n=1}^{M-1} x_n \right) + \sum_{n=M}^{k} x_n . \end{equation*}

Note that \(\sum_{n=1}^{M-1} x_n\) is a fixed number. Use Proposition 2.2.5 to finish the proof.

Subsection 2.5.2 Cauchy series

Definition 2.5.7.

A series \(\sum x_n\) is said to be Cauchy or a Cauchy series if the sequence of partial sums \(\{ s_n \}\) is a Cauchy sequence.

A sequence of real numbers converges if and only if it is Cauchy. Therefore, a series is convergent if and only if it is Cauchy. The series \(\sum x_n\) is Cauchy if for every \(\epsilon > 0\text{,}\) there exists an \(M \in \N\text{,}\) such that for every \(n \geq M\) and \(k \geq M\text{,}\) we have

\begin{equation*} \abs{ \left( \sum_{j=1}^k x_j \right) - \left( \sum_{j=1}^n x_j \right) } < \epsilon . \end{equation*}

Without loss of generality we assume \(n < k\text{.}\) Then we write

\begin{equation*} \abs{ \left( \sum_{j=1}^k x_j \right) - \left( \sum_{j=1}^n x_j \right) } = \abs{ \sum_{j={n+1}}^k x_j } < \epsilon . \end{equation*}

We have proved the following simple proposition.

Proposition 2.5.8.

The series \(\sum x_n\) is Cauchy if and only if for every \(\epsilon > 0\text{,}\) there exists an \(M \in \N\) such that for every \(n \geq M\) and every \(k > n\text{,}\) we have

\begin{equation*} \abs{ \sum_{j={n+1}}^k x_j } < \epsilon . \end{equation*}

Subsection 2.5.3 Basic properties


Proposition 2.5.9.

If the series \(\sum x_n\) converges, then \(\lim\limits_{n\to\infty} x_n = 0\text{.}\)

Proof: Let \(\epsilon > 0\) be given. As \(\sum x_n\) is convergent, it is Cauchy. Thus we find an \(M\) such that for every \(n \geq M\text{,}\) we have

\begin{equation*} \epsilon > \abs{ \sum_{j={n+1}}^{n+1} x_j } = \abs{ x_{n+1} } . \end{equation*}

Hence for every \(n \geq M+1\text{,}\) we have \(\abs{x_{n}} < \epsilon\text{.}\)

Example 2.5.10.

If \(r \geq 1\) or \(r \leq -1\text{,}\) then the geometric series \(\sum_{n=0}^\infty r^n\) diverges.

Proof: \(\abs{r^n} = \abs{r}^n \geq 1^n = 1\text{.}\) So the terms do not go to zero and the series cannot converge.

So if a series converges, the terms of the series go to zero. The implication, however, goes only one way. Let us give an example.

Example 2.5.11.

The series \(\sum \frac{1}{n}\) diverges (despite the fact that \(\lim \frac{1}{n} = 0\)). This is the famous harmonic series.

Proof: We will show that the sequence of partial sums is unbounded, and hence cannot converge. Write the partial sums \(s_n\) for \(n = 2^k\) as:

\begin{equation*} \begin{aligned} s_1 & = 1 , \\ s_2 & = \left( 1 \right) + \left( \frac{1}{2} \right) , \\ s_4 & = \left( 1 \right) + \left( \frac{1}{2} \right) + \left( \frac{1}{3} + \frac{1}{4} \right) , \\ s_8 & = \left( 1 \right) + \left( \frac{1}{2} \right) + \left( \frac{1}{3} + \frac{1}{4} \right) + \left( \frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8} \right) , \\ & ~~ \vdots \\ s_{2^k} & = 1 + \sum_{j=1}^k \left( \sum_{m=2^{j-1}+1}^{2^j} \frac{1}{m} \right) . \end{aligned} \end{equation*}

Notice \(\nicefrac{1}{3} + \nicefrac{1}{4} \geq \nicefrac{1}{4} + \nicefrac{1}{4} = \nicefrac{1}{2}\) and \(\nicefrac{1}{5} + \nicefrac{1}{6} + \nicefrac{1}{7} + \nicefrac{1}{8} \geq \nicefrac{1}{8} + \nicefrac{1}{8} + \nicefrac{1}{8} + \nicefrac{1}{8} = \nicefrac{1}{2}\text{.}\) More generally

\begin{equation*} \sum_{m=2^{k-1}+1}^{2^k} \frac{1}{m} \geq \sum_{m=2^{k-1}+1}^{2^k} \frac{1}{2^k} = (2^{k-1}) \frac{1}{2^k} = \frac{1}{2} . \end{equation*}


Therefore,

\begin{equation*} s_{2^k} = 1 + \sum_{j=1}^k \left( \sum_{m=2^{j-1}+1}^{2^j} \frac{1}{m} \right) \geq 1 + \sum_{j=1}^k \frac{1}{2} = 1 + \frac{k}{2} . \end{equation*}

As \(\{ \frac{k}{2} \}\) is unbounded by the Archimedean property, that means that \(\{ s_{2^k} \}\) is unbounded, and therefore \(\{ s_n \}\) is unbounded. Hence \(\{ s_n \}\) diverges, and consequently \(\sum \frac{1}{n}\) diverges.
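The lower bound \(s_{2^k} \geq 1 + \nicefrac{k}{2}\) from the proof can also be observed numerically; this Python sketch (illustration only, it proves nothing) checks it for small \(k\):

```python
# Partial sums of the harmonic series: s_n = 1 + 1/2 + ... + 1/n.
def harmonic(n):
    return sum(1 / m for m in range(1, n + 1))

# The proof's estimate: s_{2^k} >= 1 + k/2.
for k in range(1, 11):
    assert harmonic(2**k) >= 1 + k / 2
```

The growth is painfully slow (logarithmic in \(n\)), which is why the divergence is not obvious from computing a few partial sums.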

Convergent series are linear. That is, we can multiply them by constants and add them, and these operations are done term by term.

Proposition 2.5.12. (Linearity of series)

Let \(\alpha \in \R\) and let \(\sum x_n\) and \(\sum y_n\) be convergent series. Then

  1. \(\sum \alpha x_n\) is a convergent series and \(\displaystyle \sum_{n=1}^\infty \alpha x_n = \alpha \sum_{n=1}^\infty x_n \text{.}\)

  2. \(\sum ( x_n + y_n )\) is a convergent series and \(\displaystyle \sum_{n=1}^\infty ( x_n + y_n ) = \left( \sum_{n=1}^\infty x_n \right) + \left( \sum_{n=1}^\infty y_n \right) \text{.}\)

Proof: For the first item, we simply write the \(k\)th partial sum

\begin{equation*} \sum_{n=1}^k \alpha x_n = \alpha \left( \sum_{n=1}^k x_n \right) . \end{equation*}

We look at the right-hand side and note that the constant multiple of a convergent sequence is convergent. Hence, we take the limit of both sides to obtain the result.

For the second item we also look at the \(k\)th partial sum

\begin{equation*} \sum_{n=1}^k ( x_n + y_n ) = \left( \sum_{n=1}^k x_n \right) + \left( \sum_{n=1}^k y_n \right) . \end{equation*}

We look at the right-hand side and note that the sum of convergent sequences is convergent. Hence, we take the limit of both sides to obtain the proposition.

An example of a useful application of the first item is the following formula. If \(\abs{r} < 1\) and \(j \in \N\text{,}\) then

\begin{equation*} \sum_{n=j}^\infty r^n = \frac{r^j}{1-r} . \end{equation*}

The formula follows by using the geometric series and multiplying by \(r^j\text{:}\)

\begin{equation*} r^j \sum_{n=0}^\infty r^n = \sum_{n=0}^\infty r^{n+j} = \sum_{n=j}^\infty r^n . \end{equation*}
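A numerical check of the tail formula (illustration only; the infinite sum is truncated at a large index \(N\), chosen by us):

```python
# Approximate sum_{n=j}^infty r^n by truncating at N, and compare with r^j / (1 - r).
r, j, N = 0.3, 4, 200
tail = sum(r**n for n in range(j, N + 1))
closed = r**j / (1 - r)
print(tail, closed)  # the truncation error is of size r^{N+1}/(1-r), negligible here
```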

Multiplying series is not as simple as adding, see the next section. It is not true, of course, that we multiply term by term. That strategy does not work even for finite sums: \((a+b)(c+d) \not= ac+bd\text{.}\)

Subsection 2.5.4 Absolute convergence

As monotone sequences are easier to work with than arbitrary sequences, it is usually easier to work with series \(\sum x_n\text{,}\) where \(x_n \geq 0\) for all \(n\text{.}\) The sequence of partial sums is then monotone increasing and converges if it is bounded above. Let us formalize this statement as a proposition.

Proposition 2.5.13.

If \(x_n \geq 0\) for all \(n\text{,}\) then \(\sum x_n\) converges if and only if the sequence of partial sums is bounded above.

As the limit of a monotone increasing sequence is the supremum, when \(x_n \geq 0\) for all \(n\text{,}\) we have the inequality

\begin{equation*} \sum_{n=1}^k x_n \leq \sum_{n=1}^\infty x_n . \end{equation*}

If we allow infinite limits, the inequality still holds even when the series diverges to infinity, although in that case it is not terribly useful.

We will see that the following common criterion for convergence of series has big implications for how the series can be manipulated.

Definition 2.5.14.

A series \(\sum x_n\) converges absolutely if the series \(\sum \abs{x_n}\) converges. If a series converges, but does not converge absolutely, we say it is conditionally convergent.


Proposition 2.5.15.

If the series \(\sum x_n\) converges absolutely, then it converges.

Proof: A series is convergent if and only if it is Cauchy. Hence suppose \(\sum \abs{x_n}\) is Cauchy. That is, for every \(\epsilon > 0\text{,}\) there exists an \(M\) such that for all \(k \geq M\) and all \(n > k\text{,}\) we have

\begin{equation*} \sum_{j=k+1}^n \abs{x_j} = \abs{ \sum_{j=k+1}^n \abs{x_j} } < \epsilon . \end{equation*}

We apply the triangle inequality for a finite sum to obtain

\begin{equation*} \abs{ \sum_{j=k+1}^n x_j } \leq \sum_{j=k+1}^n \abs{x_j} < \epsilon . \end{equation*}

Hence \(\sum x_n\) is Cauchy, and therefore it converges.

If \(\sum x_n\) converges absolutely, the limits of \(\sum x_n\) and \(\sum \abs{x_n}\) are generally different. Computing one does not help us compute the other. However, the computation above leads to a useful inequality for absolutely convergent series, a series version of the triangle inequality, a proof of which we leave as an exercise:

\begin{equation*} \abs{ \sum_{j=1}^\infty x_j } \leq \sum_{j=1}^\infty \abs{x_j} . \end{equation*}
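This inequality can be illustrated numerically with the absolutely convergent series \(\sum {(-1)}^n / 2^n\) (truncated partial sums; illustration only):

```python
# For x_n = (-1)^n / 2^n: |sum x_n| <= sum |x_n|.
N = 60
xs = [(-1) ** n / 2**n for n in range(1, N + 1)]
lhs = abs(sum(xs))             # approximately |(-1/2)/(1+1/2)| = 1/3
rhs = sum(abs(x) for x in xs)  # approximately sum 1/2^n = 1
assert lhs <= rhs
```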

Absolutely convergent series have many wonderful properties. For example, absolutely convergent series can be rearranged arbitrarily, or we can multiply such series together easily. Conditionally convergent series on the other hand often do not behave as one would expect. See the next section.

We leave as an exercise to show that

\begin{equation*} \sum_{n=1}^\infty \frac{{(-1)}^n}{n} \end{equation*}

converges, although the reader should finish this section before trying. On the other hand, we proved

\begin{equation*} \sum_{n=1}^\infty \frac{1}{n} \end{equation*}

diverges. Therefore, \(\sum \frac{{(-1)}^n}{n}\) is a conditionally convergent series.

Subsection 2.5.5 Comparison test and the \(p\)-series

We noted above that for a series to converge the terms not only have to go to zero, but they have to go to zero “fast enough.” If we know about convergence of a certain series, we can use the following comparison test to see if the terms of another series go to zero “fast enough.”


Proposition 2.5.16. (Comparison test)

Suppose \(0 \leq x_n \leq y_n\) for all \(n \in \N\text{.}\)

  1. If \(\sum y_n\) converges, then so does \(\sum x_n\text{.}\)

  2. If \(\sum x_n\) diverges, then so does \(\sum y_n\text{.}\)

Proof: As the terms of the series are all nonnegative, the sequences of partial sums are both monotone increasing. Since \(x_n \leq y_n\) for all \(n\text{,}\) the partial sums satisfy for all \(k\)

\begin{equation} \sum_{n=1}^k x_n \leq \sum_{n=1}^k y_n .\tag{2.1} \end{equation}

If the series \(\sum y_n\) converges, the partial sums for the series are bounded. Therefore, the right-hand side of (2.1) is bounded for all \(k\text{;}\) there exists some \(B \in \R\) such that \(\sum_{n=1}^k y_n \leq B\) for all \(k\text{,}\) and so

\begin{equation*} \sum_{n=1}^k x_n \leq \sum_{n=1}^k y_n \leq B. \end{equation*}

Hence the partial sums for \(\sum x_n\) are also bounded. Since the partial sums are a monotone increasing sequence they are convergent. The first item is thus proved.

On the other hand if \(\sum x_n\) diverges, the sequence of partial sums must be unbounded since it is monotone increasing. That is, the partial sums for \(\sum x_n\) are eventually bigger than any real number. Putting this together with (2.1) we see that for every \(B \in \R\text{,}\) there is a \(k\) such that

\begin{equation*} B \leq \sum_{n=1}^k x_n \leq \sum_{n=1}^k y_n . \end{equation*}

Hence the partial sums for \(\sum y_n\) are also unbounded, and \(\sum y_n\) also diverges.

A useful series to use with the comparison test is the \(p\)-series.

Proposition 2.5.17. (\(p\)-series or the \(p\)-test)

For \(p \in \R\text{,}\) the series

\begin{equation*} \sum_{n=1}^\infty \frac{1}{n^p} \end{equation*}

converges if and only if \(p > 1\text{.}\)

Proof: First suppose \(p \leq 1\text{.}\) As \(n \geq 1\text{,}\) we have \(\frac{1}{n^p} \geq \frac{1}{n}\text{.}\) Since \(\sum \frac{1}{n}\) diverges, \(\sum \frac{1}{n^p}\) must diverge for all \(p \leq 1\) by the comparison test.

Now suppose \(p > 1\text{.}\) We proceed as we did for the harmonic series, but instead of showing that the sequence of partial sums is unbounded, we show that it is bounded. The terms of the series are positive, so the sequence of partial sums is monotone increasing and converges if it is bounded above. Let \(s_n\) denote the \(n\)th partial sum.

\begin{equation*} \begin{aligned} s_1 & = 1 , \\ s_3 & = \left( 1 \right) + \left( \frac{1}{2^p} + \frac{1}{3^p} \right) , \\ s_7 & = \left( 1 \right) + \left( \frac{1}{2^p} + \frac{1}{3^p} \right) + \left( \frac{1}{4^p} + \frac{1}{5^p} + \frac{1}{6^p} + \frac{1}{7^p} \right) , \\ & ~~ \vdots \\ s_{2^k - 1} &= 1 + \sum_{j=1}^{k-1} \left( \sum_{m=2^j}^{2^{j+1}-1} \frac{1}{m^p} \right) . \end{aligned} \end{equation*}

Instead of estimating from below, we estimate from above. As \(p\) is positive, \(2^p < 3^p\text{,}\) and hence \(\frac{1}{2^p} + \frac{1}{3^p} < \frac{1}{2^p} + \frac{1}{2^p}\text{.}\) Similarly, \(\frac{1}{4^p} + \frac{1}{5^p} + \frac{1}{6^p} + \frac{1}{7^p} < \frac{1}{4^p} + \frac{1}{4^p} + \frac{1}{4^p} + \frac{1}{4^p}\text{.}\) Therefore, for all \(k \geq 2\text{,}\)

\begin{equation*} \begin{split} s_{2^k-1} & = 1+ \sum_{j=1}^{k-1} \left( \sum_{m=2^{j}}^{2^{j+1}-1} \frac{1}{m^p} \right) \\ & < 1+ \sum_{j=1}^{k-1} \left( \sum_{m=2^{j}}^{2^{j+1}-1} \frac{1}{{(2^j)}^p} \right) \\ & = 1+ \sum_{j=1}^{k-1} \left( \frac{2^j}{{(2^j)}^p} \right) \\ & = 1+ \sum_{j=1}^{k-1} {\left( \frac{1}{2^{p-1}} \right)}^j . \end{split} \end{equation*}

As \(p > 1\text{,}\) we have \(\frac{1}{2^{p-1}} < 1\text{.}\) Proposition 2.5.5 says that

\begin{equation*} \sum_{j=1}^\infty {\left( \frac{1}{2^{p-1}} \right)}^j \end{equation*}

converges. Thus,

\begin{equation*} s_{2^k-1} < 1+ \sum_{j=1}^{k-1} {\left( \frac{1}{2^{p-1}} \right)}^j \leq 1+ \sum_{j=1}^\infty {\left( \frac{1}{2^{p-1}} \right)}^j . \end{equation*}

For every \(n\) there is a \(k \geq 2\) such that \(n \leq 2^k-1\text{,}\) and as \(\{ s_n \}\) is a monotone sequence, \(s_n \leq s_{2^k-1}\text{.}\) So for all \(n\text{,}\)

\begin{equation*} s_n < 1+ \sum_{j=1}^\infty {\left( \frac{1}{2^{p-1}} \right)}^j . \end{equation*}

Thus the sequence of partial sums is bounded, and the series converges.

Neither the \(p\)-series test nor the comparison test tells us what the sum converges to. They only tell us that a limit of the partial sums exists. For instance, while we know that \(\sum \nicefrac{1}{n^2}\) converges, it is far harder to find that the limit is \(\nicefrac{\pi^2}{6}\text{.}\) If we treat \(\sum \nicefrac{1}{n^p}\) as a function of \(p\text{,}\) we get the so-called Riemann \(\zeta\) function. Understanding the behavior of this function is tied to one of the most famous unsolved problems in mathematics today, and it has applications in seemingly unrelated areas such as modern cryptography.

Example 2.5.18.

The series \(\sum \frac{1}{n^2+1}\) converges.

Proof: First, \(\frac{1}{n^2+1} < \frac{1}{n^2}\) for all \(n \in \N\text{.}\) The series \(\sum \frac{1}{n^2}\) converges by the \(p\)-series test. Therefore, by the comparison test, \(\sum \frac{1}{n^2+1}\) converges.

Subsection 2.5.6 Ratio test

Suppose \(r > 0\text{.}\) The ratio of two subsequent terms in the geometric series \(\sum r^n\) is \(\frac{r^{n+1}}{r^n} = r\text{,}\) and the series converges whenever \(r < 1\text{.}\) Just as for sequences, this fact generalizes to arbitrary series, as long as we have such a ratio “in the limit.” We then compare the tail of a series to the geometric series.

Proposition 2.5.19. (Ratio test)

Let \(\sum x_n\) be a series, \(x_n \not= 0\) for all \(n\text{,}\) and suppose the limit

\begin{equation*} L := \lim_{n\to\infty} \frac{\abs{x_{n+1}}}{\abs{x_n}} \end{equation*}

exists.

  1. If \(L < 1\text{,}\) then \(\sum x_n\) converges absolutely.

  2. If \(L > 1\text{,}\) then \(\sum x_n\) diverges.

Although the test as stated is often sufficient, it can be strengthened a bit, see Exercise 2.5.6.


Proof: If \(L > 1\text{,}\) then Lemma 2.2.12 says that the sequence \(\{ x_n \}\) diverges. Since it is a necessary condition for the convergence of series that the terms go to zero, we know that \(\sum x_n\) must diverge.

Thus suppose \(L < 1\text{.}\) We will argue that \(\sum \abs{x_n}\) must converge. The proof is similar to that of Lemma 2.2.12. Of course \(L \geq 0\text{.}\) Pick \(r\) such that \(L < r < 1\text{.}\) As \(r-L > 0\text{,}\) there exists an \(M \in \N\) such that for all \(n \geq M\text{,}\)

\begin{equation*} \abs{\frac{\abs{x_{n+1}}}{\abs{x_n}} - L} < r-L . \end{equation*}


In other words, for all \(n \geq M\text{,}\)

\begin{equation*} \frac{\abs{x_{n+1}}}{\abs{x_n}} < r . \end{equation*}

For \(n > M\) (that is for \(n \geq M+1\)), write

\begin{equation*} \abs{x_n} = \abs{x_M} \frac{\abs{x_{M+1}}}{\abs{x_{M}}} \frac{\abs{x_{M+2}}}{\abs{x_{M+1}}} \cdots \frac{\abs{x_{n}}}{\abs{x_{n-1}}} < \abs{x_M} r r \cdots r = \abs{x_M} r^{n-M} = (\abs{x_M} r^{-M}) r^n . \end{equation*}

For \(k > M\text{,}\) write the partial sum as

\begin{equation*} \begin{split} \sum_{n=1}^k \abs{x_n} & = \left(\sum_{n=1}^{M} \abs{x_n} \right) + \left(\sum_{n=M+1}^{k} \abs{x_n} \right) \\ & < \left(\sum_{n=1}^{M} \abs{x_n} \right) + \left(\sum_{n=M+1}^{k} (\abs{x_M} r^{-M}) r^n \right) \\ & = \left(\sum_{n=1}^{M} \abs{x_n} \right) + (\abs{x_M} r^{-M}) \left( \sum_{n=M+1}^{k} r^n \right) . \end{split} \end{equation*}

As \(0 < r < 1\text{,}\) the geometric series \(\sum_{n=0}^{\infty} r^n\) converges, so \(\sum_{n=M+1}^{\infty} r^n\) converges as well. We take the limit as \(k\) goes to infinity on the right-hand side above to obtain

\begin{equation*} \begin{split} \sum_{n=1}^k \abs{x_n} & < \left(\sum_{n=1}^{M} \abs{x_n} \right) + (\abs{x_M} r^{-M}) \left( \sum_{n=M+1}^{k} r^n \right) \\ & \leq \left(\sum_{n=1}^{M} \abs{x_n} \right) + (\abs{x_M} r^{-M}) \left( \sum_{n=M+1}^{\infty} r^n \right) . \end{split} \end{equation*}

The right-hand side is a number that does not depend on \(k\text{.}\) Hence the sequence of partial sums of \(\sum \abs{x_n}\) is bounded and \(\sum \abs{x_n}\) is convergent. Thus \(\sum x_n\) is absolutely convergent.

Example 2.5.20.

The series

\begin{equation*} \sum_{n=1}^\infty \frac{2^n}{n!} \end{equation*}

converges absolutely.

Proof: We write

\begin{equation*} \lim_{n\to\infty} \frac{2^{(n+1)}/(n+1)!}{2^n / n!} = \lim_{n\to\infty} \frac{2}{n+1} = 0 . \end{equation*}

Therefore, the series converges absolutely by the ratio test.
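The ratio computation in the example can be spot-checked numerically (illustration only):

```python
import math

# Terms x_n = 2^n / n!; consecutive ratios equal 2 / (n + 1) and tend to 0.
def x(n):
    return 2**n / math.factorial(n)

ratios = [x(n + 1) / x(n) for n in range(1, 50)]
# The first ratio is 2/2 = 1; later ratios shrink like 2/(n + 1).
print(ratios[0], ratios[-1])
```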

Subsection 2.5.7 Exercises

Exercise 2.5.1.

Suppose the \(k\)th partial sum of \(\displaystyle \sum_{n=1}^\infty x_n\) is \(s_k = \frac{k}{k+1}\text{.}\) Find the series, that is find \(x_n\text{,}\) prove that the series converges, and then find the limit.

Exercise 2.5.2.

Prove Proposition 2.5.5, that is for \(-1 < r < 1\) prove

\begin{equation*} \sum_{n=0}^\infty r^n = \frac{1}{1-r} . \end{equation*}

Hint: See Example 0.3.8.

Exercise 2.5.3.

Decide the convergence or divergence of the following series.

a) \(\displaystyle \sum_{n=1}^\infty \frac{3}{9n+1}\)        b) \(\displaystyle \sum_{n=1}^\infty \frac{1}{2n-1}\)        c) \(\displaystyle \sum_{n=1}^\infty \frac{{(-1)}^n}{n^2}\)        d) \(\displaystyle \sum_{n=1}^\infty \frac{1}{n(n+1)}\)        e) \(\displaystyle \sum_{n=1}^\infty n e^{-n^2}\)

Exercise 2.5.4.

  1. Prove that if \(\displaystyle \sum_{n=1}^\infty x_n\) converges, then \(\displaystyle \sum_{n=1}^\infty ( x_{2n} + x_{2n+1} )\) also converges.

  2. Find an explicit example where the converse does not hold.

Exercise 2.5.5.

For \(j=1,2,\ldots,n\text{,}\) let \(\{ x_{j,k} \}_{k=1}^\infty\) denote \(n\) sequences. Suppose that for each \(j\text{,}\)

\begin{equation*} \sum_{k=1}^\infty x_{j,k} \end{equation*}

is convergent. Prove

\begin{equation*} \sum_{j=1}^n \left( \sum_{k=1}^\infty x_{j,k} \right) = \sum_{k=1}^\infty \left( \sum_{j=1}^n x_{j,k} \right) . \end{equation*}

Exercise 2.5.6.

Prove the following stronger version of the ratio test: Let \(\sum x_n\) be a series.

  1. If there is an \(N\) and a \(\rho < 1\) such that \(\frac{\abs{x_{n+1}}}{\abs{x_n}} < \rho\) for all \(n \geq N\text{,}\) then the series converges absolutely. (Remark: Equivalently the condition can be stated as \(\limsup_{n\to\infty} \frac{\abs{x_{n+1}}}{\abs{x_n}} < 1\text{.}\))

  2. If there is an \(N\) such that \(\frac{\abs{x_{n+1}}}{\abs{x_n}} \geq 1\) for all \(n \geq N\text{,}\) then the series diverges.

Exercise 2.5.7.

(Challenging)   Suppose \(\{ x_n \}\) is a decreasing sequence and \(\sum x_n\) converges. Prove \(\displaystyle \lim_{n\to\infty} n x_n = 0\text{.}\)

Exercise 2.5.8.

Show that \(\displaystyle \sum_{n=1}^\infty \frac{{(-1)}^n}{n}\) converges. Hint: Consider the sum of two subsequent entries.

Exercise 2.5.9.

  1. Prove that if \(\sum x_n\) and \(\sum y_n\) converge absolutely, then \(\sum x_ny_n\) converges absolutely.

  2. Find an explicit example where the converse does not hold.

  3. Find an explicit example where all three series are absolutely convergent, are not just finite sums, and \((\sum x_n)(\sum y_n) \not= \sum x_ny_n\text{.}\) That is, show that series are not multiplied term-by-term.

Exercise 2.5.10.

Prove the triangle inequality for series: If \(\sum x_n\) converges absolutely, then

\begin{equation*} \abs{\sum_{n=1}^\infty x_n} \leq \sum_{n=1}^\infty \abs{x_n} . \end{equation*}

Exercise 2.5.11.

Prove the limit comparison test. That is, prove that if \(a_n > 0\) and \(b_n > 0\) for all \(n\text{,}\) and

\begin{equation*} 0 < \lim_{n\to\infty} \frac{a_n}{b_n} < \infty , \end{equation*}

then either \(\sum a_n\) and \(\sum b_n\) both converge or both diverge.

Exercise 2.5.12.

Let \(x_n := \sum_{j=1}^n \nicefrac{1}{j}\text{.}\) Show that for every \(k\text{,}\) we get \(\displaystyle \lim_{n\to\infty} \abs{x_{n+k}-x_n} = 0\text{,}\) yet \(\{ x_n \}\) is not Cauchy.

Exercise 2.5.13.

Let \(s_k\) be the \(k\)th partial sum of \(\sum x_n\text{.}\)

  1. Suppose that there exists an \(m \in \N\) such that \(\displaystyle \lim_{k\to\infty} s_{mk}\) exists and \(\lim\, x_n = 0\text{.}\) Show that \(\sum x_n\) converges.

  2. Find an example where \(\displaystyle \lim_{k\to\infty} s_{2k}\) exists and \(\lim\, x_n \not= 0\) (and therefore \(\sum x_n\) diverges).

  3. (Challenging) Find an example where \(\lim\, x_n = 0\text{,}\) and there exists a subsequence \(\{ s_{k_j} \}\) such that \(\displaystyle \lim_{j\to\infty} s_{k_j}\) exists, but \(\sum x_n\) still diverges.

Exercise 2.5.14.

Suppose \(\sum x_n\) converges and \(x_n \geq 0\) for all \(n\text{.}\) Prove that \(\sum x_n^2\) converges.

Exercise 2.5.15.

(Challenging)   Suppose \(\{ x_n\}\) is a decreasing sequence of positive numbers. The proof of convergence/divergence for the \(p\)-series generalizes. Prove the so-called Cauchy condensation principle:

\begin{equation*} \sum_{n=1}^\infty x_n \qquad \text{converges if and only if} \qquad \sum_{n=1}^\infty 2^n x_{2^n} \qquad \text{converges}. \end{equation*}

Exercise 2.5.16.

Use the Cauchy condensation principle (see Exercise 2.5.15) to decide the convergence of

a) \(\displaystyle \sum \frac{\ln n}{n^2}\)        b) \(\displaystyle \sum \frac{1}{n \ln n}\)        c) \(\displaystyle \sum \frac{1}{n {(\ln n)}^2}\)        d) \(\displaystyle \sum \frac{1}{n (\ln n ){(\ln \ln n)}^2}\)

For the series to be well-defined you need to start some of the series at \(n=2\text{.}\) Note that only the tails of some of these series satisfy the hypotheses of the principle; you should argue why that is sufficient.

Hint: Feel free to use the identity \(\ln (2^n) = n \ln 2\text{.}\)

Exercise 2.5.17.

(Challenging)   Prove Abel's theorem:

Theorem. Suppose \(\sum x_n\) is a series whose partial sums are a bounded sequence, \(\{ \lambda_n \}\) is a sequence with \(\lim \lambda_n = 0\text{,}\) and \(\sum \abs{ \lambda_{n+1} - \lambda_n }\) is convergent. Then \(\sum \lambda_n x_n\) is convergent.

The divergence of the harmonic series was known long before the theory of series was made rigorous. The proof we give is the earliest proof and was given by Nicole Oresme (1323?–1382).
We have not yet defined \(x^p\) for \(x > 0\) and an arbitrary \(p \in \R\text{.}\) The definition is \(x^p := \exp ( p \ln x )\text{.}\) We will define the logarithm and the exponential in Section 5.4. For now you can just think of rational \(p\text{,}\) where \(x^{k/m} = {(x^{1/m})}^{k}\text{.}\) See also Exercise 1.2.17.
Demonstration of this fact is what made the Swiss mathematician Leonhard Paul Euler (1707–1783) famous.