
Section 2.1 Sequences and limits

Note: 2.5 lectures

Analysis is essentially about taking limits. The most basic type of a limit is a limit of a sequence of real numbers. We have already seen sequences used informally. Let us give the formal definition.

Definition 2.1.1.

A sequence (of real numbers) is a function \(x \colon \N \to \R\text{.}\) Instead of \(x(n)\text{,}\) we usually denote the \(n\)th element in the sequence by \(x_n\text{.}\) We use the notation \(\{ x_n \}\text{,}\) or more precisely

\begin{equation*} \{ x_n \}_{n=1}^\infty, \end{equation*}

to denote a sequence.

A sequence \(\{ x_n \}\) is bounded if there exists a \(B \in \R\) such that

\begin{equation*} \abs{x_n} \leq B \qquad \text{for all } n \in \N. \end{equation*}

In other words, the sequence \(\{x_n\}\) is bounded whenever the set \(\{ x_n : n \in \N \}\) is bounded, or equivalently when it is bounded as a function.

When we need to give a concrete sequence we often give each term as a formula in terms of \(n\text{.}\) For example, \(\{ \nicefrac{1}{n} \}_{n=1}^\infty\text{,}\) or simply \(\{ \nicefrac{1}{n} \}\text{,}\) stands for the sequence \(1, \nicefrac{1}{2}, \nicefrac{1}{3}, \nicefrac{1}{4}, \nicefrac{1}{5}, \ldots\text{.}\) The sequence \(\{ \nicefrac{1}{n} \}\) is a bounded sequence (\(B=1\) suffices). On the other hand the sequence \(\{ n \}\) stands for \(1,2,3,4,\ldots\text{,}\) and this sequence is not bounded (why?).
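To check the boundedness claim for \(\{ \nicefrac{1}{n} \}\) directly against the definition, note that for every \(n \in \N\text{,}\)

\begin{equation*} \abs{\nicefrac{1}{n}} = \frac{1}{n} \leq 1 , \end{equation*}

so \(B = 1\) indeed works.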

While the notation for a sequence is similar 1  to that of a set, the notions are distinct. For example, the sequence \(\{ {(-1)}^n \}\) is the sequence \(-1,1,-1,1,-1,1,\ldots\text{,}\) whereas the set of values, the range of the sequence, is just the set \(\{ -1, 1 \}\text{.}\) We can write this set as \(\{ {(-1)}^n : n \in \N \}\text{.}\) When ambiguity can arise, we use the words sequence or set to distinguish the two concepts.

Another example of a sequence is the so-called constant sequence. That is a sequence \(\{ c \} = c,c,c,c,\ldots\) consisting of a single constant \(c \in \R\) repeating indefinitely.

We now get to the idea of a limit of a sequence. We will see in Proposition 2.1.6 that the notation below is well-defined. That is, if a limit exists, then it is unique. So it makes sense to talk about the limit of a sequence.

Definition 2.1.2.

A sequence \(\{ x_n \}\) is said to converge to a number \(x \in \R\) if for every \(\epsilon > 0\text{,}\) there exists an \(M \in \N\) such that \(\abs{x_n - x} < \epsilon\) for all \(n \geq M\text{.}\) The number \(x\) is said to be the limit of \(\{ x_n \}\text{.}\) We write

\begin{equation*} \lim_{n\to \infty} x_n := x . \end{equation*}

A sequence that converges is said to be convergent. Otherwise, we say the sequence diverges or that it is divergent.
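Purely as shorthand (the text itself always spells the quantifiers out in words), the definition can be abbreviated symbolically:

\begin{equation*} \lim_{n\to \infty} x_n = x \quad \iff \quad \forall \, \epsilon > 0 \ \ \exists \, M \in \N \ \ \forall \, n \geq M : \ \abs{x_n - x} < \epsilon . \end{equation*}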

It is good to know intuitively what a limit means. It means that eventually every number in the sequence is close to the number \(x\text{.}\) More precisely, we can get arbitrarily close to the limit, provided we go far enough in the sequence. It does not mean we ever reach the limit. It is possible, and quite common, that there is no \(x_n\) in the sequence that equals the limit \(x\text{.}\) We illustrate the concept in Figure 2.1. In the figure, we first think of the sequence as a graph, since it is a function on \(\N\text{.}\) Second, we plot the same sequence as labeled points on the real line.


Figure 2.1. Illustration of convergence. On top, we show the first ten points of the sequence as a graph with \(M\) and the interval around the limit \(x\) marked. On bottom, the points of the same sequence are marked on the number line.

When we write \(\lim\, x_n = x\) for some real number \(x\text{,}\) we are saying two things: first, that \(\{ x_n \}\) is convergent, and second, that the limit is \(x\text{.}\)

The definition above is one of the most important definitions in analysis, and it is necessary to understand it perfectly. The key point in the definition is that given any \(\epsilon > 0\text{,}\) we can find an \(M\text{.}\) The \(M\) can depend on \(\epsilon\text{,}\) so we only pick an \(M\) once we know \(\epsilon\text{.}\) Let us illustrate convergence on a few examples.

Example 2.1.3.

The constant sequence \(1,1,1,1,\ldots\) is convergent and the limit is 1. For every \(\epsilon > 0\text{,}\) we pick \(M = 1\text{.}\)
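Indeed, with this choice of \(M\text{,}\) for every \(\epsilon > 0\) and every \(n \geq M\text{,}\)

\begin{equation*} \abs{x_n - 1} = \abs{1 - 1} = 0 < \epsilon . \end{equation*}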

Example 2.1.4.

Claim: The sequence \(\{ \nicefrac{1}{n} \}\) is convergent and

\begin{equation*} \lim_{n\to \infty} \frac{1}{n} = 0 . \end{equation*}

Proof: Given an \(\epsilon > 0\text{,}\) we find an \(M \in \N\) such that \(0 < \nicefrac{1}{M} < \epsilon\) (Archimedean property at work). Then for all \(n \geq M\text{,}\)

\begin{equation*} \abs{x_n - 0} = \abs{\frac{1}{n}} = \frac{1}{n} \leq \frac{1}{M} < \epsilon . \end{equation*}
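Concretely, the Archimedean property produces such an \(M\text{:}\) any \(M \in \N\) with \(M > \nicefrac{1}{\epsilon}\) works. For instance, if \(\epsilon = \nicefrac{1}{100}\text{,}\) then \(M = 101\) suffices, as

\begin{equation*} \frac{1}{M} = \frac{1}{101} < \frac{1}{100} = \epsilon . \end{equation*}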

Example 2.1.5.

The sequence \(\{ {(-1)}^n \}\) is divergent. Proof: If there were a limit \(x\text{,}\) then for \(\epsilon = \frac{1}{2}\) we expect an \(M\) that satisfies the definition. Suppose such an \(M\) exists. Then for an even \(n \geq M\) we compute

\begin{equation*} \nicefrac{1}{2} > \abs{x_n - x} = \abs{1 - x} \qquad \text{and} \qquad \nicefrac{1}{2} > \abs{x_{n+1} - x} = \abs{-1 - x} . \end{equation*}

But

\begin{equation*} 2 = \abs{1 - x - (-1 -x)} \leq \abs{1 - x} + \abs{-1 -x} < \nicefrac{1}{2} + \nicefrac{1}{2} = 1 , \end{equation*}

and that is a contradiction.

Proposition 2.1.6.

A convergent sequence has a unique limit.

The proof of this proposition exhibits a useful technique in analysis. Many proofs follow the same general scheme: we want to show that a certain quantity is zero, so we use the triangle inequality to split it into two quantities, each of which we estimate by an arbitrarily small number.

Proof.

Suppose the sequence \(\{ x_n \}\) has limits \(x\) and \(y\text{.}\) Take an arbitrary \(\epsilon > 0\text{.}\) From the definition find an \(M_1\) such that for all \(n \geq M_1\text{,}\) \(\abs{x_n-x} < \nicefrac{\epsilon}{2}\text{.}\) Similarly, find an \(M_2\) such that for all \(n \geq M_2\text{,}\) we have \(\abs{x_n-y} < \nicefrac{\epsilon}{2}\text{.}\) Now take an \(n\) such that \(n \geq M_1\) and also \(n \geq M_2\text{,}\) and estimate

\begin{equation*} \begin{split} \abs{y-x} & = \abs{x_n-x - (x_n -y)} \\ & \leq \abs{x_n-x} + \abs{x_n -y} \\ & < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon . \end{split} \end{equation*}

As \(\abs{y-x} < \epsilon\) for all \(\epsilon > 0\text{,}\) then \(\abs{y-x} = 0\) and \(y=x\text{.}\) Hence the limit (if it exists) is unique.

Proposition 2.1.7.

A convergent sequence \(\{ x_n \}\) is bounded.

Proof.

Suppose \(\{ x_n \}\) converges to \(x\text{.}\) Thus there exists an \(M \in \N\) such that for all \(n \geq M\text{,}\) we have \(\abs{x_n - x} < 1\text{.}\) Let \(B_1 := \abs{x}+1\) and note that for \(n \geq M\text{,}\)

\begin{equation*} \begin{split} \abs{x_n} & = \abs{x_n - x + x} \\ & \leq \abs{x_n - x} + \abs{x} \\ & < 1 + \abs{x} = B_1 . \end{split} \end{equation*}

The set \(\{ \abs{x_1}, \abs{x_2}, \ldots, \abs{x_{M-1}} \}\) is finite, so it has a maximum; let

\begin{equation*} B_2 := \max \{ \abs{x_1}, \abs{x_2}, \ldots, \abs{x_{M-1}} \} . \end{equation*}

Let \(B := \max \{ B_1, B_2 \}\text{.}\) Then for all \(n \in \N\text{,}\)

\begin{equation*} \abs{x_n} \leq B. \qedhere \end{equation*}

The sequence \(\{ {(-1)}^n \}\) shows that the converse does not hold. A bounded sequence is not necessarily convergent.

Example 2.1.8.

Let us show the sequence \(\left\{ \frac{n^2+1}{n^2+n} \right\}\) converges and

\begin{equation*} \lim_{n\to\infty} \frac{n^2+1}{n^2+n} = 1 . \end{equation*}

Given \(\epsilon > 0\text{,}\) find \(M \in \N\) such that \(\frac{1}{M} < \epsilon\text{.}\) Then for all \(n \geq M\text{,}\)

\begin{equation*} \begin{split} \abs{\frac{n^2+1}{n^2+n} - 1} = \abs{\frac{n^2+1 - (n^2+n)}{n^2+n}} & = \abs{\frac{1 - n}{n^2+n}} \\ & = \frac{n-1}{n^2+n} \\ & \leq \frac{n}{n^2+n} = \frac{1}{n+1} \\ & \leq \frac{1}{n} \leq \frac{1}{M} < \epsilon . \end{split} \end{equation*}

Therefore, \(\lim \frac{n^2+1}{n^2+n} = 1\text{.}\) This example shows that sometimes, to obtain the estimate you need, you must throw away some information in exchange for a simpler expression.

Subsection 2.1.1 Monotone sequences

The simplest type of a sequence is a monotone sequence. Checking that a monotone sequence converges is as easy as checking that it is bounded. It is also easy to find the limit for a convergent monotone sequence, provided we can find the supremum or infimum of a countable set of numbers.

Definition 2.1.9.

A sequence \(\{ x_n \}\) is monotone increasing if \(x_n \leq x_{n+1}\) for all \(n \in \N\text{.}\) A sequence \(\{ x_n \}\) is monotone decreasing if \(x_n \geq x_{n+1}\) for all \(n \in \N\text{.}\) If a sequence is either monotone increasing or monotone decreasing, we can simply say the sequence is monotone. 2 

For example, \(\{ n \}\) is monotone increasing, \(\{ \nicefrac{1}{n} \}\) is monotone decreasing, the constant sequence \(\{ 1 \}\) is both monotone increasing and monotone decreasing, and \(\{ {(-1)}^n \}\) is not monotone. The first few terms of a sample monotone increasing sequence are shown in Figure 2.2.


Figure 2.2. First few terms of a monotone increasing sequence as a graph.

Proposition 2.1.10. Monotone convergence theorem.

A monotone sequence \(\{ x_n \}\) is bounded if and only if it is convergent.

Furthermore, if \(\{ x_n \}\) is monotone increasing and bounded, then

\begin{equation*} \lim_{n\to \infty} x_n = \sup \{ x_n : n \in \N \} . \end{equation*}

If \(\{ x_n \}\) is monotone decreasing and bounded, then

\begin{equation*} \lim_{n\to \infty} x_n = \inf \{ x_n : n \in \N \} . \end{equation*}

Proof.

Consider a monotone increasing sequence \(\{ x_n \}\text{.}\) Suppose the sequence is bounded, that is, the set \(\{ x_n : n \in \N \}\) is bounded. Let

\begin{equation*} x := \sup \{ x_n : n \in \N \} . \end{equation*}

Let \(\epsilon > 0\) be arbitrary. As \(x\) is the supremum, there must be at least one \(M \in \N\) such that \(x_{M} > x-\epsilon\) (otherwise \(x-\epsilon\) would be an upper bound smaller than the supremum). As \(\{ x_n \}\) is monotone increasing, a simple induction shows that \(x_n \geq x_{M}\) for all \(n \geq M\) (just chain \(x_M \leq x_{M+1} \leq \cdots \leq x_n\)). Hence for all \(n \geq M\text{,}\)

\begin{equation*} \abs{x_n-x} = x-x_n \leq x-x_{M} < \epsilon . \end{equation*}

Therefore, the sequence converges to \(x\text{,}\) so a bounded monotone increasing sequence converges. For the other direction, we have already proved that a convergent sequence is bounded.

The proof for monotone decreasing sequences is left as an exercise.

Example 2.1.11.

Take the sequence \(\bigl\{ \frac{1}{\sqrt{n}} \bigr\}\text{.}\)

The sequence is bounded below as \(\frac{1}{\sqrt{n}} > 0\) for all \(n \in \N\text{.}\) Let us show that it is monotone decreasing. We start with \(\sqrt{n+1} \geq \sqrt{n}\) (why is that true?). From this inequality we obtain

\begin{equation*} \frac{1}{\sqrt{n+1}} \leq \frac{1}{\sqrt{n}} . \end{equation*}

So the sequence is monotone decreasing and bounded below (hence bounded). Proposition 2.1.10 says that the sequence is convergent and

\begin{equation*} \lim_{n\to \infty} \frac{1}{\sqrt{n}} = \inf \left\{ \frac{1}{\sqrt{n}} : n \in \N \right\} . \end{equation*}

We already know that the infimum is greater than or equal to 0, as 0 is a lower bound. Take a number \(b \geq 0\) such that \(b \leq \frac{1}{\sqrt{n}}\) for all \(n\text{.}\) We square both sides to obtain

\begin{equation*} b^2 \leq \frac{1}{n} \qquad \text{for all } n \in \N. \end{equation*}

We have seen before that this implies that \(b^2 \leq 0\) (a consequence of the Archimedean property). As \(b^2 \geq 0\) as well, we have \(b^2 = 0\) and so \(b = 0\text{.}\) Hence, \(b=0\) is the greatest lower bound, and \(\lim \frac{1}{\sqrt{n}} = 0\text{.}\)
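For completeness, here is that Archimedean step spelled out: if we had \(b^2 > 0\text{,}\) the Archimedean property would give an \(n \in \N\) with \(n b^2 > 1\text{,}\) that is,

\begin{equation*} \frac{1}{n} < b^2 , \end{equation*}

contradicting \(b^2 \leq \nicefrac{1}{n}\) for all \(n \in \N\text{.}\)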

Example 2.1.12.

A word of caution: We must show that a monotone sequence is bounded in order to use Proposition 2.1.10 to conclude a sequence converges. The sequence \(\{ 1 + \nicefrac{1}{2} + \cdots + \nicefrac{1}{n} \}\) is a monotone increasing sequence that grows very slowly. We will see, once we get to series, that this sequence has no upper bound and so does not converge. It is not at all obvious that this sequence has no upper bound.

A common example of where monotone sequences arise is the following proposition. The proof is left as an exercise.

Proposition 2.1.13.

Let \(S \subset \R\) be a nonempty bounded set. Then there exist monotone sequences \(\{ x_n \}\) and \(\{ y_n \}\) such that \(x_n, y_n \in S\) and

\begin{equation*} \sup S = \lim_{n\to\infty} x_n \qquad \text{and} \qquad \inf S = \lim_{n\to\infty} y_n . \end{equation*}

Subsection 2.1.2 Tail of a sequence

Definition 2.1.14.

For a sequence \(\{ x_n \}\text{,}\) the \(K\)-tail (where \(K \in \N\)), or just the tail, of \(\{ x_n \}\) is the sequence starting at \(K+1\text{,}\) usually written as

\begin{equation*} \{ x_{n+K} \}_{n=1}^\infty \qquad \text{or} \qquad \{ x_n \}_{n=K+1}^\infty . \end{equation*}

For example, the \(4\)-tail of \(\{ \nicefrac{1}{n} \}\) is \(\nicefrac{1}{5}, \nicefrac{1}{6}, \nicefrac{1}{7}, \nicefrac{1}{8}, \ldots\text{.}\) The \(0\)-tail of a sequence is the sequence itself. The convergence and the limit of a sequence depend only on its tail.
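Written in the indexed notation of the definition, the \(4\)-tail of \(\{ \nicefrac{1}{n} \}\) above is

\begin{equation*} \{ x_{n+4} \}_{n=1}^\infty = \left\{ \frac{1}{n+4} \right\}_{n=1}^\infty = \nicefrac{1}{5}, \nicefrac{1}{6}, \nicefrac{1}{7}, \ldots . \end{equation*}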

Proposition 2.1.15.

Let \(\{ x_n \}\) be a sequence. Then the following statements are equivalent:

  i. The sequence \(\{ x_n \}\) converges.

  ii. The \(K\)-tail \(\{ x_{n+K} \}_{n=1}^\infty\) converges for all \(K \in \N\text{.}\)

  iii. The \(K\)-tail \(\{ x_{n+K} \}_{n=1}^\infty\) converges for some \(K \in \N\text{.}\)

Furthermore, if any (and hence all) of these limits exist, then for every \(K \in \N\text{,}\)

\begin{equation*} \lim_{n\to\infty} x_n = \lim_{n\to\infty} x_{n+K} . \end{equation*}

Proof.

It is clear that ii implies iii. We will therefore show first that i implies ii, and then that iii implies i. In the process we will also show that the limits are equal.

We start with i implies ii. Suppose \(\{x_n \}\) converges to some \(x \in \R\text{.}\) Let \(K \in \N\) be arbitrary, and define \(y_n := x_{n+K}\text{.}\) We wish to show that \(\{ y_n \}\) converges to \(x\text{.}\) Given an \(\epsilon > 0\text{,}\) there exists an \(M \in \N\) such that \(\abs{x-x_n} < \epsilon\) for all \(n \geq M\text{.}\) Note that \(n \geq M\) implies \(n+K \geq M\text{.}\) Therefore, for all \(n \geq M\text{,}\) we have

\begin{equation*} \abs{x-y_n} = \abs{x-x_{n+K}} < \epsilon . \end{equation*}

Consequently, \(\{ y_n \}\) converges to \(x\text{.}\)

Let us move to iii implies i. Let \(K \in \N\) be given, define \(y_n := x_{n+K}\text{,}\) and suppose that \(\{ y_n \}\) converges to \(x \in \R\text{.}\) That is, given an \(\epsilon > 0\text{,}\) there exists an \(M' \in \N\) such that \(\abs{x-y_n} < \epsilon\) for all \(n \geq M'\text{.}\) Let \(M := M'+K\text{.}\) Then \(n \geq M\) implies \(n-K \geq M'\text{.}\) Thus, whenever \(n \geq M\text{,}\) we have

\begin{equation*} \abs{x-x_n} = \abs{x-y_{n-K}} < \epsilon. \end{equation*}

Therefore, \(\{ x_n \}\) converges to \(x\text{.}\)

At the end of the day, the limit does not care about how the sequence begins; it only cares about the tail of the sequence. The beginning of the sequence may be arbitrary.

For example, the sequence defined by \(x_n := \frac{n}{n^2+16}\) is decreasing if we start at \(n=4\) (it is increasing before). That is: \(\{ x_n \} = \nicefrac{1}{17}, \nicefrac{1}{10}, \nicefrac{3}{25}, \nicefrac{1}{8}, \nicefrac{5}{41}, \nicefrac{3}{26}, \nicefrac{7}{65}, \nicefrac{1}{10}, \nicefrac{9}{97}, \nicefrac{5}{58},\ldots\text{,}\) and

\begin{equation*} \nicefrac{1}{17} < \nicefrac{1}{10} < \nicefrac{3}{25} < \nicefrac{1}{8} > \nicefrac{5}{41} > \nicefrac{3}{26} > \nicefrac{7}{65} > \nicefrac{1}{10} > \nicefrac{9}{97} > \nicefrac{5}{58} > \ldots . \end{equation*}

If we throw away the first 3 terms and look at the 3-tail, it is decreasing; the proof is left as an exercise (see Exercise 2.1.21). Since the 3-tail is monotone and bounded below by zero, it is convergent by Proposition 2.1.10, and therefore the original sequence is convergent by Proposition 2.1.15.

Subsection 2.1.3 Subsequences

It is useful to sometimes consider only some terms of a sequence. A subsequence of \(\{ x_n \}\) is a sequence that contains only some of the numbers from \(\{ x_n \}\) in the same order.

Definition 2.1.16.

Let \(\{ x_n \}\) be a sequence. Let \(\{ n_i \}\) be a strictly increasing sequence of natural numbers, that is, \(n_i < n_{i+1}\) for all \(i\) (in other words \(n_1 < n_2 < n_3 < \cdots\)). The sequence

\begin{equation*} \{ x_{n_i} \}_{i=1}^\infty \end{equation*}

is called a subsequence of \(\{ x_n \}\text{.}\)

So the subsequence is the sequence \(x_{n_1},x_{n_2},x_{n_3},\ldots\text{.}\) Consider the sequence \(\{ \nicefrac{1}{n} \}\text{.}\) The sequence \(\{ \nicefrac{1}{3n} \}\) is a subsequence. To see how these two sequences fit in the definition, take \(n_i := 3i\text{.}\) The numbers in the subsequence must come from the original sequence. So \(1,0,\nicefrac{1}{3},0, \nicefrac{1}{5},\ldots\) is not a subsequence of \(\{ \nicefrac{1}{n} \}\text{.}\) Similarly, order must be preserved. So the sequence \(1,\nicefrac{1}{3},\nicefrac{1}{2},\nicefrac{1}{5},\ldots\) is not a subsequence of \(\{ \nicefrac{1}{n} \}\text{.}\)
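Written in the notation of the definition, taking \(n_i := 3i\) gives

\begin{equation*} \{ x_{n_i} \}_{i=1}^\infty = \{ x_{3i} \}_{i=1}^\infty = \left\{ \frac{1}{3i} \right\}_{i=1}^\infty = \nicefrac{1}{3}, \nicefrac{1}{6}, \nicefrac{1}{9}, \ldots . \end{equation*}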

A tail of a sequence is one special type of a subsequence. For an arbitrary subsequence, we have the following proposition about convergence.

Proposition 2.1.17.

If \(\{ x_n \}\) is a convergent sequence, then every subsequence \(\{ x_{n_i} \}\) is also convergent, and

\begin{equation*} \lim_{i\to\infty} x_{n_i} = \lim_{n\to\infty} x_n . \end{equation*}

Proof.

Suppose \(\lim_{n\to \infty} x_n = x\text{.}\) So for every \(\epsilon > 0\) there is an \(M \in \N\) such that for all \(n \geq M\text{,}\)

\begin{equation*} \abs{x_n - x} < \epsilon . \end{equation*}

It is not hard to prove (do it!) by induction that \(n_i \geq i\text{.}\) Hence \(i \geq M\) implies \(n_i \geq M\text{.}\) Thus, for all \(i \geq M\text{,}\)

\begin{equation*} \abs{x_{n_i} - x} < \epsilon , \end{equation*}

and we are done.

Example 2.1.18.

Existence of a convergent subsequence does not imply convergence of the sequence itself. Take the sequence \(0,1,0,1,0,1,\ldots\text{.}\) That is, \(x_n = 0\) if \(n\) is odd, and \(x_n = 1\) if \(n\) is even. The sequence \(\{ x_n \}\) is divergent; however, the subsequence \(\{ x_{2n} \}\) converges to 1 and the subsequence \(\{ x_{2n+1} \}\) converges to 0. Compare Proposition 2.3.7.
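Indeed, both subsequences here are constant:

\begin{equation*} x_{2n} = 1 \qquad \text{and} \qquad x_{2n+1} = 0 \qquad \text{for all } n \in \N , \end{equation*}

so each converges trivially, just as in Example 2.1.3.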

Subsection 2.1.4 Exercises

In the following exercises, feel free to use what you know from calculus to find the limit, if it exists. But you must prove that you found the correct limit, or prove that the sequence is divergent.

Exercise 2.1.1.

Is the sequence \(\{ 3n \}\) bounded? Prove or disprove.

Exercise 2.1.2.

Is the sequence \(\{ n \}\) convergent? If so, what is the limit?

Exercise 2.1.3.

Is the sequence \(\left\{ \dfrac{{(-1)}^n}{2n} \right\}\) convergent? If so, what is the limit?

Exercise 2.1.4.

Is the sequence \(\{ 2^{-n} \}\) convergent? If so, what is the limit?

Exercise 2.1.5.

Is the sequence \(\left\{ \dfrac{n}{n+1} \right\}\) convergent? If so, what is the limit?

Exercise 2.1.6.

Is the sequence \(\left\{ \dfrac{n}{n^2+1} \right\}\) convergent? If so, what is the limit?

Exercise 2.1.7.

Let \(\{ x_n \}\) be a sequence.

  1. Show that \(\lim\, x_n = 0\) (that is, the limit exists and is zero) if and only if \(\lim \abs{x_n} = 0\text{.}\)

  2. Find an example such that \(\{ \abs{x_n} \}\) converges and \(\{ x_n \}\) diverges.

Exercise 2.1.8.

Is the sequence \(\left\{ \dfrac{2^n}{n!} \right\}\) convergent? If so, what is the limit?

Exercise 2.1.9.

Show that the sequence \(\left\{ \dfrac{1}{\sqrt[3]{n}} \right\}\) is monotone and bounded. Then use Proposition 2.1.10 to find the limit.

Exercise 2.1.10.

Show that the sequence \(\left\{ \dfrac{n+1}{n} \right\}\) is monotone and bounded. Then use Proposition 2.1.10 to find the limit.

Exercise 2.1.13.

Let \(\{ x_n \}\) be a convergent monotone sequence. Suppose there exists a \(k \in \N\) such that

\begin{equation*} \lim_{n\to \infty} x_n = x_k . \end{equation*}

Show that \(x_n = x_k\) for all \(n \geq k\text{.}\)

Exercise 2.1.14.

Find a convergent subsequence of the sequence \(\{ {(-1)}^n \}\text{.}\)

Exercise 2.1.15.

Let \(\{x_n\}\) be a sequence defined by

\begin{equation*} x_n := \begin{cases} n & \text{if } n \text{ is odd} , \\ \nicefrac{1}{n} & \text{if } n \text{ is even} . \end{cases} \end{equation*}
  1. Is the sequence bounded? (prove or disprove)

  2. Is there a convergent subsequence? If so, find it.

Exercise 2.1.16.

Let \(\{ x_n \}\) be a sequence. Suppose there are two convergent subsequences \(\{ x_{n_i} \}\) and \(\{ x_{m_i} \}\text{.}\) Suppose

\begin{equation*} \lim_{i\to\infty} x_{n_i} = a \qquad \text{and} \qquad \lim_{i\to\infty} x_{m_i} = b, \end{equation*}

where \(a \not= b\text{.}\) Prove that \(\{ x_n \}\) is not convergent, without using Proposition 2.1.17.

Exercise 2.1.17.

(Tricky)   Find a sequence \(\{ x_n \}\) such that for every \(y \in \R\text{,}\) there exists a subsequence \(\{ x_{n_i} \}\) converging to \(y\text{.}\)

Exercise 2.1.18.

(Easy)   Let \(\{ x_n \}\) be a sequence and \(x \in \R\text{.}\) Suppose for every \(\epsilon > 0\text{,}\) there is an \(M\) such that \(\abs{x_n-x} \leq \epsilon\) for all \(n \geq M\text{.}\) Show that \(\lim\, x_n = x\text{.}\)

Exercise 2.1.19.

(Easy)   Let \(\{ x_n \}\) be a sequence and \(x \in \R\) such that there exists a \(k \in \N\) such that for all \(n \geq k\text{,}\) \(x_n = x\text{.}\) Prove that \(\{ x_n \}\) converges to \(x\text{.}\)

Exercise 2.1.20.

Let \(\{ x_n \}\) be a sequence and define a sequence \(\{ y_n \}\) by \(y_{2k} := x_{k^2}\) and \(y_{2k-1} := x_k\) for all \(k \in \N\text{.}\) Prove that \(\{ x_n \}\) converges if and only if \(\{ y_n \}\) converges. Furthermore, prove that if they converge, then \(\lim\, x_n = \lim\, y_n\text{.}\)

Exercise 2.1.21.

Show that the 3-tail of the sequence defined by \(x_n := \frac{n}{n^2+16}\) is monotone decreasing. Hint: Suppose \(n \geq m \geq 4\) and consider the numerator of the expression \(x_n-x_m\text{.}\)

Exercise 2.1.22.

Suppose that \(\{ x_n \}\) is a sequence such that the subsequences \(\{ x_{2n} \}\text{,}\) \(\{ x_{2n-1} \}\text{,}\) and \(\{ x_{3n} \}\) all converge. Show that \(\{ x_n \}\) is convergent.

Exercise 2.1.23.

Suppose that \(\{ x_n \}\) is a monotone increasing sequence that has a convergent subsequence. Show that \(\{ x_n \}\) is convergent. Note: So Proposition 2.1.17 is an “if and only if” for monotone sequences.

1 [BS] use the notation \((x_n)\) to denote a sequence instead of \(\{ x_n \}\text{,}\) which is what [R2] uses. Both are common.
2 Some authors use the word monotonic.