
Section 2.2 Facts about limits of sequences

Note: 2–2.5 lectures; recursively defined sequences can safely be skipped

In this section we go over some basic results about the limits of sequences. We start by looking at how sequences interact with inequalities.

Subsection 2.2.1 Limits and inequalities

A basic lemma about limits and inequalities is the so-called squeeze lemma. It allows us to show convergence of sequences in difficult cases if we find two other simpler convergent sequences that “squeeze” the original sequence.

Lemma 2.2.1. Squeeze lemma.

Let \(\{ a_n \}\text{,}\) \(\{ b_n \}\text{,}\) and \(\{ x_n \}\) be sequences such that

\begin{equation*} a_n \leq x_n \leq b_n \quad \text{for all } n \in \N . \end{equation*}

Suppose \(\{ a_n \}\) and \(\{ b_n \}\) both converge and

\begin{equation*} \lim_{n\to\infty} a_n = \lim_{n\to\infty} b_n . \end{equation*}

Then \(\{ x_n \}\) converges and

\begin{equation*} \lim_{n\to\infty} x_n = \lim_{n\to\infty} a_n = \lim_{n\to\infty} b_n . \end{equation*}


Let \(x := \lim\, a_n = \lim\, b_n\text{.}\) Let \(\epsilon > 0\) be given. Find an \(M_1\) such that for all \(n \geq M_1\text{,}\) we have that \(\abs{a_n-x} < \epsilon\text{,}\) and an \(M_2\) such that for all \(n \geq M_2\text{,}\) we have \(\abs{b_n-x} < \epsilon\text{.}\) Set \(M := \max \{M_1, M_2 \}\text{.}\) Suppose \(n \geq M\text{.}\) In particular, \(x - a_n < \epsilon\text{,}\) or \(x - \epsilon < a_n\text{.}\) Similarly, \(b_n < x + \epsilon\text{.}\) Putting everything together, we find

\begin{equation*} x - \epsilon < a_n \leq x_n \leq b_n < x + \epsilon . \end{equation*}

In other words, \(-\epsilon < x_n-x < \epsilon\) or \(\abs{x_n-x} < \epsilon\text{.}\) So \(\{x_n\}\) converges to \(x\text{.}\) See Figure 2.3.

Figure 2.3. Squeeze lemma proof in picture.

Example 2.2.2.

One application of the squeeze lemma is to compute limits of sequences using limits that we already know. For example, consider the sequence \(\{ \frac{1}{n\sqrt{n}} \}\text{.}\) Since \(\sqrt{n} \geq 1\) for all \(n \in \N\text{,}\) we have

\begin{equation*} 0 \leq \frac{1}{n\sqrt{n}} \leq \frac{1}{n} \end{equation*}

for all \(n \in \N\text{.}\) We already know \(\lim \nicefrac{1}{n} = 0\text{.}\) Hence, using the constant sequence \(\{ 0 \}\) and the sequence \(\{ \nicefrac{1}{n} \}\) in the squeeze lemma, we conclude

\begin{equation*} \lim_{n\to\infty} \frac{1}{n\sqrt{n}} = 0 . \end{equation*}
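The squeeze bounds in this example are easy to observe numerically. The following Python snippet is only an illustration, not part of the argument:

```python
import math

# Check the squeeze 0 <= 1/(n*sqrt(n)) <= 1/n for several n;
# the terms visibly shrink toward 0 along with 1/n.
for n in [1, 10, 100, 10_000]:
    x = 1 / (n * math.sqrt(n))
    assert 0 <= x <= 1 / n
```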

Limits, when they exist, preserve non-strict inequalities.

Lemma 2.2.3.

Let \(\{ x_n \}\) and \(\{ y_n \}\) be convergent sequences and suppose

\begin{equation*} x_n \leq y_n \quad \text{for all } n \in \N . \end{equation*}

Then

\begin{equation*} \lim_{n\to\infty} x_n \leq \lim_{n\to\infty} y_n . \end{equation*}


Let \(x := \lim\, x_n\) and \(y := \lim\, y_n\text{.}\) Let \(\epsilon > 0\) be given. Find an \(M_1\) such that for all \(n \geq M_1\text{,}\) we have \(\abs{x_n-x} < \nicefrac{\epsilon}{2}\text{.}\) Find an \(M_2\) such that for all \(n \geq M_2\text{,}\) we have \(\abs{y_n-y} < \nicefrac{\epsilon}{2}\text{.}\) In particular, for some \(n \geq \max\{ M_1, M_2 \}\text{,}\) we have \(x-x_n < \nicefrac{\epsilon}{2}\) and \(y_n-y < \nicefrac{\epsilon}{2}\text{.}\) We add these inequalities to obtain

\begin{equation*} y_n-x_n+x-y < \epsilon, \qquad \text{or} \qquad y_n-x_n < y-x+ \epsilon . \end{equation*}

Since \(x_n \leq y_n\text{,}\) we have \(0 \leq y_n-x_n\) and hence \(0 < y-x+ \epsilon\text{.}\) In other words,

\begin{equation*} x-y < \epsilon . \end{equation*}

Because \(\epsilon > 0\) was arbitrary, we obtain \(x-y \leq 0\text{.}\) Therefore, \(x \leq y\text{.}\)

The next corollary follows by using constant sequences in Lemma 2.2.3. The proof is left as an exercise.

Corollary 2.2.4.

  1. If \(\{ x_n \}\) is a convergent sequence such that \(x_n \geq 0\) for all \(n \in \N\text{,}\) then

    \begin{equation*} \lim_{n\to\infty} x_n \geq 0 . \end{equation*}

  2. If \(a, b \in \R\) and \(\{ x_n \}\) is a convergent sequence such that \(a \leq x_n \leq b\) for all \(n \in \N\text{,}\) then

    \begin{equation*} a \leq \lim_{n\to\infty} x_n \leq b . \end{equation*}

In Lemma 2.2.3 and Corollary 2.2.4 we cannot simply replace all the non-strict inequalities with strict inequalities. For example, let \(x_n := \nicefrac{-1}{n}\) and \(y_n := \nicefrac{1}{n}\text{.}\) Then \(x_n < y_n\text{,}\) \(x_n < 0\text{,}\) and \(y_n > 0\) for all \(n\text{.}\) However, these inequalities are not preserved by the limit operation as \(\lim\, x_n = \lim\, y_n = 0\text{.}\) The moral of this example is that strict inequalities may become non-strict inequalities when limits are applied; if we know \(x_n < y_n\) for all \(n\text{,}\) we may only conclude

\begin{equation*} \lim_{n \to \infty} x_n \leq \lim_{n \to \infty} y_n . \end{equation*}

This issue is a common source of errors.

Subsection 2.2.2 Continuity of algebraic operations

Limits interact nicely with algebraic operations.

Proposition 2.2.5. Continuity of algebraic operations.

Let \(\{ x_n \}\) and \(\{ y_n \}\) be convergent sequences.

  i. The sequence \(\{ z_n \}\text{,}\) where \(z_n := x_n + y_n\text{,}\) converges and

    \begin{equation*} \lim_{n \to \infty} (x_n + y_n) = \left( \lim_{n \to \infty} x_n \right) + \left( \lim_{n \to \infty} y_n \right) . \end{equation*}

  ii. The sequence \(\{ z_n \}\text{,}\) where \(z_n := x_n - y_n\text{,}\) converges and

    \begin{equation*} \lim_{n \to \infty} (x_n - y_n) = \left( \lim_{n \to \infty} x_n \right) - \left( \lim_{n \to \infty} y_n \right) . \end{equation*}

  iii. The sequence \(\{ z_n \}\text{,}\) where \(z_n := x_n y_n\text{,}\) converges and

    \begin{equation*} \lim_{n \to \infty} (x_n y_n) = \left( \lim_{n \to \infty} x_n \right) \left( \lim_{n \to \infty} y_n \right) . \end{equation*}

  iv. If \(\lim\, y_n \not= 0\) and \(y_n \not= 0\) for all \(n \in \N\text{,}\) then the sequence \(\{ \nicefrac{x_n}{y_n} \}\) converges and

    \begin{equation*} \lim_{n \to \infty} \frac{x_n}{y_n} = \frac{\lim\, x_n}{\lim\, y_n} . \end{equation*}


We start with i. Suppose \(\{ x_n \}\) and \(\{ y_n \}\) are convergent sequences and write \(z_n := x_n + y_n\text{.}\) Let \(x := \lim\, x_n\text{,}\) \(y := \lim\, y_n\text{,}\) and \(z := x+y\text{.}\)

Let \(\epsilon > 0\) be given. Find an \(M_1\) such that for all \(n \geq M_1\text{,}\) we have \(\abs{x_n - x} < \nicefrac{\epsilon}{2}\text{.}\) Find an \(M_2\) such that for all \(n \geq M_2\text{,}\) we have \(\abs{y_n - y} < \nicefrac{\epsilon}{2}\text{.}\) Take \(M := \max \{ M_1, M_2 \}\text{.}\) For all \(n \geq M\text{,}\) we have

\begin{equation*} \begin{split} \abs{z_n - z} &= \abs{(x_n+y_n) - (x+y)} \\ & = \abs{x_n-x + y_n-y} \\ & \leq \abs{x_n-x} + \abs{y_n-y} \\ & < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon. \end{split} \end{equation*}

Therefore i is proved. Proof of ii is almost identical and is left as an exercise.

Let us tackle iii. Suppose again that \(\{ x_n \}\) and \(\{ y_n \}\) are convergent sequences and write \(z_n := x_n y_n\text{.}\) Let \(x := \lim\, x_n\text{,}\) \(y := \lim\, y_n\text{,}\) and \(z := xy\text{.}\)

Let \(\epsilon > 0\) be given. Let \(K := \max\{ \abs{x}, \abs{y}, \nicefrac{\epsilon}{3} , 1 \}\text{.}\) Find an \(M_1\) such that for all \(n \geq M_1\text{,}\) we have \(\abs{x_n - x} < \frac{\epsilon}{3K}\text{.}\) Find an \(M_2\) such that for all \(n \geq M_2\text{,}\) we have \(\abs{y_n - y} < \frac{\epsilon}{3K}\text{.}\) Take \(M := \max \{ M_1, M_2 \}\text{.}\) For all \(n \geq M\text{,}\) we have

\begin{equation*} \begin{split} \abs{z_n - z} &= \abs{(x_ny_n) - (xy)} \\ & = \abs{(x_n-x+x)(y_n-y+y) - xy} \\ & = \abs{(x_n-x)y + x(y_n-y) +(x_n-x)(y_n-y)} \\ & \leq \abs{(x_n-x)y} + \abs{x(y_n - y)} + \abs{(x_n-x)(y_n-y)} \\ & = \abs{x_n -x}\abs{y} + \abs{x}\abs{y_n -y} + \abs{x_n -x}\abs{y_n -y} \\ & < \frac{\epsilon}{3K} K + K \frac{\epsilon}{3K} + \frac{\epsilon}{3K} \frac{\epsilon}{3K} \qquad \qquad \text{(now notice that } \tfrac{\epsilon}{3K} \leq 1 \text{ and } K \geq 1\text{)} \\ & \leq \frac{\epsilon}{3} + \frac{\epsilon}{3} + \frac{\epsilon}{3} = \epsilon . \end{split} \end{equation*}

Finally, we examine iv. Instead of proving iv directly, we prove the following simpler claim:

Claim: If \(\{ y_n \}\) is a convergent sequence such that \(\lim\, y_n \not= 0\) and \(y_n \not= 0\) for all \(n \in \N\text{,}\) then \(\{ \nicefrac{1}{y_n} \}\) converges and

\begin{equation*} \lim_{n\to\infty} \frac{1}{y_n} = \frac{1}{\lim\, y_n} . \end{equation*}

Once the claim is proved, we take the sequence \(\{ \nicefrac{1}{y_n} \}\text{,}\) multiply it by the sequence \(\{ x_n \}\) and apply item iii.

Proof of claim: Let \(\epsilon > 0\) be given. Let \(y := \lim\, y_n\text{.}\) As \(\abs{y} \not= 0\text{,}\) then \(\min \left\{ \abs{y}^2\frac{\epsilon}{2}, \, \frac{\abs{y}}{2} \right\} > 0\text{.}\) Find an \(M\) such that for all \(n \geq M\text{,}\) we have

\begin{equation*} \abs{y_n - y} < \min \left\{ \abs{y}^2\frac{\epsilon}{2}, \, \frac{\abs{y}}{2} \right\} . \end{equation*}

For all \(n \geq M\text{,}\) we have \(\abs{y - y_n} < \nicefrac{\abs{y}}{2}\text{,}\) and so

\begin{equation*} \abs{y} = \abs{y - y_n + y_n } \leq \abs{y - y_n} + \abs{ y_n } < \frac{\abs{y}}{2} + \abs{y_n}. \end{equation*}

Subtracting \(\nicefrac{\abs{y}}{2}\) from both sides we obtain \(\nicefrac{\abs{y}}{2} < \abs{y_n}\text{,}\) or in other words,

\begin{equation*} \frac{1}{\abs{y_n}} < \frac{2}{\abs{y}} . \end{equation*}

We finish the proof of the claim:

\begin{equation*} \begin{split} \abs{\frac{1}{y_n} - \frac{1}{y}} &= \abs{\frac{y - y_n}{y y_n}} \\ & = \frac{\abs{y - y_n}}{\abs{y} \abs{y_n}} \\ & \leq \frac{\abs{y - y_n}}{\abs{y}} \, \frac{2}{\abs{y}} \\ & < \frac{\abs{y}^2 \frac{\epsilon}{2}}{\abs{y}} \, \frac{2}{\abs{y}} = \epsilon . \end{split} \end{equation*}

And we are done.
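As a quick numerical illustration of the claim (a sketch, not part of the proof), take \(y_n = 2 + \nicefrac{1}{n}\text{,}\) which converges to 2, so \(\nicefrac{1}{y_n}\) should approach \(\nicefrac{1}{2}\text{:}\)

```python
# y_n = 2 + 1/n -> 2, so 1/y_n should be close to 1/2 for large n.
n = 10**6
y_n = 2 + 1 / n
assert abs(1 / y_n - 1 / 2) < 1e-6
```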

By plugging in constant sequences, we get several easy corollaries. If \(c \in \R\) and \(\{ x_n \}\) is a convergent sequence, then for example

\begin{equation*} \lim_{n \to \infty} c x_n = c \left( \lim_{n \to \infty} x_n \right) \qquad \text{and} \qquad \lim_{n \to \infty} (c + x_n) = c + \lim_{n \to \infty} x_n . \end{equation*}

Similarly, we find such equalities for constant subtraction and division.

As we can take limits past multiplication, we can show (exercise) that \(\lim\, x_n^k = {(\lim\, x_n)}^k\) for all \(k \in \N\text{.}\) That is, we can take limits past powers. Let us see if we can do the same with roots.

Proposition 2.2.6.

Let \(\{ x_n \}\) be a convergent sequence such that \(x_n \geq 0\) for all \(n \in \N\text{.}\) Then \(\{ \sqrt{x_n} \}\) converges and

\begin{equation*} \lim_{n\to\infty} \sqrt{x_n} = \sqrt{ \lim_{n\to\infty} x_n } . \end{equation*}

Of course, to even make this statement, we need to apply Corollary 2.2.4 to show that \(\lim\, x_n \geq 0\text{,}\) so that we can take the square root without worry.


Let \(\{ x_n \}\) be a convergent sequence and let \(x := \lim\, x_n\text{.}\) As we just mentioned, \(x \geq 0\text{.}\)

First suppose \(x=0\text{.}\) Let \(\epsilon > 0\) be given. Then there is an \(M\) such that for all \(n \geq M\text{,}\) we have \(x_n = \abs{x_n} < \epsilon^2\text{,}\) or in other words, \(\sqrt{x_n} < \epsilon\text{.}\) Hence,

\begin{equation*} \abs{\sqrt{x_n} - \sqrt{x}} = \sqrt{x_n} < \epsilon. \end{equation*}

Now suppose \(x > 0\) (and hence \(\sqrt{x} > 0\)).

\begin{equation*} \begin{split} \abs{\sqrt{x_n}-\sqrt{x}} &= \abs{\frac{x_n-x}{\sqrt{x_n}+\sqrt{x}}} \\ &= \frac{1}{\sqrt{x_n}+\sqrt{x}} \abs{x_n-x} \\ & \leq \frac{1}{\sqrt{x}} \abs{x_n-x} . \end{split} \end{equation*}

We leave the rest of the proof to the reader.

A similar proof works for the \(k\)th root. That is, we also obtain \(\lim\, x_n^{1/k} = {( \lim\, x_n )}^{1/k}\text{.}\) We leave this to the reader as a challenging exercise.

We may also want to take the limit past the absolute value sign.

Proposition 2.2.7. Continuity of absolute value.

If \(\{ x_n \}\) is a convergent sequence, then \(\{ \abs{x_n} \}\) converges and

\begin{equation*} \lim_{n\to\infty} \abs{x_n} = \abs{ \lim_{n\to\infty} x_n } . \end{equation*}

The converse of this proposition is not true, see Exercise 2.1.7 part b).


We simply note the reverse triangle inequality

\begin{equation*} \big\lvert \abs{x_n} - \abs{x} \big\rvert \leq \abs{x_n-x} . \end{equation*}

Hence if \(\abs{x_n -x}\) can be made arbitrarily small, so can \(\big\lvert \abs{x_n} - \abs{x} \big\rvert\text{.}\) Details are left to the reader.

Let us see an example putting the propositions above together. Since \(\lim \nicefrac{1}{n} = 0\text{,}\) then

\begin{equation*} \lim_{n\to \infty} \abs{\sqrt{1 + \nicefrac{1}{n}} - \nicefrac{100}{n^2}} = \abs{\sqrt{1 + (\lim \nicefrac{1}{n})} - 100 (\lim \nicefrac{1}{n})(\lim \nicefrac{1}{n})} = 1. \end{equation*}

That is, the limit on the left-hand side exists because the right-hand side exists. You really should read the equality above from right to left.

On the other hand you must apply the propositions carefully. For example, by rewriting the expression with common denominator first we find

\begin{equation*} \lim_{n\to \infty} \left( \frac{n^2}{n+1} - n \right) = -1 . \end{equation*}

However, \(\bigl\{ \frac{n^2}{n+1} \bigr\}\) and \(\{n\}\) are not convergent, so \(\bigl(\lim\, \frac{n^2}{n+1}\bigr) - \bigl(\lim\, n\bigr)\) is nonsense.
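Numerically, the combined expression does settle near \(-1\) even though each piece diverges. A floating-point sketch (an illustration, not a proof):

```python
# n^2/(n+1) - n simplifies to -n/(n+1), which tends to -1,
# even though n^2/(n+1) and n are each unbounded.
n = 10**6
assert abs(n**2 / (n + 1) - n - (-1)) < 1e-3
```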

Subsection 2.2.3 Recursively defined sequences

Now that we know we can interchange limits and algebraic operations, we can compute the limits of many sequences. One such class consists of recursively defined sequences, that is, sequences where the next number in the sequence is computed using a formula involving a fixed number of preceding elements of the sequence.

Example 2.2.8.

Let \(\{ x_n \}\) be defined by \(x_1 := 2\) and

\begin{equation*} x_{n+1} := x_n - \frac{x_n^2-2}{2x_n} . \end{equation*}

We must first find out if this sequence is well-defined; we must show we never divide by zero. Then we must find out if the sequence converges. Only then can we attempt to find the limit.

So let us prove \(x_n\) exists and \(x_n > 0\) for all \(n\) (so the sequence is well-defined and bounded below). Let us show this by induction. We know that \(x_1 = 2 > 0\text{.}\) For the induction step, suppose \(x_n > 0\text{.}\) Then

\begin{equation*} x_{n+1} = x_n - \frac{x_n^2-2}{2x_n} = \frac{2x_n^2 - x_n^2+2}{2x_n} = \frac{x_n^2+2}{2x_n} . \end{equation*}

It is always true that \(x_n^2+2 > 0\text{,}\) and as \(x_n > 0\text{,}\) then \(\frac{x_n^2+2}{2x_n} > 0\) and hence \(x_{n+1} > 0\text{.}\)

Next let us show that the sequence is monotone decreasing. If we show that \(x_n^2-2 \geq 0\) for all \(n\text{,}\) then \(x_{n+1} \leq x_n\) for all \(n\text{.}\) Obviously \(x_1^2-2 = 4-2 = 2 > 0\text{.}\) For an arbitrary \(n\text{,}\) we have

\begin{equation*} x_{n+1}^2-2 = {\left( \frac{x_n^2+2}{2x_n} \right)}^2 - 2 = \frac{x_n^4+4x_n^2+4 - 8x_n^2}{4x_n^2} = \frac{x_n^4-4x_n^2+4}{4x_n^2} = \frac{{\left( x_n^2-2 \right)}^2}{4x_n^2} . \end{equation*}

Since squares are nonnegative, \(x_{n+1}^2-2 \geq 0\) for all \(n\text{.}\) Therefore, \(\{ x_n \}\) is monotone decreasing and bounded (\(x_n > 0\) for all \(n\)), and so the limit exists. It remains to find the limit.


To find the limit, multiply both sides of the equation \(x_{n+1} = \frac{x_n^2+2}{2x_n}\) by \(2x_n\) to obtain
\begin{equation*} 2x_nx_{n+1} = x_n^2+2 . \end{equation*}

Since \(\{ x_{n+1} \}\) is the 1-tail of \(\{ x_n \}\text{,}\) it converges to the same limit. Let us define \(x := \lim\, x_n\text{.}\) Take the limit of both sides to obtain

\begin{equation*} 2x^2 = x^2+2 , \end{equation*}

or \(x^2 = 2\text{.}\) As \(x_n > 0\) for all \(n\) we get \(x \geq 0\text{,}\) and therefore \(x = \sqrt{2}\text{.}\)

You may have seen the sequence above before. It is Newton's method [1] for finding the square root of 2. This method comes up often in practice and converges very rapidly. We used the fact that \(x_1^2 - 2 > 0\text{,}\) although it was not strictly needed to show convergence; we could instead have considered a tail of the sequence. The sequence converges as long as \(x_1 \not= 0\text{,}\) although with a negative \(x_1\) we would arrive at \(x=-\sqrt{2}\text{.}\) By replacing the 2 in the numerator we obtain the square root of any positive number. These statements are left as an exercise.
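The rapid convergence is easy to observe numerically. A Python sketch of the recursion from Example 2.2.8 (for illustration only):

```python
import math

x = 2.0  # x_1 = 2
for _ in range(6):
    # x_{n+1} = x_n - (x_n^2 - 2)/(2 x_n)
    x = x - (x**2 - 2) / (2 * x)

# A handful of steps already agrees with sqrt(2) to machine precision.
assert abs(x - math.sqrt(2)) < 1e-12
```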

You should, however, be careful. Before taking any limits, you must make sure the sequence converges. Let us see an example.

Example 2.2.9.

Suppose \(x_1 := 1\) and \(x_{n+1} := x_n^2+x_n\text{.}\) If we blindly assumed that the limit exists (call it \(x\)), then we would get the equation \(x = x^2+x\text{,}\) from which we might conclude \(x=0\text{.}\) However, it is not hard to show that \(\{ x_n \}\) is unbounded and therefore does not converge.

The thing to notice in this example is that the method still works, but it depends on the initial value \(x_1\text{.}\) If we set \(x_1 := 0\text{,}\) then the sequence converges and the limit really is 0. An entire branch of mathematics, called dynamics, deals precisely with these issues. See Exercise 2.2.14.
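The dependence on the initial value can also be seen numerically. In this sketch the choice \(x_1 = -\nicefrac{1}{2}\) anticipates Exercise 2.2.14 (it lies in \([-1,0]\)):

```python
# With x_1 = 1 the recursion x_{n+1} = x_n^2 + x_n blows up quickly ...
x = 1.0
for _ in range(10):
    x = x**2 + x
assert x > 1e6  # unbounded growth

# ... while with x_1 = -1/2 the terms creep toward 0 from below.
y = -0.5
for _ in range(100):
    y = y**2 + y
assert -0.02 < y < 0
```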

Subsection 2.2.4 Some convergence tests

It is not always necessary to go back to the definition of convergence to prove that a sequence is convergent. We first give a simple convergence test. The main idea is that \(\{ x_n \}\) converges to \(x\) if and only if \(\{ \abs{ x_n - x } \}\) converges to zero.

Proposition 2.2.10.

Let \(\{ x_n \}\) be a sequence and \(x \in \R\text{.}\) Suppose there exists a sequence \(\{ a_n \}\) such that \(\lim\, a_n = 0\) and

\begin{equation*} \abs{x_n - x} \leq a_n \quad \text{for all } n \in \N . \end{equation*}

Then \(\{ x_n \}\) converges and \(\lim\, x_n = x\text{.}\)


Let \(\epsilon > 0\) be given. Note that \(a_n \geq 0\) for all \(n\text{.}\) Find an \(M \in \N\) such that for all \(n \geq M\text{,}\) we have \(a_n = \abs{a_n - 0} < \epsilon\text{.}\) Then, for all \(n \geq M\text{,}\) we have

\begin{equation*} \abs{x_n - x} \leq a_n < \epsilon . \qedhere \end{equation*}

As the proposition shows, studying when a sequence has a limit is the same as studying when another sequence goes to zero. In general, it may be hard to decide whether a sequence converges, but for certain sequences there exist easy-to-apply tests that answer the question. Let us see one such test. First, we compute the limit of one specific sequence.

Proposition 2.2.11.

Let \(c > 0\text{.}\)

  1. If \(c < 1\text{,}\) then

    \begin{equation*} \lim_{n\to\infty} c^n = 0 . \end{equation*}

  2. If \(c > 1\text{,}\) then \(\{ c^n \}\) is unbounded.


First consider \(c < 1\text{.}\) As \(c > 0\text{,}\) then \(c^n > 0\) for all \(n \in \N\) by induction. As \(c < 1\text{,}\) then \(c^{n+1} < c^n\) for all \(n\text{.}\) So \(\{ c^n \}\) is a decreasing sequence that is bounded below. Hence, it is convergent. Let \(L := \lim\, c^n\text{.}\) The 1-tail \(\{ c^{n+1} \}\) also converges to \(L\text{.}\) Taking the limit of both sides of \(c^{n+1} = c \cdot c^n\text{,}\) we obtain \(L = cL\text{,}\) or \((1-c)L=0\text{.}\) It follows that \(L=0\) as \(c \not= 1\text{.}\)

Now consider \(c > 1\text{.}\) Let \(B > 0\) be arbitrary. As \(\nicefrac{1}{c} < 1\text{,}\) then \(\bigl\{ {(\nicefrac{1}{c})}^n \bigr\}\) converges to \(0\text{.}\) Hence for some large enough \(n\text{,}\) we get

\begin{equation*} \frac{1}{c^n} = {\left(\frac{1}{c}\right)}^n < \frac{1}{B} . \end{equation*}

In other words, \(c^n > B\text{,}\) and \(B\) is not an upper bound for \(\{ c^n \}\text{.}\) As \(B\) was arbitrary, \(\{ c^n \}\) is unbounded.
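Both behaviors are visible numerically (a sketch, not part of the proof):

```python
# 0 < c < 1: the powers collapse toward 0.
assert 0.9**500 < 1e-20

# c > 1: the powers eventually exceed any bound B.
assert 1.1**500 > 1e20
```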

In the proposition above, the ratio of the \((n+1)\)th term and the \(n\)th term is \(c\text{.}\) We generalize this simple result to a larger class of sequences. The following lemma will come up again once we get to series.

Lemma 2.2.12. Ratio test for sequences.

Let \(\{ x_n \}\) be a sequence such that \(x_n \not= 0\) for all \(n\) and such that the limit

\begin{equation*} L := \lim_{n\to\infty} \frac{\abs{x_{n+1}}}{\abs{x_n}} \end{equation*}

exists.

  1. If \(L < 1\text{,}\) then \(\{ x_n \}\) converges and \(\lim\, x_n = 0\text{.}\)

  2. If \(L > 1\text{,}\) then \(\{ x_n \}\) is unbounded (hence diverges).

If \(L\) exists, but \(L=1\text{,}\) the lemma says nothing. We cannot make any conclusion based on that information alone. For example, the sequence \(\{ \nicefrac{1}{n} \}\) converges to zero, but \(L=1\text{.}\) The constant sequence \(\{ 1 \}\) converges to 1, not zero, and \(L=1\text{.}\) The sequence \(\{ {(-1)}^n \}\) does not converge at all, and \(L=1\) as well. Finally, the sequence \(\{ n \}\) is unbounded, yet again \(L=1\text{.}\) The statement of the lemma may be strengthened somewhat, see Exercises 2.2.13 and 2.3.15.


Suppose \(L < 1\text{.}\) As \(\frac{\abs{x_{n+1}}}{\abs{x_n}} \geq 0\) for all \(n\text{,}\) then \(L \geq 0\text{.}\) Pick \(r\) such that \(L < r < 1\text{.}\) We wish to compare the sequence \(\{ x_n \}\) to the sequence \(\{ r^n \}\text{.}\) The idea is that while the ratio \(\frac{\abs{x_{n+1}}}{\abs{x_n}}\) is not going to be less than \(L\) eventually, it will eventually be less than \(r\text{,}\) which is still less than 1. The intuitive idea of the proof is illustrated in Figure 2.4.

Figure 2.4. Proof of ratio test in picture. The short lines represent the ratios \(\frac{\abs{x_{n+1}}}{\abs{x_n}}\) approaching \(L\text{.}\)

As \(r-L > 0\text{,}\) there exists an \(M \in \N\) such that for all \(n \geq M\text{,}\) we have

\begin{equation*} \abs{\frac{\abs{x_{n+1}}}{\abs{x_n}} - L} < r-L . \end{equation*}

Therefore, for \(n \geq M\text{,}\)

\begin{equation*} \frac{\abs{x_{n+1}}}{\abs{x_n}} - L < r-L \qquad \text{or} \qquad \frac{\abs{x_{n+1}}}{\abs{x_n}} < r . \end{equation*}

For \(n > M\) (that is for \(n \geq M+1\)) write

\begin{equation*} \abs{x_n} = \abs{x_M} \frac{\abs{x_{M+1}}}{\abs{x_{M}}} \frac{\abs{x_{M+2}}}{\abs{x_{M+1}}} \cdots \frac{\abs{x_{n}}}{\abs{x_{n-1}}} < \abs{x_M} r r \cdots r = \abs{x_M} r^{n-M} = (\abs{x_M} r^{-M}) r^n . \end{equation*}

The sequence \(\{ r^n \}\) converges to zero and hence \(\abs{x_M} r^{-M} r^n\) converges to zero. By Proposition 2.2.10, the \(M\)-tail of \(\{x_n\}\) converges to zero and therefore \(\{x_n\}\) converges to zero.

Now suppose \(L > 1\text{.}\) Pick \(r\) such that \(1 < r < L\text{.}\) As \(L-r > 0\text{,}\) there exists an \(M \in \N\) such that for all \(n \geq M\)

\begin{equation*} \abs{\frac{\abs{x_{n+1}}}{\abs{x_n}} - L} < L-r . \end{equation*}

Therefore, for \(n \geq M\text{,}\)


\begin{equation*} \frac{\abs{x_{n+1}}}{\abs{x_n}} > r . \end{equation*}

Again for \(n > M\text{,}\) write

\begin{equation*} \abs{x_n} = \abs{x_M} \frac{\abs{x_{M+1}}}{\abs{x_{M}}} \frac{\abs{x_{M+2}}}{\abs{x_{M+1}}} \cdots \frac{\abs{x_{n}}}{\abs{x_{n-1}}} > \abs{x_M} r r \cdots r = \abs{x_M} r^{n-M} = (\abs{x_M} r^{-M}) r^n . \end{equation*}

The sequence \(\{ r^n \}\) is unbounded (since \(r > 1\)), and so \(\{x_n\}\) cannot be bounded (if \(\abs{x_n} \leq B\) for all \(n\text{,}\) then \(r^n < \frac{B}{\abs{x_M}} r^{\:M}\) for all \(n > M\text{,}\) which is impossible). Consequently, \(\{ x_n \}\) cannot converge.

Example 2.2.13.

A simple application of the lemma above is to prove

\begin{equation*} \lim_{n\to\infty} \frac{2^n}{n!} = 0 . \end{equation*}

Proof: Compute

\begin{equation*} \frac{2^{n+1} / (n+1)!}{2^n/n!} = \frac{2^{n+1}}{2^n}\frac{n!}{(n+1)!} = \frac{2}{n+1} . \end{equation*}

It is not hard to see that \(\bigl\{ \frac{2}{n+1} \bigr\}\) converges to zero. The conclusion follows by the lemma.
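A numerical sketch of this example (an illustration only):

```python
import math

def term(n):
    """The nth term 2^n / n! of the sequence from Example 2.2.13."""
    return 2**n / math.factorial(n)

# The ratio of consecutive terms is exactly 2/(n+1), which tends to 0.
for n in range(1, 10):
    assert abs(term(n + 1) / term(n) - 2 / (n + 1)) < 1e-12

# And the terms themselves rush to zero.
assert term(50) < 1e-40
```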

Example 2.2.14.

A more complicated (and useful) application of the ratio test is to prove

\begin{equation*} \lim_{n\to\infty} n^{1/n} = 1 . \end{equation*}

Proof: Let \(\epsilon > 0\) be given. Consider the sequence \(\bigl\{ \frac{n}{{(1+\epsilon)}^n} \bigr\}\text{.}\) Compute

\begin{equation*} \frac{(n+1)/{(1+\epsilon)}^{n+1}}{n/{(1+\epsilon)}^{n}} = \frac{n+1}{n} \frac{1}{1+\epsilon} . \end{equation*}

The limit of \(\frac{n+1}{n} = 1+\frac{1}{n}\) as \(n \to \infty\) is 1, and so

\begin{equation*} \lim_{n\to \infty} \frac{(n+1)/{(1+\epsilon)}^{n+1}}{n/{(1+\epsilon)}^{n}} = \frac{1}{1+\epsilon} < 1 . \end{equation*}

Therefore, \(\bigl\{ \frac{n}{{(1+\epsilon)}^n} \bigr\}\) converges to 0. In particular, there exists an \(M\) such that for \(n \geq M\text{,}\) we have \(\frac{n}{{(1+\epsilon)}^n} < 1\text{,}\) or \(n < {(1+\epsilon)}^n\text{,}\) or \(n^{1/n} < 1+\epsilon\text{.}\) As \(n \geq 1\text{,}\) then \(n^{1/n} \geq 1\text{,}\) and so \(0 \leq n^{1/n}-1 < \epsilon\text{.}\) Consequently, \(\lim\, n^{1/n} = 1\text{.}\)
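Numerically, the convergence of \(n^{1/n}\) is quite slow, as this sketch shows (not part of the proof):

```python
# n^(1/n) = exp(ln(n)/n) stays >= 1 and drifts toward 1 very slowly:
# even at n = 10^6 it is still about 1.0000138.
n = 10**6
r = n ** (1 / n)
assert 1 <= r < 1 + 1e-4
```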

Subsection 2.2.5 Exercises

Exercise 2.2.3.

Prove that if \(\{ x_n \}\) is a convergent sequence, \(k \in \N\text{,}\) then

\begin{equation*} \lim_{n\to\infty} x_n^k = {\left( \lim_{n\to\infty} x_n \right)}^k . \end{equation*}

Hint: Use induction.

Exercise 2.2.4.

Suppose \(x_1 := \frac{1}{2}\) and \(x_{n+1} := x_n^2\text{.}\) Show that \(\{ x_n \}\) converges and find \(\lim\, x_n\text{.}\) Hint: You cannot divide by zero!

Exercise 2.2.5.

Let \(x_n := \frac{n-\cos(n)}{n}\text{.}\) Use the squeeze lemma to show that \(\{ x_n \}\) converges and find the limit.

Exercise 2.2.6.

Let \(x_n := \frac{1}{n^2}\) and \(y_n := \frac{1}{n}\text{.}\) Define \(z_n := \frac{x_n}{y_n}\) and \(w_n := \frac{y_n}{x_n}\text{.}\) Do \(\{ z_n \}\) and \(\{ w_n \}\) converge? What are the limits? Can you apply Proposition 2.2.5? Why or why not?

Exercise 2.2.7.

True or false, prove or find a counterexample. If \(\{ x_n \}\) is a sequence such that \(\{ x_n^2 \}\) converges, then \(\{ x_n \}\) converges.

Exercise 2.2.8.

Show that

\begin{equation*} \lim_{n\to\infty} \frac{n^2}{2^n} = 0 . \end{equation*}

Exercise 2.2.9.

Suppose \(\{ x_n \}\) is a sequence and suppose for some \(x \in \R\text{,}\) the limit

\begin{equation*} L := \lim_{n \to \infty} \frac{\abs{x_{n+1}-x}}{\abs{x_n-x}} \end{equation*}

exists and \(L < 1\text{.}\) Show that \(\{ x_n \}\) converges to \(x\text{.}\)

Exercise 2.2.10.

(Challenging)   Let \(\{ x_n \}\) be a convergent sequence such that \(x_n \geq 0\) for all \(n\text{,}\) and let \(k \in \N\text{.}\) Prove that

\begin{equation*} \lim_{n\to\infty} x_n^{1/k} = {\left( \lim_{n\to\infty} x_n \right)}^{1/k} . \end{equation*}

Hint: Find an expression \(q\) such that \(\frac{x_n^{1/k}-x^{1/k}}{x_n-x} = \frac{1}{q}\text{.}\)

Exercise 2.2.11.

Let \(r > 0\text{.}\) Show that starting with an arbitrary \(x_1 \not= 0\text{,}\) the sequence defined by

\begin{equation*} x_{n+1} := x_n - \frac{x_n^2-r}{2x_n} \end{equation*}

converges to \(\sqrt{r}\) if \(x_1 > 0\) and \(-\sqrt{r}\) if \(x_1 < 0\text{.}\)

Exercise 2.2.12.

  1. Suppose \(\{ a_n \}\) is a bounded sequence and \(\{ b_n \}\) is a sequence converging to 0. Show that \(\{ a_n b_n \}\) converges to 0.

  2. Find an example where \(\{ a_n \}\) is unbounded, \(\{ b_n \}\) converges to 0, and \(\{ a_n b_n \}\) is not convergent.

  3. Find an example where \(\{ a_n \}\) is bounded, \(\{ b_n \}\) converges to some \(x \not= 0\text{,}\) and \(\{ a_n b_n \}\) is not convergent.

Exercise 2.2.13.

(Easy)   Prove the following stronger version of Lemma 2.2.12, the ratio test. Suppose \(\{ x_n \}\) is a sequence such that \(x_n \not= 0\) for all \(n\text{.}\)

  1. Prove that if there exists an \(r < 1\) and \(M \in \N\) such that for all \(n \geq M\text{,}\) we have

    \begin{equation*} \frac{\abs{x_{n+1}}}{\abs{x_n}} \leq r , \end{equation*}

    then \(\{ x_n \}\) converges to \(0\text{.}\)

  2. Prove that if there exists an \(r > 1\) and \(M \in \N\) such that for all \(n \geq M\text{,}\) we have

    \begin{equation*} \frac{\abs{x_{n+1}}}{\abs{x_n}} \geq r , \end{equation*}

    then \(\{ x_n \}\) is unbounded.

Exercise 2.2.14.

Suppose \(x_1 := c\) and \(x_{n+1} := x_n^2+x_n\text{.}\) Show that \(\{ x_n \}\) converges if and only if \(-1 \leq c \leq 0\text{,}\) in which case it converges to 0.

Exercise 2.2.15.

Prove \(\lim\limits_{n \to \infty} {(n^2+1)}^{1/n} = 1\text{.}\)

Exercise 2.2.16.

Prove that \(\bigl\{ {(n!)}^{1/n} \bigr\}\) is unbounded. Hint: Show that \(\left\{ \frac{C^n}{n!} \right\}\) converges to zero for all \(C > 0\text{.}\)

[1] Named after the English physicist and mathematician Isaac Newton (1642–1726/7).