Section 2.2 Facts about limits of sequences
Note: 2–2.5 lectures, recursively defined sequences can safely be skipped
In this section we go over some basic results about the limits of sequences. We start by looking at how sequences interact with inequalities.
Subsection 2.2.1 Limits and inequalities
A basic lemma about limits and inequalities is the so-called squeeze lemma. It allows us to show convergence of sequences in difficult cases if we can find two other simpler convergent sequences that “squeeze” the original sequence.
Lemma 2.2.1. Squeeze lemma.
Let \(\{ a_n \}\text{,}\) \(\{ b_n \}\text{,}\) and \(\{ x_n \}\) be sequences such that
\begin{equation*} a_n \leq x_n \leq b_n \quad \text{for all } n \in \N . \end{equation*}
Suppose \(\{ a_n \}\) and \(\{ b_n \}\) converge and
\begin{equation*} \lim_{n\to\infty} a_n = \lim_{n\to\infty} b_n . \end{equation*}
Then \(\{ x_n \}\) converges and
\begin{equation*} \lim_{n\to\infty} x_n = \lim_{n\to\infty} a_n = \lim_{n\to\infty} b_n . \end{equation*}
Proof.
Let \(x := \lim\, a_n = \lim\, b_n\text{.}\) Let \(\epsilon > 0\) be given. Find an \(M_1\) such that for all \(n \geq M_1\text{,}\) we have that \(\abs{a_n - x} < \epsilon\text{,}\) and an \(M_2\) such that for all \(n \geq M_2\text{,}\) we have \(\abs{b_n - x} < \epsilon\text{.}\) Set \(M := \max \{M_1, M_2 \}\text{.}\) Suppose \(n \geq M\text{.}\) In particular, \(x - a_n < \epsilon\text{,}\) or \(x - \epsilon < a_n\text{.}\) Similarly, \(b_n < x + \epsilon\text{.}\) Putting everything together, we find
\begin{equation*} x - \epsilon < a_n \leq x_n \leq b_n < x + \epsilon . \end{equation*}
In other words, \(-\epsilon < x_n - x < \epsilon\) or \(\abs{x_n - x} < \epsilon\text{.}\) So \(\{x_n\}\) converges to \(x\text{.}\) See Figure 2.3.
Example 2.2.2.
One application of the squeeze lemma is to compute limits of sequences using limits that we already know. For example, consider the sequence \(\{ \frac{1}{n\sqrt{n}} \}\text{.}\) Since \(\sqrt{n} \geq 1\) for all \(n \in \N\text{,}\) we have
\begin{equation*} 0 \leq \frac{1}{n\sqrt{n}} \leq \frac{1}{n} \end{equation*}
for all \(n \in \N\text{.}\) We already know \(\lim \nicefrac{1}{n} = 0\text{.}\) Hence, using the constant sequence \(\{ 0 \}\) and the sequence \(\{ \nicefrac{1}{n} \}\) in the squeeze lemma, we conclude
\begin{equation*} \lim_{n\to\infty} \frac{1}{n\sqrt{n}} = 0 . \end{equation*}
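The text contains no code, but as a numerical sanity check (a Python sketch, not part of the original), one can verify the squeeze inequality \(0 \leq \frac{1}{n\sqrt{n}} \leq \frac{1}{n}\) and watch the middle sequence shrink:

```python
import math

# Squeeze lemma, numerically: 0 <= 1/(n*sqrt(n)) <= 1/n for every n,
# and both outer sequences tend to 0, so the middle one must as well.
def x(n):
    return 1 / (n * math.sqrt(n))

# The squeeze inequality holds for every n we test...
squeezed = all(0 <= x(n) <= 1 / n for n in range(1, 10_001))

# ...and the middle sequence is already tiny far out in the tail.
tail = x(10_000)  # exactly 1/10^6
```

Of course, no finite computation proves the limit; the lemma does. The check merely illustrates the inequality the proof rests on.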
Limits, when they exist, preserve nonstrict inequalities.
Lemma 2.2.3.
Let \(\{ x_n \}\) and \(\{ y_n \}\) be convergent sequences and
\begin{equation*} x_n \leq y_n \end{equation*}
for all \(n \in \N\text{.}\) Then
\begin{equation*} \lim_{n\to\infty} x_n \leq \lim_{n\to\infty} y_n . \end{equation*}
Proof.
Let \(x := \lim\, x_n\) and \(y := \lim\, y_n\text{.}\) Let \(\epsilon > 0\) be given. Find an \(M_1\) such that for all \(n \geq M_1\text{,}\) we have \(\abs{x_n - x} < \nicefrac{\epsilon}{2}\text{.}\) Find an \(M_2\) such that for all \(n \geq M_2\text{,}\) we have \(\abs{y_n - y} < \nicefrac{\epsilon}{2}\text{.}\) In particular, for some \(n \geq \max\{ M_1, M_2 \}\text{,}\) we have \(x - x_n < \nicefrac{\epsilon}{2}\) and \(y_n - y < \nicefrac{\epsilon}{2}\text{.}\) We add these inequalities to obtain
\begin{equation*} (x - x_n) + (y_n - y) < \epsilon , \qquad \text{or} \qquad y_n - x_n < y - x + \epsilon . \end{equation*}
Since \(x_n \leq y_n\text{,}\) we have \(0 \leq y_n - x_n\) and hence \(0 < y - x + \epsilon\text{.}\) In other words,
\begin{equation*} x - y < \epsilon . \end{equation*}
Because \(\epsilon > 0\) was arbitrary, we obtain \(x - y \leq 0\text{.}\) Therefore, \(x \leq y\text{.}\)
The next corollary follows by using constant sequences in Lemma 2.2.3. The proof is left as an exercise.
Corollary 2.2.4.

i. If \(\{ x_n \}\) is a convergent sequence such that \(x_n \geq 0\text{,}\) then
\begin{equation*} \lim_{n\to\infty} x_n \geq 0. \end{equation*}
ii. Let \(a,b \in \R\) and let \(\{ x_n \}\) be a convergent sequence such that
\begin{equation*} a \leq x_n \leq b , \end{equation*}
for all \(n \in \N\text{.}\) Then
\begin{equation*} a \leq \lim_{n\to\infty} x_n \leq b. \end{equation*}
In Lemma 2.2.3 and Corollary 2.2.4 we cannot simply replace all the nonstrict inequalities with strict inequalities. For example, let \(x_n := -\nicefrac{1}{n}\) and \(y_n := \nicefrac{1}{n}\text{.}\) Then \(x_n < y_n\text{,}\) \(x_n < 0\text{,}\) and \(y_n > 0\) for all \(n\text{.}\) However, these inequalities are not preserved by the limit operation as \(\lim\, x_n = \lim\, y_n = 0\text{.}\) The moral of this example is that strict inequalities may become nonstrict inequalities when limits are applied; if we know \(x_n < y_n\) for all \(n\text{,}\) we may only conclude
\begin{equation*} \lim_{n \to \infty} x_n \leq \lim_{n \to \infty} y_n . \end{equation*}
This issue is a common source of errors.
Subsection 2.2.2 Continuity of algebraic operations
Limits interact nicely with algebraic operations.
Proposition 2.2.5.
Let \(\{ x_n \}\) and \(\{ y_n \}\) be convergent sequences.

i. The sequence \(\{ z_n \}\text{,}\) where \(z_n := x_n + y_n\text{,}\) converges and
\begin{equation*} \lim_{n \to \infty} (x_n + y_n) = \lim_{n \to \infty} z_n = \lim_{n \to \infty} x_n + \lim_{n \to \infty} y_n . \end{equation*}
ii. The sequence \(\{ z_n \}\text{,}\) where \(z_n := x_n - y_n\text{,}\) converges and
\begin{equation*} \lim_{n \to \infty} (x_n - y_n) = \lim_{n \to \infty} z_n = \lim_{n \to \infty} x_n - \lim_{n \to \infty} y_n . \end{equation*}
iii. The sequence \(\{ z_n \}\text{,}\) where \(z_n := x_n y_n\text{,}\) converges and
\begin{equation*} \lim_{n \to \infty} (x_n y_n) = \lim_{n \to \infty} z_n = \left( \lim_{n \to \infty} x_n \right) \left( \lim_{n \to \infty} y_n \right) . \end{equation*}
iv. If \(\lim\, y_n \not= 0\) and \(y_n \not= 0\) for all \(n \in \N\text{,}\) then the sequence \(\{ z_n \}\text{,}\) where \(z_n := \dfrac{x_n}{y_n}\text{,}\) converges and
\begin{equation*} \lim_{n \to \infty} \frac{x_n}{y_n} = \lim_{n \to \infty} z_n = \frac{\lim\, x_n}{\lim\, y_n} . \end{equation*}
Proof.
We start with i. Suppose \(\{ x_n \}\) and \(\{ y_n \}\) are convergent sequences and write \(z_n := x_n + y_n\text{.}\) Let \(x := \lim\, x_n\text{,}\) \(y := \lim\, y_n\text{,}\) and \(z := x+y\text{.}\)
Let \(\epsilon > 0\) be given. Find an \(M_1\) such that for all \(n \geq M_1\text{,}\) we have \(\abs{x_n - x} < \nicefrac{\epsilon}{2}\text{.}\) Find an \(M_2\) such that for all \(n \geq M_2\text{,}\) we have \(\abs{y_n - y} < \nicefrac{\epsilon}{2}\text{.}\) Take \(M := \max \{ M_1, M_2 \}\text{.}\) For all \(n \geq M\text{,}\) we have
\begin{equation*} \abs{z_n - z} = \abs{(x_n + y_n) - (x+y)} \leq \abs{x_n - x} + \abs{y_n - y} < \frac{\epsilon}{2} + \frac{\epsilon}{2} = \epsilon . \end{equation*}
Therefore i is proved. Proof of ii is almost identical and is left as an exercise.
Let us tackle iii. Suppose again that \(\{ x_n \}\) and \(\{ y_n \}\) are convergent sequences and write \(z_n := x_n y_n\text{.}\) Let \(x := \lim\, x_n\text{,}\) \(y := \lim\, y_n\text{,}\) and \(z := xy\text{.}\)
Let \(\epsilon > 0\) be given. Let \(K := \max\{ \abs{x}, \abs{y}, \nicefrac{\epsilon}{3} , 1 \}\text{.}\) Find an \(M_1\) such that for all \(n \geq M_1\text{,}\) we have \(\abs{x_n - x} < \frac{\epsilon}{3K}\text{.}\) Find an \(M_2\) such that for all \(n \geq M_2\text{,}\) we have \(\abs{y_n - y} < \frac{\epsilon}{3K}\text{.}\) Take \(M := \max \{ M_1, M_2 \}\text{.}\) For all \(n \geq M\text{,}\) we have
\begin{equation*} \begin{split} \abs{z_n - z} & = \abs{x_n y_n - xy} = \abs{(x_n - x) y_n + x (y_n - y)} \\ & \leq \abs{x_n - x}\abs{y_n} + \abs{x}\abs{y_n - y} \\ & \leq \abs{x_n - x}\bigl(\abs{y_n - y} + \abs{y}\bigr) + \abs{x}\abs{y_n - y} \\ & < \frac{\epsilon}{3K}\Bigl(\frac{\epsilon}{3K} + K\Bigr) + K\,\frac{\epsilon}{3K} \leq \frac{\epsilon}{3} + \frac{\epsilon}{3} + \frac{\epsilon}{3} = \epsilon . \end{split} \end{equation*}
Finally, we examine iv. Instead of proving iv directly, we prove the following simpler claim:
Claim: If \(\{ y_n \}\) is a convergent sequence such that \(\lim\, y_n \not= 0\) and \(y_n \not= 0\) for all \(n \in \N\text{,}\) then \(\{ \nicefrac{1}{y_n} \}\) converges and
\begin{equation*} \lim_{n\to\infty} \frac{1}{y_n} = \frac{1}{\lim\, y_n} . \end{equation*}
Once the claim is proved, we take the sequence \(\{ \nicefrac{1}{y_n} \}\text{,}\) multiply it by the sequence \(\{ x_n \}\) and apply item iii.
Proof of claim: Let \(\epsilon > 0\) be given. Let \(y := \lim\, y_n\text{.}\) As \(\abs{y} \not= 0\text{,}\) then \(\min \left\{ \abs{y}^2\frac{\epsilon}{2}, \, \frac{\abs{y}}{2} \right\} > 0\text{.}\) Find an \(M\) such that for all \(n \geq M\text{,}\) we have
\begin{equation*} \abs{y_n - y} < \min \left\{ \abs{y}^2\frac{\epsilon}{2}, \, \frac{\abs{y}}{2} \right\} . \end{equation*}
For all \(n \geq M\text{,}\) we have \(\abs{y - y_n} < \nicefrac{\abs{y}}{2}\text{,}\) and so
\begin{equation*} \abs{y} = \abs{y - y_n + y_n} \leq \abs{y - y_n} + \abs{y_n} < \frac{\abs{y}}{2} + \abs{y_n} . \end{equation*}
Subtracting \(\nicefrac{\abs{y}}{2}\) from both sides we obtain \(\nicefrac{\abs{y}}{2} < \abs{y_n}\text{,}\) or in other words,
\begin{equation*} \frac{1}{\abs{y_n}} < \frac{2}{\abs{y}} . \end{equation*}
We finish the proof of the claim:
\begin{equation*} \abs{\frac{1}{y_n} - \frac{1}{y}} = \abs{\frac{y - y_n}{y y_n}} = \frac{\abs{y - y_n}}{\abs{y}\abs{y_n}} < \abs{y - y_n} \, \frac{2}{\abs{y}^2} < \abs{y}^2\frac{\epsilon}{2} \, \frac{2}{\abs{y}^2} = \epsilon . \end{equation*}
And we are done.
By plugging in constant sequences, we get several easy corollaries. If \(c \in \R\) and \(\{ x_n \}\) is a convergent sequence, then for example
\begin{equation*} \lim_{n \to \infty} (c + x_n) = c + \lim_{n \to \infty} x_n \qquad \text{and} \qquad \lim_{n \to \infty} (c \, x_n) = c \left( \lim_{n \to \infty} x_n \right) . \end{equation*}
Similarly, we find such equalities for constant subtraction and division.
As we can take limits past multiplication we can show (exercise) that \(\lim\, x_n^k = {(\lim\, x_n)}^k\) for all \(k \in \N\text{.}\) That is, we can take limits past powers. Let us see if we can do the same with roots.
Proposition 2.2.6.
Let \(\{ x_n \}\) be a convergent sequence such that \(x_n \geq 0\text{.}\) Then
\begin{equation*} \lim_{n\to\infty} \sqrt{x_n} = \sqrt{ \lim_{n\to\infty} x_n } . \end{equation*}
Of course, to even make this statement, we need to apply Corollary 2.2.4 to show that \(\lim\, x_n \geq 0\text{,}\) so that we can take the square root without worry.
Proof.
Let \(\{ x_n \}\) be a convergent sequence and let \(x := \lim\, x_n\text{.}\) As we just mentioned, \(x \geq 0\text{.}\)
First suppose \(x=0\text{.}\) Let \(\epsilon > 0\) be given. Then there is an \(M\) such that for all \(n \geq M\text{,}\) we have \(x_n = \abs{x_n} < \epsilon^2\text{,}\) or in other words, \(\sqrt{x_n} < \epsilon\text{.}\) Hence,
\begin{equation*} \abs{\sqrt{x_n} - \sqrt{x}} = \sqrt{x_n} < \epsilon . \end{equation*}
Now suppose \(x > 0\) (and hence \(\sqrt{x} > 0\)). Then
\begin{equation*} \abs{\sqrt{x_n} - \sqrt{x}} = \abs{ \frac{x_n - x}{\sqrt{x_n} + \sqrt{x}} } \leq \frac{1}{\sqrt{x}} \abs{x_n - x} . \end{equation*}
We leave the rest of the proof to the reader.
A similar proof works for the \(k\)th root. That is, we also obtain \(\lim\, x_n^{1/k} = {( \lim\, x_n )}^{1/k}\text{.}\) We leave this to the reader as a challenging exercise.
We may also want to take the limit past the absolute value sign. The converse of this proposition is not true, see Exercise 2.1.7 part b).
Proposition 2.2.7.
If \(\{ x_n \}\) is a convergent sequence, then \(\{ \abs{x_n} \}\) is convergent and
\begin{equation*} \lim_{n\to\infty} \abs{x_n} = \abs{ \lim_{n\to\infty} x_n } . \end{equation*}
Proof.
We simply note the reverse triangle inequality
\begin{equation*} \big\lvert \abs{x_n} - \abs{x} \big\rvert \leq \abs{x_n - x} . \end{equation*}
Hence if \(\abs{x_n - x}\) can be made arbitrarily small, so can \(\big\lvert \abs{x_n} - \abs{x} \big\rvert\text{.}\) Details are left to the reader.
Let us see an example putting the propositions above together. Since \(\lim \nicefrac{1}{n} = 0\text{,}\) then, for instance,
\begin{equation*} \lim_{n\to\infty} \sqrt{ \abs{ 1 + \nicefrac{1}{n} } } = \sqrt{ \abs{ 1 + \lim_{n\to\infty} \nicefrac{1}{n} } } = \sqrt{\abs{1+0}} = 1 . \end{equation*}
That is, the limit on the left-hand side exists because the right-hand side exists. You really should read the equality above from right to left.
On the other hand you must apply the propositions carefully. For example, by rewriting the expression with a common denominator first we find
\begin{equation*} \lim_{n \to \infty} \left( \frac{n^2}{n+1} - n \right) = \lim_{n \to \infty} \frac{-n}{n+1} = \lim_{n \to \infty} \frac{-1}{1 + \nicefrac{1}{n}} = -1 . \end{equation*}
However, \(\bigl\{ \frac{n^2}{n+1} \bigr\}\) and \(\{n\}\) are not convergent, so \(\bigl(\lim\, \frac{n^2}{n+1}\bigr) - \bigl(\lim\, n\bigr)\) is nonsense.
Subsection 2.2.3 Recursively defined sequences
Now that we know we can interchange limits and algebraic operations, we can compute the limits of many sequences. One such class are recursively defined sequences, that is, sequences where the next number in the sequence is computed using a formula from a fixed number of preceding elements in the sequence.
Example 2.2.8.
Let \(\{ x_n \}\) be defined by \(x_1 := 2\) and
\begin{equation*} x_{n+1} := x_n - \frac{x_n^2-2}{2x_n} . \end{equation*}
We must first find out if this sequence is well-defined; we must show we never divide by zero. Then we must find out if the sequence converges. Only then can we attempt to find the limit.
So let us prove \(x_n\) exists and \(x_n > 0\) for all \(n\) (so the sequence is well-defined and bounded below). Let us show this by induction. We know that \(x_1 = 2 > 0\text{.}\) For the induction step, suppose \(x_n > 0\text{.}\) Then
\begin{equation*} x_{n+1} = x_n - \frac{x_n^2-2}{2x_n} = \frac{2x_n^2 - x_n^2 + 2}{2x_n} = \frac{x_n^2+2}{2x_n} . \end{equation*}
It is always true that \(x_n^2+2 > 0\text{,}\) and as \(x_n > 0\text{,}\) then \(\frac{x_n^2+2}{2x_n} > 0\) and hence \(x_{n+1} > 0\text{.}\)
Next let us show that the sequence is monotone decreasing. If we show that \(x_n^2-2 \geq 0\) for all \(n\text{,}\) then \(x_{n+1} \leq x_n\) for all \(n\text{:}\) indeed, \(x_{n+1} - x_n = - \frac{x_n^2-2}{2x_n} \leq 0\) whenever \(x_n^2-2 \geq 0\) and \(x_n > 0\text{.}\) Obviously \(x_1^2-2 = 4-2 = 2 > 0\text{.}\) For an arbitrary \(n\text{,}\) we have
\begin{equation*} x_{n+1}^2-2 = {\left( \frac{x_n^2+2}{2x_n} \right)}^2 - 2 = \frac{x_n^4+4x_n^2+4 - 8x_n^2}{4x_n^2} = \frac{{\left( x_n^2-2 \right)}^2}{4x_n^2} . \end{equation*}
Since squares are nonnegative, \(x_{n+1}^2-2 \geq 0\) for all \(n\text{.}\) Therefore, \(\{ x_n \}\) is monotone decreasing and bounded (\(x_n > 0\) for all \(n\)), and so the limit exists. It remains to find the limit.
Write
\begin{equation*} x_{n+1} = x_n - \frac{x_n^2-2}{2x_n} . \end{equation*}
Since \(\{ x_{n+1} \}\) is the 1-tail of \(\{ x_n \}\text{,}\) it converges to the same limit. Let us define \(x := \lim\, x_n\text{.}\) Take the limit of both sides to obtain
\begin{equation*} x = x - \frac{x^2-2}{2x} , \end{equation*}
or \(x^2 = 2\text{.}\) As \(x_n > 0\) for all \(n\) we get \(x \geq 0\text{,}\) and therefore \(x = \sqrt{2}\text{.}\)
You may have seen the sequence above before. It is Newton's method^{1} for finding the square root of 2. This method comes up often in practice and converges very rapidly. We used the fact that \(x_1^2 - 2 > 0\text{,}\) although it was not strictly needed to show convergence by considering a tail of the sequence. The sequence converges as long as \(x_1 \not= 0\text{,}\) although with a negative \(x_1\) we would arrive at \(x=-\sqrt{2}\text{.}\) By replacing the 2 in the numerator we obtain the square root of any positive number. These statements are left as an exercise.
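Example 2.2.8 is easy to experiment with. The following Python sketch (an illustration, not part of the original text) iterates \(x_{n+1} = x_n - \frac{x_n^2-2}{2x_n}\) from \(x_1 = 2\) and exhibits the monotone, very rapid convergence to \(\sqrt{2}\):

```python
# Newton's method for sqrt(2): x_1 = 2, x_{n+1} = x_n - (x_n^2 - 2)/(2 x_n).
def newton_sqrt2(steps):
    x = 2.0
    seq = [x]
    for _ in range(steps):
        x = x - (x * x - 2.0) / (2.0 * x)
        seq.append(x)
    return seq

seq = newton_sqrt2(5)

# The sequence is monotone decreasing, stays positive, and the number of
# correct digits roughly doubles each step (quadratic convergence).
decreasing = all(b <= a for a, b in zip(seq, seq[1:]))
error = abs(seq[-1] ** 2 - 2.0)
```

Five steps already give \(\sqrt{2}\) to machine precision, which is what "converges very rapidly" means in practice.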
You should, however, be careful. Before taking any limits, you must make sure the sequence converges. Let us see an example.
Example 2.2.9.
Suppose \(x_1 := 1\) and \(x_{n+1} := x_n^2+x_n\text{.}\) If we blindly assumed that the limit exists (call it \(x\)), then we would get the equation \(x = x^2+x\text{,}\) from which we might conclude \(x=0\text{.}\) However, it is not hard to show that \(\{ x_n \}\) is unbounded and therefore does not converge.
The thing to notice in this example is that the method still works, but it depends on the initial value \(x_1\text{.}\) If we set \(x_1 := 0\text{,}\) then the sequence converges and the limit really is 0. An entire branch of mathematics, called dynamics, deals precisely with these issues. See Exercise 2.2.14.
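A quick Python experiment (a sketch, not from the text) makes the dependence on \(x_1\) in Example 2.2.9 concrete: starting at 1 the iterates explode, while starting at the fixed point 0 the sequence is constant:

```python
# Iterate x_{n+1} = x_n^2 + x_n from a given starting value.
def iterate(x1, steps):
    x = x1
    for _ in range(steps):
        x = x * x + x
    return x

exploded = iterate(1.0, 10)   # 1, 2, 6, 42, 1806, ... grows doubly exponentially
fixed = iterate(0.0, 10)      # 0 solves x = x^2 + x, and the sequence stays there
```

The computation shows why the equation \(x = x^2 + x\) alone proves nothing: it is only valid once convergence is established.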
Subsection 2.2.4 Some convergence tests
It is not always necessary to go back to the definition of convergence to prove that a sequence is convergent. We first give a simple convergence test. The main idea is that \(\{ x_n \}\) converges to \(x\) if and only if \(\{ \abs{ x_n  x } \}\) converges to zero.
Proposition 2.2.10.
Let \(\{ x_n \}\) be a sequence. Suppose there is an \(x \in \R\) and a convergent sequence \(\{ a_n \}\) such that
\begin{equation*} \lim_{n\to\infty} a_n = 0 \end{equation*}
and
\begin{equation*} \abs{x_n - x} \leq a_n \quad \text{for all } n \in \N . \end{equation*}
Then \(\{ x_n \}\) converges and \(\lim\, x_n = x\text{.}\)
Proof.
Let \(\epsilon > 0\) be given. Note that \(a_n \geq 0\) for all \(n\text{.}\) Find an \(M \in \N\) such that for all \(n \geq M\text{,}\) we have \(a_n = \abs{a_n - 0} < \epsilon\text{.}\) Then, for all \(n \geq M\text{,}\) we have
\begin{equation*} \abs{x_n - x} \leq a_n < \epsilon . \end{equation*}
As the proposition shows, studying when a sequence has a limit is the same as studying when another sequence goes to zero. In general, it may be hard to decide whether a sequence converges, but for certain sequences there exist easy-to-apply tests that tell us if the sequence converges or not. Let us see one such test. First, let us compute the limit of a certain specific sequence.
Proposition 2.2.11.
Let \(c > 0\text{.}\)

i. If \(c < 1\text{,}\) then
\begin{equation*} \lim_{n\to\infty} c^n = 0. \end{equation*}
ii. If \(c > 1\text{,}\) then \(\{ c^n \}\) is unbounded.
Proof.
First consider \(c < 1\text{.}\) As \(c > 0\text{,}\) then \(c^n > 0\) for all \(n \in \N\) by induction. As \(c < 1\text{,}\) then \(c^{n+1} < c^n\) for all \(n\text{.}\) So \(\{ c^n \}\) is a decreasing sequence that is bounded below. Hence, it is convergent. Let \(L := \lim\, c^n\text{.}\) The 1-tail \(\{ c^{n+1} \}\) also converges to \(L\text{.}\) Taking the limit of both sides of \(c^{n+1} = c \cdot c^n\text{,}\) we obtain \(L = cL\text{,}\) or \((1-c)L=0\text{.}\) It follows that \(L=0\) as \(c \not= 1\text{.}\)
Now consider \(c > 1\text{.}\) Let \(B > 0\) be arbitrary. As \(\nicefrac{1}{c} < 1\text{,}\) then \(\bigl\{ {(\nicefrac{1}{c})}^n \bigr\}\) converges to \(0\text{.}\) Hence for some large enough \(n\text{,}\) we get
\begin{equation*} {\left( \frac{1}{c} \right)}^n = \frac{1}{c^n} < \frac{1}{B} . \end{equation*}
In other words, \(c^n > B\text{,}\) and \(B\) is not an upper bound for \(\{ c^n \}\text{.}\) As \(B\) was arbitrary, \(\{ c^n \}\) is unbounded.
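Proposition 2.2.11 is easy to see numerically; this short Python snippet (an illustration, not from the text) contrasts the two regimes:

```python
# For 0 < c < 1 the powers c^n collapse toward 0; for c > 1 they grow without bound.
small = 0.9 ** 200   # roughly 7e-10
large = 1.1 ** 200   # roughly 1.9e8
```

Even ratios close to 1 behave this way eventually; only the threshold \(c = 1\) separates decay from growth.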
In the proposition above, the ratio of the \((n+1)\)th term and the \(n\)th term is \(c\text{.}\) We generalize this simple result to a larger class of sequences. The following lemma will come up again once we get to series.
Lemma 2.2.12. Ratio test for sequences.
Let \(\{ x_n \}\) be a sequence such that \(x_n \not= 0\) for all \(n\) and such that the limit
\begin{equation*} L := \lim_{n\to\infty} \frac{\abs{x_{n+1}}}{\abs{x_n}} \end{equation*}
exists.
If \(L < 1\text{,}\) then \(\{ x_n \}\) converges and \(\lim\, x_n = 0\text{.}\)
If \(L > 1\text{,}\) then \(\{ x_n \}\) is unbounded (hence diverges).
If \(L\) exists, but \(L=1\text{,}\) the lemma says nothing. We cannot make any conclusion based on that information alone. For example, the sequence \(\{ \nicefrac{1}{n} \}\) converges to zero, but \(L=1\text{.}\) The constant sequence \(\{ 1 \}\) converges to 1, not zero, and \(L=1\text{.}\) The sequence \(\{ {(-1)}^n \}\) does not converge at all, and \(L=1\) as well. Finally, the sequence \(\{ n \}\) is unbounded, yet again \(L=1\text{.}\) The statement of the lemma may be strengthened somewhat, see Exercises 2.2.13 and 2.3.15.
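The four \(L=1\) examples can be tabulated numerically (a Python sketch, not part of the original text): all four ratio sequences hug 1 for large \(n\), yet the sequences behave completely differently.

```python
# Ratio |x_{n+1}| / |x_n| for a sequence given as a function of n.
def ratio(seq, n):
    return abs(seq(n + 1)) / abs(seq(n))

# 1/n -> 0, the constant 1 -> 1, (-1)^n diverges, n is unbounded,
# yet every one of them has ratio limit L = 1.
sequences = [
    lambda n: 1 / n,
    lambda n: 1.0,
    lambda n: (-1.0) ** n,
    lambda n: float(n),
]
ratios_at_large_n = [ratio(s, 10**6) for s in sequences]
```

This is exactly why the ratio test is silent at \(L = 1\): the ratio carries no information about the limit itself there.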
Proof.
Suppose \(L < 1\text{.}\) As \(\frac{\abs{x_{n+1}}}{\abs{x_n}} \geq 0\) for all \(n\text{,}\) then \(L \geq 0\text{.}\) Pick \(r\) such that \(L < r < 1\text{.}\) We wish to compare the sequence \(\{ x_n \}\) to the sequence \(\{ r^n \}\text{.}\) The idea is that while the ratio \(\frac{\abs{x_{n+1}}}{\abs{x_n}}\) is not going to be less than \(L\) eventually, it will eventually be less than \(r\text{,}\) which is still less than 1. The intuitive idea of the proof is illustrated in Figure 2.4.
As \(r - L > 0\text{,}\) there exists an \(M \in \N\) such that for all \(n \geq M\text{,}\) we have
\begin{equation*} \abs{ \frac{\abs{x_{n+1}}}{\abs{x_n}} - L } < r - L . \end{equation*}
Therefore, for \(n \geq M\text{,}\)
\begin{equation*} \frac{\abs{x_{n+1}}}{\abs{x_n}} < r . \end{equation*}
For \(n > M\) (that is for \(n \geq M+1\)) write
\begin{equation*} \abs{x_n} = \abs{x_M} \frac{\abs{x_{M+1}}}{\abs{x_M}} \frac{\abs{x_{M+2}}}{\abs{x_{M+1}}} \cdots \frac{\abs{x_n}}{\abs{x_{n-1}}} < \abs{x_M} r^{n-M} = \abs{x_M} r^{-M} r^n . \end{equation*}
The sequence \(\{ r^n \}\) converges to zero and hence \(\abs{x_M} r^{-M} r^n\) converges to zero. By Proposition 2.2.10, the \(M\)-tail of \(\{x_n\}\) converges to zero and therefore \(\{x_n\}\) converges to zero.
Now suppose \(L > 1\text{.}\) Pick \(r\) such that \(1 < r < L\text{.}\) As \(L - r > 0\text{,}\) there exists an \(M \in \N\) such that for all \(n \geq M\)
\begin{equation*} \abs{ \frac{\abs{x_{n+1}}}{\abs{x_n}} - L } < L - r . \end{equation*}
Therefore,
\begin{equation*} \frac{\abs{x_{n+1}}}{\abs{x_n}} > r . \end{equation*}
Again for \(n > M\text{,}\) write
\begin{equation*} \abs{x_n} = \abs{x_M} \frac{\abs{x_{M+1}}}{\abs{x_M}} \frac{\abs{x_{M+2}}}{\abs{x_{M+1}}} \cdots \frac{\abs{x_n}}{\abs{x_{n-1}}} > \abs{x_M} r^{n-M} = \abs{x_M} r^{-M} r^n . \end{equation*}
The sequence \(\{ r^n \}\) is unbounded (since \(r > 1\)), and so \(\{x_n\}\) cannot be bounded (if \(\abs{x_n} \leq B\) for all \(n\text{,}\) then \(r^n < \frac{B}{\abs{x_M}} r^{\:M}\) for all \(n > M\text{,}\) which is impossible). Consequently, \(\{ x_n \}\) cannot converge.
Example 2.2.13.
A simple application of the lemma above is to prove
\begin{equation*} \lim_{n\to\infty} \frac{2^n}{n!} = 0 . \end{equation*}
Proof: Compute
\begin{equation*} \frac{ \nicefrac{2^{n+1}}{(n+1)!} }{ \nicefrac{2^n}{n!} } = \frac{2^{n+1}}{2^n} \, \frac{n!}{(n+1)!} = \frac{2}{n+1} . \end{equation*}
It is not hard to see that \(\bigl\{ \frac{2}{n+1} \bigr\}\) converges to zero. The conclusion follows by the lemma.
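The ratio computation in Example 2.2.13 can be checked directly; in this Python sketch (not part of the original), the consecutive ratios equal \(\frac{2}{n+1}\) and the terms vanish quickly:

```python
import math

# Terms of the sequence 2^n / n! and their consecutive ratios.
def term(n):
    return 2.0 ** n / math.factorial(n)

# The ratio of consecutive terms is exactly 2/(n+1), which tends to 0.
ratios = [term(n + 1) / term(n) for n in range(1, 20)]
tail = term(50)   # far below machine epsilon already
```

Once the ratios drop below any fixed \(r < 1\), the terms are dominated by a geometric sequence, which is precisely the mechanism of the ratio test.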
Example 2.2.14.
A more complicated (and useful) application of the ratio test is to prove
\begin{equation*} \lim_{n\to\infty} n^{1/n} = 1 . \end{equation*}
Proof: Let \(\epsilon > 0\) be given. Consider the sequence \(\bigl\{ \frac{n}{{(1+\epsilon)}^n} \bigr\}\text{.}\) Compute
\begin{equation*} \frac{ \nicefrac{(n+1)}{{(1+\epsilon)}^{n+1}} }{ \nicefrac{n}{{(1+\epsilon)}^n} } = \frac{n+1}{n} \, \frac{1}{1+\epsilon} . \end{equation*}
The limit of \(\frac{n+1}{n} = 1+\frac{1}{n}\) as \(n \to \infty\) is 1, and so
\begin{equation*} \lim_{n\to\infty} \frac{ \nicefrac{(n+1)}{{(1+\epsilon)}^{n+1}} }{ \nicefrac{n}{{(1+\epsilon)}^n} } = \frac{1}{1+\epsilon} < 1 . \end{equation*}
Therefore, \(\bigl\{ \frac{n}{{(1+\epsilon)}^n} \bigr\}\) converges to 0. In particular, there exists an \(M\) such that for \(n \geq M\text{,}\) we have \(\frac{n}{{(1+\epsilon)}^n} < 1\text{,}\) or \(n < {(1+\epsilon)}^n\text{,}\) or \(n^{1/n} < 1+\epsilon\text{.}\) As \(n \geq 1\text{,}\) then \(n^{1/n} \geq 1\text{,}\) and so \(0 \leq n^{1/n} - 1 < \epsilon\text{.}\) Consequently, \(\lim\, n^{1/n} = 1\text{.}\)
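Numerically (a Python illustration, not from the text), \(n^{1/n}\) indeed creeps down toward 1:

```python
# n^(1/n) for increasingly large n; the values decrease toward 1.
ns = [10, 100, 1000, 10**6]
values = [n ** (1.0 / n) for n in ns]
```

The convergence is slow, which is consistent with the proof: to certify \(n^{1/n} < 1 + \epsilon\) for small \(\epsilon\), one must wait until the geometric growth of \({(1+\epsilon)}^n\) overtakes \(n\).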
Subsection 2.2.5 Exercises
Exercise 2.2.1.
Prove Corollary 2.2.4. Hint: Use constant sequences and Lemma 2.2.3.
Exercise 2.2.2.
Prove part ii of Proposition 2.2.5.
Exercise 2.2.3.
Prove that if \(\{ x_n \}\) is a convergent sequence and \(k \in \N\text{,}\) then
\begin{equation*} \lim_{n\to\infty} x_n^k = {\left( \lim_{n\to\infty} x_n \right)}^k . \end{equation*}
Hint: Use induction.
Exercise 2.2.4.
Suppose \(x_1 := \frac{1}{2}\) and \(x_{n+1} := x_n^2\text{.}\) Show that \(\{ x_n \}\) converges and find \(\lim\, x_n\text{.}\) Hint: You cannot divide by zero!
Exercise 2.2.5.
Let \(x_n := \frac{n - \cos(n)}{n}\text{.}\) Use the squeeze lemma to show that \(\{ x_n \}\) converges and find the limit.
Exercise 2.2.6.
Let \(x_n := \frac{1}{n^2}\) and \(y_n := \frac{1}{n}\text{.}\) Define \(z_n := \frac{x_n}{y_n}\) and \(w_n := \frac{y_n}{x_n}\text{.}\) Do \(\{ z_n \}\) and \(\{ w_n \}\) converge? What are the limits? Can you apply Proposition 2.2.5? Why or why not?
Exercise 2.2.7.
True or false, prove or find a counterexample. If \(\{ x_n \}\) is a sequence such that \(\{ x_n^2 \}\) converges, then \(\{ x_n \}\) converges.
Exercise 2.2.8.
Show that
Exercise 2.2.9.
Suppose \(\{ x_n \}\) is a sequence and suppose for some \(x \in \R\text{,}\) the limit
\begin{equation*} L := \lim_{n \to \infty} \frac{\abs{x_{n+1} - x}}{\abs{x_n - x}} \end{equation*}
exists and \(L < 1\text{.}\) Show that \(\{ x_n \}\) converges to \(x\text{.}\)
Exercise 2.2.10.
(Challenging) Let \(\{ x_n \}\) be a convergent sequence such that \(x_n \geq 0\) and \(k \in \N\text{.}\) Then
\begin{equation*} \lim_{n\to\infty} x_n^{1/k} = {\left( \lim_{n\to\infty} x_n \right)}^{1/k} . \end{equation*}
Hint: Find an expression \(q\) such that \(\frac{x_n^{1/k} - x^{1/k}}{x_n - x} = \frac{1}{q}\text{.}\)
Exercise 2.2.11.
Let \(r > 0\text{.}\) Show that starting with an arbitrary \(x_1 \not= 0\text{,}\) the sequence defined by
\begin{equation*} x_{n+1} := x_n - \frac{x_n^2 - r}{2 x_n} \end{equation*}
converges to \(\sqrt{r}\) if \(x_1 > 0\) and to \(-\sqrt{r}\) if \(x_1 < 0\text{.}\)
Exercise 2.2.12.
a) Suppose \(\{ a_n \}\) is a bounded sequence and \(\{ b_n \}\) is a sequence converging to 0. Show that \(\{ a_n b_n \}\) converges to 0.
b) Find an example where \(\{ a_n \}\) is unbounded, \(\{ b_n \}\) converges to 0, and \(\{ a_n b_n \}\) is not convergent.
c) Find an example where \(\{ a_n \}\) is bounded, \(\{ b_n \}\) converges to some \(x \not= 0\text{,}\) and \(\{ a_n b_n \}\) is not convergent.
Exercise 2.2.13.
(Easy) Prove the following stronger version of Lemma 2.2.12, the ratio test. Suppose \(\{ x_n \}\) is a sequence such that \(x_n \not= 0\) for all \(n\text{.}\)

a) Prove that if there exists an \(r < 1\) and \(M \in \N\) such that for all \(n \geq M\text{,}\) we have
\begin{equation*} \frac{\abs{x_{n+1}}}{\abs{x_n}} \leq r , \end{equation*}
then \(\{ x_n \}\) converges to \(0\text{.}\)

b) Prove that if there exists an \(r > 1\) and \(M \in \N\) such that for all \(n \geq M\text{,}\) we have
\begin{equation*} \frac{\abs{x_{n+1}}}{\abs{x_n}} \geq r , \end{equation*}
then \(\{ x_n \}\) is unbounded.
Exercise 2.2.14.
Suppose \(x_1 := c\) and \(x_{n+1} := x_n^2+x_n\text{.}\) Show that \(\{ x_n \}\) converges if and only if \(-1 \leq c \leq 0\text{,}\) in which case it converges to 0.
Exercise 2.2.15.
Prove \(\lim\limits_{n \to \infty} {(n^2+1)}^{1/n} = 1\text{.}\)
Exercise 2.2.16.
Prove that \(\bigl\{ {(n!)}^{1/n} \bigr\}\) is unbounded. Hint: Show that \(\left\{ \frac{C^n}{n!} \right\}\) converges to zero for all \(C > 0\text{.}\)
https://en.wikipedia.org/wiki/Isaac_Newton