## Section 6.2 Interchange of limits

Note: 1–2.5 lectures; the subsections on derivatives and power series (the latter requiring Section 2.6) are optional.

Large parts of modern analysis deal with the interchange of two limiting operations. When we have a chain of two limits, we cannot always simply swap them. For instance,

\begin{equation*} 0 = \lim_{n\to\infty} \left( \lim_{k\to\infty} \frac{n}{n + k} \right) \not= \lim_{k\to\infty} \left( \lim_{n\to\infty} \frac{n}{n + k} \right) = 1 . \end{equation*}
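A quick numerical illustration of this double limit, as an informal sketch in Python (large values stand in for the limits):

```python
# The two iterated limits of x_{n,k} = n / (n + k) disagree.
def x(n, k):
    return n / (n + k)

# For each fixed n, lim_{k -> infty} n/(n+k) = 0; approximate with a huge k.
lim_over_k = [x(n, 10**12) for n in (1, 5, 25)]

# For each fixed k, lim_{n -> infty} n/(n+k) = 1; approximate with a huge n.
lim_over_n = [x(10**12, k) for k in (1, 5, 25)]

print(lim_over_k)  # all very close to 0
print(lim_over_n)  # all very close to 1
```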

When talking about sequences of functions, interchange of limits comes up quite often. We look at several instances: continuity of the limit, the integral of the limit, the derivative of the limit, and the convergence of power series.

### Subsection 6.2.1 Continuity of the limit

If we have a sequence $$\{ f_n \}$$ of continuous functions, is the limit continuous? Suppose $$f$$ is the (pointwise) limit of $$\{ f_n \}\text{.}$$ If $$\lim\, x_k = x\text{,}$$ we are interested in the following interchange of limits, where the equality to prove is marked with a question mark. Equality does not always hold; in fact, the limits to the left of the question mark might not even exist.

\begin{equation*} \lim_{k \to \infty} f(x_k) = \lim_{k \to \infty} \Bigl( \lim_{n \to \infty} f_n(x_k) \Bigr) \overset{?}{=} \lim_{n \to \infty} \Bigl( \lim_{k \to \infty} f_n(x_k) \Bigr) = \lim_{n \to \infty} f_n(x) = f(x) . \end{equation*}

We wish to find conditions on the sequence $$\{ f_n \}$$ so that the equation above holds. If we only require pointwise convergence, then the limit of a sequence of functions need not be continuous, and the equation above need not hold.

#### Example 6.2.1.

Define $$f_n \colon [0,1] \to \R$$ as

\begin{equation*} f_n(x) := \begin{cases} 1-nx & \text{if } x < \nicefrac{1}{n},\\ 0 & \text{if } x \geq \nicefrac{1}{n}. \end{cases} \end{equation*}

See Figure 6.4 (graph of $$f_n(x)$$).

Each function $$f_n$$ is continuous. Fix an $$x \in (0,1]\text{.}$$ If $$n \geq \nicefrac{1}{x}\text{,}$$ then $$x \geq \nicefrac{1}{n}\text{.}$$ Therefore for $$n \geq \nicefrac{1}{x}\text{,}$$ we have $$f_n(x) = 0\text{,}$$ and so

\begin{equation*} \lim_{n \to \infty} f_n(x) = 0. \end{equation*}

On the other hand, if $$x=0\text{,}$$ then

\begin{equation*} \lim_{n \to \infty} f_n(0) = \lim_{n \to \infty} 1 = 1. \end{equation*}

Thus the pointwise limit of $$f_n$$ is the function $$f \colon [0,1] \to \R$$ defined by

\begin{equation*} f(x) := \begin{cases} 1 & \text{if } x = 0,\\ 0 & \text{if } x > 0. \end{cases} \end{equation*}

The function $$f$$ is not continuous at 0.
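To see informally both the pointwise limit and the failure of uniform convergence in this example, here is a numerical sketch in Python (the grid resolution is an arbitrary choice):

```python
def f_n(n, x):
    return 1.0 - n * x if x < 1 / n else 0.0

def f(x):  # the pointwise limit computed above
    return 1.0 if x == 0 else 0.0

# Pointwise convergence: for fixed x > 0, f_n(x) = 0 once n >= 1/x.
print(f_n(1000, 0.01), f_n(1000, 0))  # 0.0 1.0

# The convergence is not uniform: sup |f_n - f| stays near 1 for every n,
# because f_n is close to 1 just to the right of 0.
n = 1000
grid = [k / 10**6 for k in range(1, 10**6)]  # points of (0, 1)
sup_err = max(abs(f_n(n, x) - f(x)) for x in grid)
print(sup_err)  # approximately 0.999
```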

If, however, we require the convergence to be uniform, the limits can be interchanged.

#### Theorem 6.2.2.

Let $$f_n \colon S \to \R$$ be a sequence of continuous functions converging uniformly to $$f \colon S \to \R\text{.}$$ Then $$f$$ is continuous.

#### Proof.

Let $$x \in S$$ be fixed. Let $$\{ x_n \}$$ be a sequence in $$S$$ converging to $$x\text{.}$$

Let $$\epsilon > 0$$ be given. As $$\{ f_k \}$$ converges uniformly to $$f\text{,}$$ we find a $$k \in \N$$ such that

\begin{equation*} \abs{f_k(y)-f(y)} < \nicefrac{\epsilon}{3} \end{equation*}

for all $$y \in S\text{.}$$ As $$f_k$$ is continuous at $$x\text{,}$$ we find an $$N \in \N$$ such that for all $$m \geq N\text{,}$$

\begin{equation*} \abs{f_k(x_m)-f_k(x)} < \nicefrac{\epsilon}{3} . \end{equation*}

Thus for all $$m \geq N\text{,}$$

\begin{equation*} \begin{split} \abs{f(x_m)-f(x)} & = \abs{f(x_m)-f_k(x_m)+f_k(x_m)-f_k(x)+f_k(x)-f(x)} \\ & \leq \abs{f(x_m)-f_k(x_m)}+ \abs{f_k(x_m)-f_k(x)}+ \abs{f_k(x)-f(x)} \\ & < \nicefrac{\epsilon}{3} + \nicefrac{\epsilon}{3} + \nicefrac{\epsilon}{3} = \epsilon . \end{split} \end{equation*}

Therefore, $$\bigl\{ f(x_m) \bigr\}$$ converges to $$f(x)$$ and $$f$$ is continuous at $$x\text{.}$$ As $$x$$ was arbitrary, $$f$$ is continuous everywhere.

### Subsection 6.2.2 Integral of the limit

Again, if we simply require pointwise convergence, then the integral of a limit of a sequence of functions need not be equal to the limit of the integrals.

#### Example 6.2.3.

Define $$f_n \colon [0,1] \to \R$$ as

\begin{equation*} f_n(x) := \begin{cases} 0 & \text{if } x = 0,\\ n-n^2x & \text{if } 0 < x < \nicefrac{1}{n},\\ 0 & \text{if } x \geq \nicefrac{1}{n}. \end{cases} \end{equation*}

See Figure 6.5 (graph of $$f_n(x)$$).

Each $$f_n$$ is Riemann integrable (it is continuous on $$(0,1]$$ and bounded), and the fundamental theorem of calculus says that

\begin{equation*} \int_0^1 f_n = \int_0^{\nicefrac{1}{n}} (n-n^2x)\,dx = \nicefrac{1}{2} . \end{equation*}

Let us compute the pointwise limit of $$\{ f_n \}\text{.}$$ Fix an $$x \in (0,1]\text{.}$$ For $$n \geq \nicefrac{1}{x}\text{,}$$ we have $$x \geq \nicefrac{1}{n}$$ and so $$f_n(x) = 0\text{.}$$ Therefore,

\begin{equation*} \lim_{n \to \infty} f_n(x) = 0. \end{equation*}

We also have $$f_n(0) = 0$$ for all $$n\text{.}$$ Therefore, the pointwise limit of $$\{ f_n \}$$ is the zero function. Thus

\begin{equation*} \nicefrac{1}{2} = \lim_{n\to\infty} \int_0^1 f_n (x)\,dx \not= \int_0^1 \left( \lim_{n\to\infty} f_n(x)\right)\,dx = \int_0^1 0\,dx = 0 . \end{equation*}
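A numerical sketch in Python of this failure; the midpoint Riemann sum is an informal stand-in for the integral:

```python
def f_n(n, x):
    if x <= 0 or x >= 1 / n:
        return 0.0
    return n - n * n * x

def integral(n, m=10**5):  # midpoint Riemann sum for int_0^1 f_n
    h = 1 / m
    return sum(f_n(n, (k + 0.5) * h) for k in range(m)) * h

vals = [integral(n) for n in (10, 100, 1000)]
print(vals)  # each is essentially 0.5, yet the pointwise limit integrates to 0
```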

But if we require the convergence to be uniform, the limits can be interchanged.

#### Theorem 6.2.4.

Let $$f_n \colon [a,b] \to \R$$ be a sequence of Riemann integrable functions converging uniformly to $$f \colon [a,b] \to \R\text{.}$$ Then $$f$$ is Riemann integrable and

\begin{equation*} \int_a^b f = \lim_{n\to\infty} \int_a^b f_n . \end{equation*}

#### Proof.

Let $$\epsilon > 0$$ be given. As $$\{ f_n \}$$ converges to $$f$$ uniformly, we find an $$M \in \N$$ such that for all $$n \geq M\text{,}$$ we have $$\abs{f_n(x)-f(x)} < \frac{\epsilon}{2(b-a)}$$ for all $$x \in [a,b]\text{.}$$ In particular, by the reverse triangle inequality, $$\abs{f(x)} < \frac{\epsilon}{2(b-a)} + \abs{f_n(x)}$$ for all $$x\text{,}$$ so $$f$$ is bounded, as each $$f_n$$ is bounded. Note that $$f_n$$ is integrable and compute

\begin{equation*} \begin{split} \overline{\int_a^b} f - \underline{\int_a^b} f & = \overline{\int_a^b} \bigl( f(x) - f_n(x) + f_n(x) \bigr)\,dx - \underline{\int_a^b} \bigl( f(x) - f_n(x) + f_n(x) \bigr)\,dx \\ & \leq \overline{\int_a^b} \bigl( f(x) - f_n(x) \bigr)\,dx + \overline{\int_a^b} f_n(x) \,dx - \underline{\int_a^b} \bigl( f(x) - f_n(x) \bigr)\,dx - \underline{\int_a^b} f_n(x) \,dx \\ & = \overline{\int_a^b} \bigl( f(x) - f_n(x) \bigr)\,dx + \int_a^b f_n(x) \,dx - \underline{\int_a^b} \bigl( f(x) - f_n(x) \bigr)\,dx - \int_a^b f_n(x) \,dx \\ & = \overline{\int_a^b} \bigl( f(x) - f_n(x) \bigr)\,dx - \underline{\int_a^b} \bigl( f(x) - f_n(x) \bigr)\,dx \\ & \leq \frac{\epsilon}{2(b-a)} (b-a) + \frac{\epsilon}{2(b-a)} (b-a) = \epsilon . \end{split} \end{equation*}

The first inequality is Proposition 5.2.5. The second inequality follows from Proposition 5.1.8 and the fact that for all $$x \in [a,b]\text{,}$$ we have $$\frac{-\epsilon}{2(b-a)} < f(x)-f_n(x) < \frac{\epsilon}{2(b-a)}\text{.}$$ As $$\epsilon > 0$$ was arbitrary, $$f$$ is Riemann integrable.

Finally, we compute $$\int_a^b f\text{.}$$ We apply Proposition 5.1.10 in the calculation. Again, for all $$n \geq M$$ (where $$M$$ is the same as above),

\begin{equation*} \begin{split} \abs{\int_a^b f - \int_a^b f_n} & = \abs{ \int_a^b \bigl(f(x) - f_n(x)\bigr)\,dx} \\ & \leq \frac{\epsilon}{2(b-a)} (b-a) = \frac{\epsilon}{2} < \epsilon . \end{split} \end{equation*}

Therefore, $$\{ \int_a^b f_n \}$$ converges to $$\int_a^b f\text{.}$$

#### Example 6.2.5.

Suppose we wish to compute

\begin{equation*} \lim_{n\to\infty} \int_0^1 \frac{nx+ \sin(nx^2)}{n} \,dx . \end{equation*}

It is difficult to compute the integrals for any particular $$n$$ using calculus, as $$\sin(nx^2)$$ has no antiderivative expressible in elementary functions. However, we can compute the limit. We have shown before that $$\frac{nx+ \sin(nx^2)}{n}$$ converges uniformly on $$[0,1]$$ to $$x\text{.}$$ By Theorem 6.2.4, the limit exists and

\begin{equation*} \lim_{n\to\infty} \int_0^1 \frac{nx+ \sin(nx^2)}{n} \,dx = \int_0^1 x \,dx = \nicefrac{1}{2} . \end{equation*}
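As an informal numerical check of this computation, here is a Python sketch (the midpoint-rule resolution is an arbitrary choice):

```python
import math

def integrand(n, x):
    return (n * x + math.sin(n * x * x)) / n

def integral(n, m=200000):  # midpoint Riemann sum on [0, 1]
    h = 1 / m
    return sum(integrand(n, (k + 0.5) * h) for k in range(m)) * h

for n in (10, 1000, 100000):
    print(n, integral(n))  # tends to 1/2 as n grows
```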

#### Example 6.2.6.

If convergence is only pointwise, the limit need not even be Riemann integrable. On $$[0,1]$$ define

\begin{equation*} f_n(x) := \begin{cases} 1 & \text{if } x = \nicefrac{p}{q} \text{ in lowest terms and } q \leq n, \\ 0 & \text{otherwise.} \end{cases} \end{equation*}

The function $$f_n$$ differs from the zero function at finitely many points; there are only finitely many fractions in $$[0,1]$$ with denominator less than or equal to $$n\text{.}$$ So $$f_n$$ is integrable and $$\int_0^1 f_n = \int_0^1 0 = 0\text{.}$$ It is an easy exercise to show that $$\{ f_n \}$$ converges pointwise to the Dirichlet function

\begin{equation*} f(x) := \begin{cases} 1 & \text{if } x \in \Q, \\ 0 & \text{otherwise,} \end{cases} \end{equation*}

which is not Riemann integrable.

#### Example 6.2.7.

In fact, if the convergence is only pointwise, the limit of a sequence of bounded functions need not even be bounded. Define $$f_n \colon [0,1] \to \R$$ by

\begin{equation*} f_n(x) := \begin{cases} 0 & \text{if } x < \nicefrac{1}{n},\\ \nicefrac{1}{x} & \text{else.} \end{cases} \end{equation*}

For every $$n\text{,}$$ we have $$\abs{f_n(x)} \leq n$$ for all $$x \in [0,1]\text{,}$$ so each $$f_n$$ is bounded. However, $$\{ f_n \}$$ converges pointwise to

\begin{equation*} f(x) := \begin{cases} 0 & \text{if } x = 0,\\ \nicefrac{1}{x} & \text{else,} \end{cases} \end{equation*}

which is unbounded.

### Subsection 6.2.3 Derivative of the limit

While uniform convergence is enough to swap limits with integrals, it is not enough to swap limits with derivatives, unless the derivatives themselves also converge uniformly.

#### Example 6.2.8.

Let $$f_n(x) := \frac{\sin(nx)}{n}\text{.}$$ Then $$f_n$$ converges uniformly to 0. See Figure 6.6 (graphs of $$\frac{\sin(nx)}{n}$$ for $$n=1,2,\ldots,10\text{,}$$ with higher $$n$$ in lighter gray). The derivative of the limit is 0. But $$f_n'(x) = \cos(nx)\text{,}$$ which does not converge even pointwise; for example, $$f_n'(\pi) = {(-1)}^n\text{.}$$ Furthermore, $$f_n'(0) = 1$$ for all $$n\text{,}$$ which does converge, but not to $$0\text{.}$$
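A numerical sketch in Python of this example (informal; not part of the formal development):

```python
import math

def f_n(n, x):
    return math.sin(n * x) / n

def f_n_prime(n, x):
    return math.cos(n * x)

# |f_n(x)| <= 1/n for every x (since |sin| <= 1), so f_n -> 0 uniformly.
sample = [x / 10 for x in range(0, 70)]  # points of [0, 7)
print(max(abs(f_n(100, x)) for x in sample))  # at most 0.01

# But the derivatives at pi oscillate: cos(n*pi) = (-1)^n.
signs = [round(f_n_prime(n, math.pi)) for n in range(1, 7)]
print(signs)  # [-1, 1, -1, 1, -1, 1]
```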

#### Example 6.2.9.

Let $$f_n(x) := \frac{1}{1+nx^2}\text{.}$$ If $$x \not= 0\text{,}$$ then $$\lim_{n \to \infty} f_n(x) = 0\text{,}$$ but $$\lim_{n \to \infty} f_n(0) = 1\text{.}$$ Hence, $$\{ f_n \}$$ converges pointwise to a function that is not continuous at $$0\text{.}$$ We compute

\begin{equation*} f_n'(x) = \frac{-2 n x}{(1+ n x^2)^2} . \end{equation*}

For every $$x\text{,}$$ $$\lim_{n\to\infty} f_n'(x) = 0\text{,}$$ so the derivatives converge pointwise to 0, but the reader can check that the convergence is not uniform on any interval containing $$0\text{.}$$ The limit of $$f_n$$ is not differentiable at $$0$$—it is not even continuous at $$0\text{.}$$
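The blow-up of $$\sup \abs{f_n'}$$ near $$0$$ can be seen numerically; an informal Python sketch (the grid is an arbitrary choice):

```python
def fprime(n, x):
    return -2 * n * x / (1 + n * x * x) ** 2

# Pointwise, f_n'(x) -> 0 for each fixed x:
print([abs(fprime(n, 0.5)) for n in (10, 1000, 100000)])  # shrinking to 0

# But sup |f_n'| over [-0.1, 0.1] grows (roughly like sqrt(n)),
# so the convergence is not uniform on any interval containing 0.
grid = [k / 10**6 for k in range(-10**5, 10**5 + 1)]
sups = [max(abs(fprime(n, x)) for x in grid) for n in (10, 1000, 100000)]
print(sups)
```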

See the exercises for more examples. Using the fundamental theorem of calculus, we find an answer for continuously differentiable functions. The following theorem is true even if we do not assume continuity of the derivatives, but the proof is more difficult.

#### Theorem 6.2.10.

Let $$I$$ be a bounded interval and let $$f_n \colon I \to \R$$ be continuously differentiable functions. Suppose $$\{ f_n' \}$$ converges uniformly to $$g \colon I \to \R\text{,}$$ and suppose $$\bigl\{ f_n(c) \bigr\}$$ converges for some $$c \in I\text{.}$$ Then $$\{ f_n \}$$ converges uniformly to a continuously differentiable function $$f \colon I \to \R\text{,}$$ and $$f' = g\text{.}$$

#### Proof.

Define $$f(c) := \lim_{n\to \infty} f_n(c)\text{.}$$ As the functions $$f_n'$$ are continuous and hence Riemann integrable, the fundamental theorem of calculus gives, for $$x \in I\text{,}$$

\begin{equation*} f_n(x) = f_n(c) + \int_c^x f_n' . \end{equation*}

As $$\{ f_n' \}$$ converges uniformly on $$I\text{,}$$ it converges uniformly on $$[c,x]$$ (or $$[x,c]$$ if $$x < c$$). Thus, the limit as $$n \to \infty$$ on the right-hand side exists. Define $$f$$ at the remaining points (where $$x\neq c$$) by this limit:

\begin{equation*} f(x) := \lim_{n\to\infty} f_n(c) + \lim_{n\to\infty} \int_c^x f_n' = f(c) + \int_c^x g . \end{equation*}

The function $$g$$ is continuous, being the uniform limit of continuous functions. Hence $$f$$ is differentiable and $$f'(x) = g(x)$$ for all $$x \in I$$ by the second form of the fundamental theorem.

It remains to prove uniform convergence. Suppose $$I$$ has a lower bound $$a$$ and upper bound $$b\text{.}$$ Let $$\epsilon > 0$$ be given. Take $$M$$ such that for all $$n \geq M\text{,}$$ we have $$\abs{f(c)-f_n(c)} < \nicefrac{\epsilon}{2}$$ and $$\abs{g(x)-f_n'(x)} < \frac{\epsilon}{2(b-a)}$$ for all $$x \in I\text{.}$$ Then

\begin{equation*} \begin{split} \abs{f(x) - f_n(x)} & = \abs{\left(f(c) + \int_c^x g\right) - \left( f_n(c) + \int_c^x f_n' \right)} \\ & \leq \abs{f(c) - f_n(c)} + \abs{\int_c^x g - \int_c^x f_n'} \\ & = \abs{f(c) - f_n(c)} + \abs{\int_c^x \bigl(g(s) - f_n'(s)\bigr) \, ds} \\ & < \frac{\epsilon}{2} + \frac{\epsilon}{2(b-a)} (b-a) =\epsilon. \qedhere \end{split} \end{equation*}

The proof goes through without boundedness of $$I\text{,}$$ except for the uniform convergence of $$f_n$$ to $$f\text{.}$$ As an example suppose $$I = \R$$ and let $$f_n(x) := \nicefrac{x}{n}\text{.}$$ Then $$f_n'(x)=\nicefrac{1}{n}\text{,}$$ which converges uniformly to $$0\text{.}$$ However, $$\{f_n\}$$ converges to 0 only pointwise.
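A numerical sketch in Python of this remark (the witness points $$x = n$$ are an arbitrary choice):

```python
# f_n(x) = x/n on all of R: the derivatives 1/n converge uniformly to 0,
# but f_n -> 0 only pointwise, since sup over R of |f_n| is infinite.
def f_n(n, x):
    return x / n

# Pointwise convergence: for each fixed x, x/n -> 0.
print(abs(f_n(10**9, 12345.0)))  # tiny

# Not uniform on R: for every n, the point x = n gives f_n(n) = 1.
witnesses = [f_n(n, float(n)) for n in (10, 1000, 100000)]
print(witnesses)  # [1.0, 1.0, 1.0]
```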

### Subsection 6.2.4 Convergence of power series

In Section 2.6 we saw that a power series converges absolutely inside its radius of convergence, so it converges pointwise. Let us show that it (and all its derivatives) also converges uniformly on compact subintervals of the interval of convergence. This fact allows us to swap several types of limits. Not only is the limit continuous, we can integrate and even differentiate convergent power series term by term.

#### Theorem 6.2.11.

Let $$\sum_{n=0}^\infty c_n {(x-a)}^n$$ be a power series with radius of convergence $$\rho > 0$$ (possibly $$\rho = \infty$$). Then for every $$r$$ with $$0 < r < \rho\text{,}$$ the series converges uniformly on $$[a-r,a+r]\text{,}$$ and the limit is a continuous function on the interval of convergence.

#### Proof.

Let $$I := (a-\rho,a+\rho)$$ if $$\rho < \infty\text{,}$$ or let $$I := \R$$ if $$\rho= \infty\text{.}$$ Take $$0 < r < \rho\text{.}$$ The series converges absolutely for every $$x \in I\text{,}$$ in particular if $$x = a+r\text{.}$$ So $$\sum_{n=0}^\infty \abs{c_n} r^n$$ converges. Given $$\epsilon >0\text{,}$$ find $$M$$ such that for all $$k \geq M\text{,}$$

\begin{equation*} \sum_{n=k+1}^\infty \abs{c_n} {r}^n < \epsilon . \end{equation*}

For all $$x \in [a-r,a+r]$$ and all $$m > k\text{,}$$

\begin{multline*} \abs{\sum_{n=0}^m c_n {(x-a)}^n - \sum_{n=0}^k c_n {(x-a)}^n} = \abs{\sum_{n=k+1}^m c_n {(x-a)}^n} \\ \leq \sum_{n=k+1}^m \abs{c_n} {\abs{x-a}}^n \leq \sum_{n=k+1}^m \abs{c_n} {r}^n \leq \sum_{n=k+1}^\infty \abs{c_n} {r}^n <\epsilon. \end{multline*}

The partial sums are therefore uniformly Cauchy on $$[a-r,a+r]$$ and hence converge uniformly on that set.

Moreover, the partial sums are polynomials, which are continuous, and so their uniform limit on $$[a-r,a+r]$$ is a continuous function. As $$r < \rho$$ was arbitrary, the limit function is continuous on all of $$I\text{.}$$
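The uniform Cauchy estimate in the proof can be checked numerically on a concrete series; an informal Python sketch using the geometric series $$\sum x^n$$ (so $$a = 0\text{,}$$ $$c_n = 1\text{,}$$ $$\rho = 1$$) with $$r = 0.9$$:

```python
r = 0.9

def partial(k, x):  # k-th partial sum of the geometric series sum_{n>=0} x^n
    return sum(x**n for n in range(k + 1))

# The tail estimate from the proof, sum_{n=k+1}^infty |c_n| r^n with c_n = 1:
def tail_bound(k):
    return r ** (k + 1) / (1 - r)

grid = [i * r / 200 for i in range(-200, 201)]  # points of [-r, r]
sup_diff = max(abs(partial(200, x) - partial(100, x)) for x in grid)
print(sup_diff, tail_bound(100))  # sup_diff is below the tail bound
```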

As we said, we will show that power series can be differentiated and integrated term by term. The differentiated or integrated series is again a power series, and we will show it has the same radius of convergence. Therefore, any power series defines an infinitely differentiable function.

We first prove that we can antidifferentiate, as integration only needs uniform limits.

#### Corollary 6.2.12.

Let $$f(x) := \sum_{n=0}^\infty c_n {(x-a)}^n$$ be a power series with radius of convergence $$\rho > 0\text{.}$$ Then for every $$x$$ with $$\abs{x-a} < \rho\text{,}$$

\begin{equation*} \int_a^x f = \sum_{n=1}^\infty \frac{c_{n-1}}{n} {(x-a)}^{n} , \end{equation*}

and the integrated series has radius of convergence at least $$\rho\text{.}$$

#### Proof.

Take $$0 < r < \rho\text{.}$$ The partial sums $$\sum_{n=0}^k c_n {(x-a)}^n$$ converge uniformly on $$[a-r,a+r]\text{.}$$ For every fixed $$x \in [a-r,a+r]\text{,}$$ the convergence is also uniform on $$[a,x]$$ (or $$[x,a]$$ if $$x < a$$). Hence,

\begin{equation*} \int_a^x f = \int_a^x \lim_{k\to\infty} \sum_{n=0}^k c_n {(s-a)}^n \, ds = \lim_{k\to\infty} \int_a^x \sum_{n=0}^k c_n {(s-a)}^n \, ds = \lim_{k\to\infty} \sum_{n=1}^{k+1} \frac{c_{n-1}}{n} {(x-a)}^{n} . \qedhere \end{equation*}
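As an informal check of term-by-term integration on a concrete series, here is a Python sketch with the geometric series, whose term-by-term integral is $$\sum_{n \geq 1} \frac{x^n}{n} = -\log(1-x)$$ for $$\abs{x} < 1$$:

```python
import math

# Term-by-term integral of sum_{n>=0} s^n = 1/(1-s) over [0, x]:
# int_0^x 1/(1-s) ds = -log(1-x), and the integrated series is sum x^n / n.
def termwise(x, terms=2000):
    return sum(x**n / n for n in range(1, terms + 1))

for x in (0.5, -0.5, 0.9):
    print(x, termwise(x), -math.log(1 - x))  # the two columns agree
```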

#### Theorem 6.2.13.

Let $$f(x) := \sum_{n=0}^\infty c_n {(x-a)}^n$$ be a power series with radius of convergence $$\rho > 0\text{.}$$ Then $$f$$ is differentiable on the interval of convergence, and

\begin{equation*} f'(x) = \sum_{n=0}^\infty (n+1) c_{n+1} {(x-a)}^{n} , \end{equation*}

where the differentiated series also has radius of convergence $$\rho\text{.}$$

#### Proof.

Take $$0 < r < \rho\text{.}$$ The series converges uniformly on $$[a-r,a+r]\text{,}$$ but we need uniform convergence of the derivative. Let

\begin{equation*} R := \limsup_{n \to \infty} \abs{c_n}^{1/n} . \end{equation*}

As the series is convergent, $$R < \infty\text{,}$$ and the radius of convergence is $$\nicefrac{1}{R}$$ (or $$\infty$$ if $$R=0$$).

Let $$\epsilon > 0$$ be given. In Example 2.2.14, we saw $$\lim\,n^{1/n} = 1\text{.}$$ Hence there exists an $$N$$ such that for all $$n \geq N\text{,}$$ we have $$n^{1/n} < 1+\epsilon\text{.}$$ So

\begin{equation*} R = \limsup_{n \to \infty} \abs{c_n}^{1/n} \leq \limsup_{n \to \infty} \abs{n c_n}^{1/n} \leq (1+\epsilon) \limsup_{n \to \infty} \abs{c_n}^{1/n} = (1+\epsilon)R . \end{equation*}

As $$\epsilon$$ was arbitrary, $$\limsup_{n \to \infty} \abs{n c_n}^{1/n} = R\text{.}$$ Therefore, $$\sum_{n=1}^\infty n c_{n} {(x-a)}^{n}$$ has radius of convergence $$\rho\text{.}$$ By dividing by $$(x-a)\text{,}$$ we find $$\sum_{n=0}^\infty (n+1) c_{n+1} {(x-a)}^{n}$$ has radius of convergence $$\rho$$ as well.

Consequently, the partial sums $$\sum_{n=0}^k (n+1) c_{n+1} {(x-a)}^{n}\text{,}$$ which are derivatives of the partial sums $$\sum_{n=0}^{k+1} c_{n} {(x-a)}^{n}\text{,}$$ converge uniformly on $$[a-r,a+r]\text{.}$$ Furthermore, the series clearly converges at $$x=a\text{.}$$ We may thus apply Theorem 6.2.10, and we are done as $$r < \rho$$ was arbitrary.

#### Example 6.2.14.

We could have used this result to define the exponential function. That is, the power series

\begin{equation*} f(x) := \sum_{n=0}^\infty \frac{x^n}{n!} \end{equation*}

has radius of convergence $$\rho=\infty\text{.}$$ Furthermore, $$f(0) = 1\text{,}$$ and by differentiating term by term, we find that $$f'(x) = f(x)\text{.}$$
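An informal numerical check in Python that term-by-term differentiation of this series reproduces the series itself (the truncation at 60 terms is an arbitrary choice):

```python
import math

def f(x, terms=60):  # partial sum of sum_{n>=0} x^n / n!
    return sum(x**n / math.factorial(n) for n in range(terms))

def f_prime(x, terms=60):  # term-by-term derivative: sum_{n>=1} n x^(n-1) / n!
    return sum(n * x ** (n - 1) / math.factorial(n) for n in range(1, terms))

# f(0) = 1 and f' = f; compare also with the exponential from the math library.
for x in (0.0, 1.0, 2.5):
    print(x, f(x), f_prime(x), math.exp(x))
```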

#### Example 6.2.15.

The series

\begin{equation*} \sum_{n=1}^\infty n x^n \end{equation*}

converges to $$\frac{x}{{(1-x)}^2}$$ on $$(-1,1)\text{.}$$

Proof: On $$(-1,1)\text{,}$$ $$\sum_{n=0}^\infty x^n$$ converges to $$\frac{1}{1-x}\text{.}$$ The derivative $$\sum_{n=1}^\infty n x^{n-1}$$ then converges on the same interval to $$\frac{1}{{(1-x)}^2}\text{.}$$ Multiplying by $$x$$ gives the result.
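An informal numerical check in Python of this identity via partial sums (the truncation is an arbitrary choice):

```python
def partial(x, terms=5000):  # partial sum of sum_{n>=1} n x^n
    return sum(n * x**n for n in range(1, terms + 1))

for x in (0.5, -0.5, 0.9):
    print(x, partial(x), x / (1 - x) ** 2)  # the two columns agree
```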

### Subsection 6.2.5 Exercises

#### Exercise 6.2.1.

Find an explicit example of a sequence of differentiable functions on $$[-1,1]$$ that converge uniformly to a function $$f$$ such that $$f$$ is not differentiable. Hint: There are many possibilities; the simplest is perhaps to combine $$\abs{x}$$ and $$\frac{n}{2}x^2 + \frac{1}{2n}\text{,}$$ another is to consider $$\sqrt{x^2+{(\nicefrac{1}{n})}^2}\text{.}$$ Show that these functions are differentiable, converge uniformly, and then show that the limit is not differentiable.

#### Exercise 6.2.2.

Let $$f_n(x) := \frac{x^n}{n}\text{.}$$ Show that $$\{ f_n \}$$ converges uniformly to a differentiable function $$f$$ on $$[0,1]$$ (find $$f$$). However, show that $$f'(1) \not= \lim\limits_{n\to\infty} f_n'(1)\text{.}$$

#### Exercise 6.2.3.

Let $$f \colon [0,1] \to \R$$ be a Riemann integrable (hence bounded) function. Find $$\displaystyle \lim_{n\to\infty} \int_0^1 \frac{f(x)}{n} \,dx\text{.}$$

#### Exercise 6.2.4.

Show $$\displaystyle \lim_{n\to\infty} \int_1^2 e^{-nx^2} \,dx = 0\text{.}$$ Feel free to use what you know about the exponential function from calculus.

#### Exercise 6.2.5.

Find an example of a sequence of continuous functions on $$(0,1)$$ that converges pointwise to a continuous function on $$(0,1)\text{,}$$ but the convergence is not uniform.

Note: In the previous exercise, $$(0,1)$$ was picked for simplicity. For a more challenging exercise, replace $$(0,1)$$ with $$[0,1]\text{.}$$

#### Exercise 6.2.6.

True/False; prove or find a counterexample to the following statement: If $$\{ f_n \}$$ is a sequence of everywhere discontinuous functions on $$[0,1]$$ that converge uniformly to a function $$f\text{,}$$ then $$f$$ is everywhere discontinuous.

#### Exercise 6.2.7.

For a continuously differentiable function $$f \colon [a,b] \to \R\text{,}$$ define

\begin{equation*} \norm{f}_{C^1} := \norm{f}_u + \norm{f'}_u . \end{equation*}

Suppose $$\{ f_n \}$$ is a sequence of continuously differentiable functions such that for every $$\epsilon >0\text{,}$$ there exists an $$M$$ such that for all $$n,k \geq M\text{,}$$ we have

\begin{equation*} \norm{f_n-f_k}_{C^1} < \epsilon . \end{equation*}

Show that $$\{ f_n \}$$ converges uniformly to some continuously differentiable function $$f \colon [a,b] \to \R\text{.}$$

Suppose $$f \colon [0,1] \to \R$$ is Riemann integrable. For the following two exercises define the number

\begin{equation*} \norm{f}_{L^1} := \int_0^1 \abs{f(x)}\,dx . \end{equation*}

It is true that $$\abs{f}$$ is integrable whenever $$f$$ is; see Exercise 5.2.15. This number is called the $$L^1$$-norm and defines another very common type of convergence called $$L^1$$-convergence. It is, however, a bit more subtle.

#### Exercise 6.2.8.

Suppose $$\{ f_n \}$$ is a sequence of Riemann integrable functions on $$[0,1]$$ that converges uniformly to $$0\text{.}$$ Show that

\begin{equation*} \lim_{n\to\infty} \norm{f_n}_{L^1} = 0 . \end{equation*}

#### Exercise 6.2.9.

Find a sequence $$\{ f_n \}$$ of Riemann integrable functions on $$[0,1]$$ converging pointwise to $$0\text{,}$$ but

\begin{equation*} \lim_{n\to\infty} \norm{f_n}_{L^1} \text{ does not exist (is } \infty\text{)}. \end{equation*}

#### Exercise 6.2.10.

(Hard)   Prove Dini's theorem: Let $$f_n \colon [a,b] \to \R$$ be a sequence of continuous functions such that

\begin{equation*} 0 \leq f_{n+1}(x) \leq f_n(x) \leq \cdots \leq f_1(x) \qquad \text{for all } n \in \N. \end{equation*}

Suppose $$\{ f_n \}$$ converges pointwise to $$0\text{.}$$ Show that $$\{ f_n \}$$ converges to zero uniformly.

#### Exercise 6.2.11.

Suppose $$f_n \colon [a,b] \to \R$$ is a sequence of continuous functions that converges pointwise to a continuous $$f \colon [a,b] \to \R\text{.}$$ Suppose that for every $$x \in [a,b]\text{,}$$ the sequence $$\{ \abs{f_n(x)-f(x)} \}$$ is monotone. Show that the sequence $$\{f_n\}$$ converges uniformly.

#### Exercise 6.2.12.

Find a sequence of Riemann integrable functions $$f_n \colon [0,1] \to \R$$ such that $$\{ f_n \}$$ converges to zero pointwise, and such that

1. $$\bigl\{ \int_0^1 f_n \bigr\}_{n=1}^\infty$$ increases without bound,

2. $$\bigl\{ \int_0^1 f_n \bigr\}_{n=1}^\infty$$ is the sequence $$-1,1,-1,1,-1,1, \ldots\text{.}$$

It is possible to define a joint limit of a double sequence $$\{ x_{n,m} \}$$ of real numbers (that is a function from $$\N \times \N$$ to $$\R$$). We say $$L$$ is the joint limit of $$\{ x_{n,m} \}$$ and write

\begin{equation*} \lim_{\substack{n\to\infty\\m\to\infty}} x_{n,m} = L , \qquad \text{or} \qquad \lim_{(n,m) \to \infty} x_{n,m} = L , \end{equation*}

if for every $$\epsilon > 0\text{,}$$ there exists an $$M$$ such that if $$n \geq M$$ and $$m \geq M\text{,}$$ then $$\abs{x_{n,m} - L} < \epsilon\text{.}$$

#### Exercise 6.2.13.

Suppose the joint limit (see above) of $$\{ x_{n,m} \}$$ is $$L\text{,}$$ and suppose that for all $$n\text{,}$$ $$\lim\limits_{m \to \infty} x_{n,m}$$ exists, and for all $$m\text{,}$$ $$\lim\limits_{n \to \infty} x_{n,m}$$ exists. Then show $$\lim\limits_{n\to\infty}\lim\limits_{m \to \infty} x_{n,m} = \lim\limits_{m\to\infty}\lim\limits_{n \to \infty} x_{n,m} = L\text{.}$$

#### Exercise 6.2.14.

A joint limit (see above) does not mean the iterated limits exist. Consider $$x_{n,m} := \frac{{(-1)}^{n+m}}{\min \{n,m \}}\text{.}$$

1. Show that for no $$n$$ does $$\lim\limits_{m \to \infty} x_{n,m}$$ exist, and for no $$m$$ does $$\lim\limits_{n \to \infty} x_{n,m}$$ exist. So neither $$\lim\limits_{n\to\infty}\lim\limits_{m \to \infty} x_{n,m}$$ nor $$\lim\limits_{m\to\infty}\lim\limits_{n \to \infty} x_{n,m}$$ makes any sense at all.

2. Show that the joint limit of $$\{ x_{n,m} \}$$ exists and equals 0.

#### Exercise 6.2.15.

We say that a sequence of functions $$f_n \colon \R \to \R$$ converges uniformly on compact subsets if for every $$k \in \N\text{,}$$ the sequence $$\{ f_n \}$$ converges uniformly on $$[-k,k]\text{.}$$

1. Prove that if $$f_n \colon \R \to \R$$ is a sequence of continuous functions converging uniformly on compact subsets, then the limit is continuous.

2. Prove that if $$f_n \colon \R \to \R$$ is a sequence of functions Riemann integrable on every closed and bounded interval $$[a,b]\text{,}$$ and converging uniformly on compact subsets to an $$f \colon \R \to \R\text{,}$$ then for every interval $$[a,b]\text{,}$$ we have $$f \in \sR[a,b]\text{,}$$ and $$\int_a^b f = \lim_{n\to\infty} \int_a^b f_n\text{.}$$

#### Exercise 6.2.16.

(Challenging)   Find a sequence of continuous functions $$f_n \colon [0,1] \to \R$$ that converge pointwise to the popcorn function $$f \colon [0,1] \to \R\text{,}$$ that is, the function such that $$f(\nicefrac{p}{q}) := \nicefrac{1}{q}$$ (if $$\nicefrac{p}{q}$$ is in lowest terms) and $$f(x) := 0$$ if $$x$$ is not rational (note that $$f(0) = f(1) = 1$$), see Example 3.2.12. So a pointwise limit of continuous functions can have a dense set of discontinuities. See also the next exercise.

#### Exercise 6.2.17.

(Challenging)   The Dirichlet function $$f \colon [0,1] \to \R\text{,}$$ that is, the function such that $$f(x) := 1$$ if $$x \in \Q$$ and $$f(x) := 0$$ if $$x \notin \Q\text{,}$$ is not the pointwise limit of continuous functions, although this is difficult to show. Prove, however, that $$f$$ is a pointwise limit of functions that are themselves pointwise limits of continuous functions.

#### Exercise 6.2.18.

1. Find a sequence of Lipschitz continuous functions on $$[0,1]$$ whose uniform limit is $$\sqrt{x}\text{,}$$ which is a non-Lipschitz function.

2. On the other hand, show that if $$f_n \colon S \to \R$$ are Lipschitz with a uniform constant $$K$$ (meaning all of them satisfy the definition with the same constant) and $$\{ f_n \}$$ converges pointwise to $$f \colon S \to \R\text{,}$$ then the limit $$f$$ is a Lipschitz continuous function with Lipschitz constant $$K\text{.}$$

#### Exercise 6.2.19.

(requires Section 2.6)   If $$\sum_{n=0}^\infty c_n {(x-a)}^n$$ has radius of convergence $$\rho\text{,}$$ show that the term by term integral $$\sum_{n=1}^\infty \frac{c_{n-1}}{n} {(x-a)}^n$$ has radius of convergence $$\rho\text{.}$$ Note that we only proved above that the radius of convergence was at least $$\rho\text{.}$$

#### Exercise 6.2.20.

(requires Section 2.6 and Section 4.3)   Suppose $$f(x) := \sum_{n=0}^\infty c_n {(x-a)}^n$$ converges in $$(a-\rho,a+\rho)\text{.}$$

1. Suppose that $$f^{(k)}(a) = 0$$ for all $$k=0,1,2,3,\ldots\text{.}$$ Prove that $$c_n = 0$$ for all $$n\text{,}$$ or in other words, $$f(x) = 0$$ for all $$x \in (a-\rho,a+\rho)\text{.}$$

2. Using part a) prove a version of the so-called “identity theorem for analytic functions”: If there exists an $$\epsilon > 0$$ such that $$f(x) = 0$$ for all $$x \in (a-\epsilon, a+\epsilon)\text{,}$$ then $$f(x) = 0$$ for all $$x \in (a-\rho,a+\rho)\text{.}$$

#### Exercise 6.2.21.

Let $$f_n(x) := \frac{x}{1+{(nx)}^2}\text{.}$$ Notice that $$f_n$$ are differentiable functions.

1. Show that $$\{ f_n \}$$ converges uniformly to 0.

2. Show that $$\sabs{f_n'(x)} \leq 1$$ for all $$x$$ and all $$n\text{.}$$

3. Show that $$\{ f_n' \}$$ converges pointwise to a function discontinuous at the origin.

4. Let $$\{ a_n \}$$ be an enumeration of the rational numbers. Define

\begin{equation*} g_n(x) := \sum_{k=1}^n 2^{-k} f_n(x-a_k) . \end{equation*}

Show that $$\{ g_n \}$$ converges uniformly to 0.

5. Show that $$\{ g_n' \}$$ converges pointwise to a function $$\psi$$ that is discontinuous at every rational number and continuous at every irrational number. In particular, $$\lim_{n\to\infty} g_n'(x) \not= 0$$ for every rational number $$x\text{.}$$

Note (to Theorem 6.2.4): Weaker conditions are sufficient for this kind of theorem, but to prove such a generalization requires more sophisticated machinery than we cover here: the Lebesgue integral. In particular, the theorem holds with pointwise convergence as long as $$f$$ is integrable and there is an $$M$$ such that $$\snorm{f_n}_u \leq M$$ for all $$n\text{.}$$