## Section 11.2 Swapping limits

Note: 2 lectures

### Subsection 11.2.1 Continuity

Let us get back to swapping limits and expand on Chapter 6. Let $$\{ f_n \}$$ be a sequence of functions $$f_n \colon X \to Y$$ for a set $$X$$ and a metric space $$Y\text{.}$$ Let $$f \colon X \to Y$$ be a function and for every $$x \in X$$ suppose that

\begin{equation*} f(x) = \lim_{n\to \infty} f_n(x) . \end{equation*}

We say the sequence $$\{ f_n \}$$ converges pointwise to $$f\text{.}$$

For $$Y=\C\text{,}$$ a series converges pointwise if for every $$x \in X\text{,}$$ we have

\begin{equation*} f(x) = \lim_{n\to \infty} \sum_{k=1}^n f_k(x) = \sum_{k=1}^\infty f_k(x) . \end{equation*}

The question is: If $$f_n$$ are all continuous, is $$f$$ continuous? Differentiable? Integrable? What are the derivatives or integrals of $$f\text{?}$$

For example, for continuity of the pointwise limit of a sequence $$\{ f_n \}\text{,}$$ we are asking if

\begin{equation*} \lim_{x\to x_0} \lim_{n\to\infty} f_n(x) \overset{?}{=} \lim_{n\to\infty} \lim_{x\to x_0} f_n(x) . \end{equation*}

We do not even know a priori whether both sides exist, let alone whether they are equal to each other.

#### Example 11.2.1.

The functions $$f_n \colon \R \to \R\text{,}$$

\begin{equation*} f_n(x) := \frac{1}{1+nx^2}, \end{equation*}

are continuous and converge pointwise to the discontinuous function

\begin{equation*} f(x) := \begin{cases} 1 & \text{if } x=0, \\ 0 & \text{else.} \end{cases} \end{equation*}

So pointwise convergence is not enough to preserve continuity (nor even boundedness). For that, we need uniform convergence.
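As a quick numerical illustration (a Python sketch; the function and variable names are ours, not from the text), one can watch the pointwise limit of Example 11.2.1 emerge and also see why the convergence fails to be uniform:

```python
import math

def f_n(n, x):
    # The functions from the example above.
    return 1.0 / (1.0 + n * x * x)

# Pointwise limit: always 1 at x = 0, and tending to 0 at every fixed x != 0.
values_at_zero = [f_n(n, 0.0) for n in (1, 10, 1000)]
values_off_zero = [f_n(n, 0.1) for n in (1, 10, 1000)]

# The convergence is not uniform: at x = 1/sqrt(n) we have f_n(x) = 1/2
# for every n, so sup_x |f_n(x) - f(x)| >= 1/2 never shrinks.
witness = [f_n(n, 1.0 / math.sqrt(n)) for n in (1, 10, 1000)]
print(values_at_zero, values_off_zero, witness)
```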

Let $$f_n \colon X \to Y$$ be functions. Then $$\{f_n\}$$ converges uniformly to $$f$$ if for every $$\epsilon > 0\text{,}$$ there exists an $$M$$ such that for all $$n \geq M$$ and all $$x \in X\text{,}$$ we have

\begin{equation*} d\bigl(f_n(x),f(x)\bigr) < \epsilon . \end{equation*}

A series $$\sum f_n$$ of complex-valued functions converges uniformly if the sequence of partial sums converges uniformly to the pointwise sum $$f\text{,}$$ that is, if for every $$\epsilon > 0$$ there exists an $$M$$ such that for all $$n \geq M$$ and all $$x \in X$$

\begin{equation*} \abs{\left(\sum_{k=1}^n f_k(x)\right)-f(x)} < \epsilon . \end{equation*}

The simplest property preserved by uniform convergence is boundedness: If the $$f_n$$ are bounded functions and $$\{f_n\}$$ converges uniformly to $$f\text{,}$$ then $$f$$ is bounded. We leave the proof of this proposition as an exercise. It is almost identical to the proof for real-valued functions.

If $$X$$ is a set and $$(Y,d)$$ is a metric space, then a sequence $$f_n \colon X \to Y$$ is uniformly Cauchy if for every $$\epsilon > 0\text{,}$$ there is an $$M$$ such that for all $$n, m \geq M$$ and all $$x \in X\text{,}$$ we have

\begin{equation*} d\bigl(f_n(x),f_m(x)\bigr) < \epsilon . \end{equation*}

The notion is the same as for real-valued functions, as is the following proposition: If $$Y$$ is complete, then $$\{f_n\}$$ converges uniformly if and only if it is uniformly Cauchy. The proof is again essentially the same as in that setting and is left as an exercise.

For $$f \colon X \to \C\text{,}$$ we write

\begin{equation*} \snorm{f}_u := \sup_{x \in X} \sabs{f(x)} . \end{equation*}

We call $$\snorm{\cdot}_u$$ the supremum norm or uniform norm. Then a sequence of functions $$f_n \colon X \to \C$$ converges uniformly to $$f \colon X \to \C$$ if and only if

\begin{equation*} \lim_{n\to \infty} \snorm{f_n-f}_u = 0 . \end{equation*}
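The sup-norm characterization lends itself to a numerical sanity check. The sketch below (a rough illustration; it approximates the supremum by a maximum over a finite grid, which is only a lower bound) uses $$f_n(x) = \frac{\sin(nx)}{n}\text{,}$$ whose uniform norm is $$\nicefrac{1}{n}\text{,}$$ so the sequence converges uniformly to zero:

```python
import math

def unif_norm(f, grid):
    # Approximate ||f||_u by a maximum over finitely many sample points.
    # This is only a lower bound for the true supremum.
    return max(abs(f(x)) for x in grid)

# A grid covering [0, 2*pi); by periodicity this suffices for sin(n x)/n.
grid = [k * 0.001 for k in range(6284)]

for n in (1, 10, 100):
    err = unif_norm(lambda x: math.sin(n * x) / n, grid)
    print(n, err)  # approximately 1/n, tending to 0
```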

The supremum norm satisfies the triangle inequality: For every $$x \in X\text{,}$$

\begin{equation*} \sabs{f(x)+g(x)} \leq \sabs{f(x)}+\sabs{g(x)} \leq \snorm{f}_u+\snorm{g}_u . \end{equation*}

Take a supremum on the left to get

\begin{equation*} \snorm{f+g}_u \leq \snorm{f}_u+\snorm{g}_u . \end{equation*}

For a compact metric space $$X\text{,}$$ the uniform norm is a norm on the vector space $$C(X,\C)\text{;}$$ we leave the proof as an exercise. While we will not need it, $$C(X,\C)$$ is in fact a complex vector space, that is, in the definition of a vector space we can replace $$\R$$ with $$\C\text{.}$$ Convergence in the metric space $$C(X,\C)$$ is uniform convergence.

We will study a couple of types of series of functions, and a useful test for uniform convergence of a series is the so-called Weierstrass $$M$$-test: If $$f_n \colon X \to \C$$ are functions, $$M_n \geq 0$$ are numbers with $$\sabs{f_n(x)} \leq M_n$$ for all $$x \in X\text{,}$$ and $$\sum M_n$$ converges, then $$\sum f_n$$ converges uniformly.

Another way to state the theorem is to say that if $$\sum \snorm{f_n}_u$$ converges, then $$\sum f_n$$ converges uniformly. The converse is not true (see Exercise 11.2.8). Note also that applying the theorem to $$\sum \sabs{f_n(x)}$$ shows that this series converges uniformly as well, so a series satisfying the $$M$$-test converges both absolutely and uniformly.

#### Proof.

Suppose $$\sum M_n$$ converges. Given $$\epsilon > 0\text{,}$$ the partial sums of $$\sum M_n$$ form a Cauchy sequence, so there is an $$N$$ such that for all $$m, n \geq N$$ with $$m \geq n\text{,}$$ we have

\begin{equation*} \sum_{k=n+1}^m M_k < \epsilon . \end{equation*}

We estimate a Cauchy difference of the partial sums of the series of functions:

\begin{equation*} \abs{\sum_{k=n+1}^m f_k(x)} \leq \sum_{k=n+1}^m \sabs{f_k(x)} \leq \sum_{k=n+1}^m M_k < \epsilon . \end{equation*}

We are done by Proposition 11.1.4.

#### Example 11.2.5.

The series

\begin{equation*} \sum_{n=1}^\infty \frac{\sin(nx)}{n^2} \end{equation*}

converges uniformly on $$\R\text{.}$$ See Figure 11.2. This is a Fourier series; we will see more of these in a later section. Proof: The series converges uniformly by the $$M$$-test (Theorem 11.2.4), since $$\sum_{n=1}^\infty \frac{1}{n^2}$$ converges and

\begin{equation*} \abs{\frac{\sin(nx)}{n^2}} \leq \frac{1}{n^2} . \end{equation*}
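The $$M$$-test bound can be observed numerically. The sketch below (a rough illustration; the truncation levels and sample grid are our choices) verifies that a Cauchy difference of partial sums of this Fourier series is bounded by the corresponding tail of $$\sum \frac{1}{n^2}$$ at every sampled point:

```python
import math

def S(N, x):
    # Partial sum of the Fourier series sum_{n>=1} sin(n x)/n^2.
    return sum(math.sin(n * x) / n**2 for n in range(1, N + 1))

# M-test bound: |S_200(x) - S_50(x)| <= sum_{n=51}^{200} 1/n^2, uniformly in x.
xs = [k * 0.05 for k in range(-100, 101)]
bound = sum(1.0 / n**2 for n in range(51, 201))
worst = max(abs(S(200, x) - S(50, x)) for x in xs)
print(worst, bound)  # the worst sampled difference respects the bound
```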

#### Example 11.2.6.

The series

\begin{equation*} \sum_{n=0}^\infty \frac{x^n}{n!} \end{equation*}

converges uniformly on every bounded interval. This series is a power series that we will study shortly. Proof: Take the interval $$[-r,r] \subset \R$$ (every bounded interval is contained in some $$[-r,r]$$). The series $$\sum_{n=0}^\infty \frac{r^n}{n!}$$ converges by the ratio test, so $$\sum_{n=0}^\infty \frac{x^n}{n!}$$ converges uniformly on $$[-r,r]$$ as

\begin{equation*} \abs{\frac{x^n}{n!} } \leq \frac{r^n}{n!} . \end{equation*}
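As an illustration (a Python sketch; it uses the standard fact that this power series sums to the exponential function), the worst-case error of the partial sums over a grid on $$[-2,2]$$ shrinks rapidly with the truncation level:

```python
import math

def partial_exp(N, x):
    # Partial sum sum_{n=0}^{N} x^n/n!, accumulated term by term.
    term, total = 1.0, 1.0
    for n in range(1, N + 1):
        term *= x / n
        total += term
    return total

r = 2.0
xs = [-r + k * 0.01 for k in range(401)]  # a grid on [-r, r]
for N in (5, 10, 20):
    err = max(abs(partial_exp(N, x) - math.exp(x)) for x in xs)
    print(N, err)  # worst-case error on [-2, 2] shrinks rapidly
```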

Now we would love to say something about the limit. For example, is it continuous?

Suppose $$f_n \colon X \to Y$$ converge uniformly to $$f\text{,}$$ where $$(Y,d_Y)$$ is a complete metric space, $$\{ x_k \}$$ is a sequence in $$X\text{,}$$ and for each $$n$$ the limit $$a_n := \lim_{k\to\infty} f_n(x_k)$$ exists. Then $$\{ a_n \}$$ converges and $$\lim_{n\to\infty} a_n = \lim_{k\to\infty} f(x_k)\text{.}$$ In other words,

\begin{equation*} \lim_{k \to \infty} \lim_{n\to\infty} f_n(x_k) = \lim_{n \to \infty} \lim_{k\to\infty} f_n(x_k) . \end{equation*}

#### Proof.

First we show that $$\{ a_n \}$$ converges. As $$\{ f_n \}$$ converges uniformly it is uniformly Cauchy. Let $$\epsilon > 0$$ be given. There is an $$M$$ such that for all $$m,n \geq M\text{,}$$ we have

\begin{equation*} d_Y\bigl(f_n(x_k),f_m(x_k)\bigr) < \epsilon \qquad \text{for all } k . \end{equation*}

Note that $$d_Y(a_n,a_m) \leq d_Y\bigl(a_n,f_n(x_k)\bigr) + d_Y\bigl(f_n(x_k),f_m(x_k)\bigr) + d_Y\bigl(f_m(x_k),a_m\bigr)$$ and take the limit as $$k \to \infty$$ to find

\begin{equation*} d_Y(a_n,a_m) \leq \epsilon . \end{equation*}

Hence $$\{a_n\}$$ is Cauchy and converges since $$Y$$ is complete. Write $$a := \lim \, a_n\text{.}$$

Next, we show that $$\lim_{m\to\infty} f(x_m) = a\text{.}$$ Let $$\epsilon > 0$$ be given. Find a $$k \in \N$$ such that

\begin{equation*} d_Y\bigl(f_k(p),f(p)\bigr) < \nicefrac{\epsilon}{3} \end{equation*}

for all $$p \in X\text{.}$$ Assume $$k$$ is large enough so that

\begin{equation*} d_Y(a_k,a) < \nicefrac{\epsilon}{3} . \end{equation*}

Find an $$N \in \N$$ such that for $$m \geq N\text{,}$$

\begin{equation*} d_Y\bigl(f_k(x_m),a_k\bigr) < \nicefrac{\epsilon}{3} . \end{equation*}

Then for $$m \geq N\text{,}$$

\begin{equation*} d_Y\bigl(f(x_m),a\bigr) \leq d_Y\bigl(f(x_m),f_k(x_m)\bigr) + d_Y\bigl(f_k(x_m),a_k\bigr) + d_Y\bigl(a_k,a\bigr) < \nicefrac{\epsilon}{3} + \nicefrac{\epsilon}{3} + \nicefrac{\epsilon}{3} = \epsilon . \qedhere \end{equation*}

We obtain an immediate corollary about continuity: A uniform limit of continuous functions is continuous (Corollary 11.2.8).

The converse is not true. Just because the limit is continuous does not mean that the convergence is uniform. For example, the functions $$f_n \colon (0,1) \to \R$$ defined by $$f_n(x) := x^n$$ converge to the zero function, but not uniformly. However, if we add extra conditions on the sequence, we can obtain a partial converse such as Dini's theorem, see Exercise 6.2.10.
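The failure of uniform convergence for $$f_n(x) := x^n$$ on $$(0,1)$$ can be seen concretely: at the points $$x_n := (\nicefrac{1}{2})^{1/n}\text{,}$$ which lie in $$(0,1)\text{,}$$ we get $$f_n(x_n) = \nicefrac{1}{2}$$ for every $$n\text{.}$$ A short Python sketch:

```python
# At x_n = (1/2)^(1/n), which lies in (0,1), we get f_n(x_n) = 1/2, so
# sup over (0,1) of |f_n(x) - 0| >= 1/2 for every n: no uniform convergence.
sup_witness = {}
for n in (1, 10, 1000):
    x_n = 0.5 ** (1.0 / n)
    sup_witness[n] = x_n ** n
    print(n, x_n, sup_witness[n])  # x_n -> 1 while f_n(x_n) stays at 1/2
```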

In Exercise 11.2.3 the reader is asked to prove that for a compact $$X\text{,}$$ $$C(X,\C)$$ is a normed vector space with the uniform norm, and hence a metric space. We have just shown that $$C(X,\C)$$ is Cauchy-complete: Proposition 11.2.3 says that a Cauchy sequence in $$C(X,\C)$$ converges uniformly to some function, and Corollary 11.2.8 shows that the limit is continuous and hence in $$C(X,\C)\text{.}$$

#### Example 11.2.10.

By Example 11.2.5 the Fourier series

\begin{equation*} \sum_{n=1}^\infty \frac{\sin(nx)}{n^2} \end{equation*}

converges uniformly and hence is continuous by Corollary 11.2.8 (as is visible in Figure 11.2).

### Subsection 11.2.2 Integration

If $$f_n \colon [a,b] \to \C$$ are Riemann integrable and converge uniformly to $$f\text{,}$$ then $$f$$ is Riemann integrable and $$\int_a^b f = \lim_{n\to\infty} \int_a^b f_n\text{.}$$ Since the integral of a complex-valued function is just the integral of the real and imaginary parts separately, the proof follows directly from the results of Chapter 6. We leave the details as an exercise.

#### Example 11.2.13.

Let us show how to integrate a Fourier series.

\begin{equation*} \int_{0}^x \sum_{n=1}^\infty \frac{\cos(nt)}{n^2} \,dt = \sum_{n=1}^\infty \int_{0}^x \frac{\cos(nt)}{n^2}\,dt = \sum_{n=1}^\infty \frac{\sin(nx)}{n^3} \end{equation*}

The swapping of integral and sum is possible because of uniform convergence, which we have proved before using the Weierstrass $$M$$-test (Theorem 11.2.4).
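A numerical check of the term-by-term integration above (a Python sketch; the truncation level, quadrature rule, and step count are our choices) compares a trapezoid-rule integral of a partial sum with the corresponding partial sum of integrals:

```python
import math

N = 200  # truncation level for both sides (our choice)

def series_cos(t):
    # Partial sum of sum_{n>=1} cos(n t)/n^2.
    return sum(math.cos(n * t) / n**2 for n in range(1, N + 1))

def trapezoid(f, a, b, steps=2000):
    # Simple composite trapezoid rule.
    h = (b - a) / steps
    return h * (f(a) / 2 + f(b) / 2 + sum(f(a + k * h) for k in range(1, steps)))

x = 1.5
lhs = trapezoid(series_cos, 0.0, x)                         # integral of the sum
rhs = sum(math.sin(n * x) / n**3 for n in range(1, N + 1))  # sum of the integrals
print(lhs, rhs)  # agree up to discretization error
```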

We remark that we can swap integrals and limits under far less stringent hypotheses, but for that we would need a stronger integral than the Riemann integral, for example, the Lebesgue integral.

### Subsection 11.2.3 Differentiation

Recall that a complex-valued function $$f \colon [a,b] \to \C\text{,}$$ where $$f(x) = u(x)+i\,v(x)\text{,}$$ is differentiable if $$u$$ and $$v$$ are differentiable, and in that case the derivative is

\begin{equation*} f'(x) = u'(x)+i\,v'(x) . \end{equation*}

The proof of the following theorem is to apply the corresponding theorem for real functions to $$u$$ and $$v\text{,}$$ and is left as an exercise.

Uniform convergence of the functions themselves is not enough to preserve differentiability, and can make matters even worse. In Section 11.7 we will prove that continuous functions are uniform limits of polynomials, yet as the following example demonstrates, a continuous function need not be differentiable anywhere.

#### Example 11.2.15.

There exist continuous nowhere differentiable functions. Such functions are often called Weierstrass functions, although this particular one, essentially due to Takagi, is a different example from the one Weierstrass gave.

Define

\begin{equation*} \varphi(x) :=\sabs{x} \qquad \text{for } x \in [-1,1] . \end{equation*}

Extend the definition of $$\varphi$$ to all of $$\R$$ by making it 2-periodic: Decree that $$\varphi(x) = \varphi(x+2)\text{.}$$ The function $$\varphi \colon \R \to \R$$ is continuous, in fact $$\sabs{\varphi(x)-\varphi(y)} \leq \sabs{x-y}$$ (why?). See Figure 11.3.

As $$\sum {\left(\frac{3}{4}\right)}^n$$ converges and $$\sabs{\varphi(x)} \leq 1$$ for all $$x\text{,}$$ we have by the $$M$$-test (Theorem 11.2.4) that

\begin{equation*} f(x) := \sum_{n=0}^\infty {\left(\frac{3}{4}\right)}^n \varphi(4^n x) \end{equation*}

converges uniformly and hence is continuous. See Figure 11.4.

We claim $$f \colon \R \to \R$$ is nowhere differentiable. Fix $$x\text{;}$$ we will show that $$f$$ is not differentiable at $$x\text{.}$$ Define

\begin{equation*} \delta_m := \pm \frac{1}{2} 4^{-m} , \end{equation*}

where the sign is chosen so that there is no integer between $$4^m x$$ and $$4^m(x+\delta_m) = 4^m x \pm \frac{1}{2}\text{.}$$

We want to look at the difference quotient

\begin{equation*} \frac{f(x+\delta_m)-f(x)}{\delta_m} = \sum_{n=0}^\infty {\left(\frac{3}{4}\right)}^n \frac{\varphi\bigl(4^n(x+\delta_m)\bigr)-\varphi(4^nx)}{\delta_m} . \end{equation*}

Fix $$m$$ for a moment. Consider the expression inside the series:

\begin{equation*} \gamma_{n} := \frac{\varphi\bigl(4^n(x+\delta_m)\bigr)-\varphi(4^nx)}{\delta_m} . \end{equation*}

If $$n > m\text{,}$$ then $$4^n\delta_m$$ is an even integer. As $$\varphi$$ is 2-periodic we get that $$\gamma_n = 0\text{.}$$

As there is no integer between $$4^m(x+\delta_m) = 4^m x\pm\nicefrac{1}{2}$$ and $$4^m x\text{,}$$ then on this interval $$\varphi(t) = \pm t + \ell$$ for some integer $$\ell\text{.}$$ In particular, $$\abs{\varphi\bigl(4^m(x+ \delta_m)\bigr)-\varphi(4^mx)} = \abs{4^mx\pm\nicefrac{1}{2}-4^mx} = \nicefrac{1}{2}\text{.}$$ Therefore,

\begin{equation*} \sabs{\gamma_m} = \abs{ \frac{\varphi\bigl(4^m(x+\delta_m)\bigr)-\varphi(4^mx)}{\pm (\nicefrac{1}{2}) 4^{-m}} } = 4^m . \end{equation*}

Similarly, suppose $$n < m\text{.}$$ Since $$\sabs{\varphi(s) -\varphi(t)} \leq \sabs{s-t}\text{,}$$

\begin{equation*} \sabs{\gamma_n} = \abs{\frac{\varphi\bigl(4^nx\pm(\nicefrac{1}{2})4^{n-m}\bigr)-\varphi(4^nx)}{\pm (\nicefrac{1}{2}) 4^{-m}}} \leq \abs{\frac{\pm(\nicefrac{1}{2})4^{n-m}}{\pm (\nicefrac{1}{2}) 4^{-m}}} = 4^n . \end{equation*}

And so

\begin{equation*} \begin{split} \abs{ \frac{f(x+\delta_m)-f(x)}{\delta_m} } = \abs{ \sum_{n=0}^\infty {\left(\frac{3}{4}\right)}^n \gamma_n } & = \abs{ \sum_{n=0}^m {\left(\frac{3}{4}\right)}^n \gamma_n } \\ & \geq \abs{ {\left(\frac{3}{4}\right)}^m \gamma_m} - \abs{ \sum_{n=0}^{m-1} {\left(\frac{3}{4}\right)}^n \gamma_n } \\ & \geq 3^m - \sum_{n=0}^{m-1} 3^n = 3^m - \frac{3^{m}-1}{3-1} = \frac{3^m +1}{2} . \end{split} \end{equation*}

As $$m \to \infty\text{,}$$ we have $$\delta_m \to 0\text{,}$$ but $$\frac{3^m+1}{2}$$ goes to infinity. Hence $$f$$ cannot be differentiable at $$x\text{.}$$
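The blow-up of the difference quotients can be observed directly. In the sketch below (names ours) we take $$x = 0$$ and $$\delta_m = \frac{1}{2} 4^{-m}\text{;}$$ as in the argument above, the terms with $$n > m$$ vanish exactly, so a truncated sum computes the difference quotient exactly:

```python
def phi(x):
    # The 2-periodic extension of |x| on [-1, 1]: distance to nearest even integer.
    return abs(((x + 1.0) % 2.0) - 1.0)

def f_partial(x, terms):
    # Truncated sum of the nowhere differentiable function from the example.
    return sum(0.75**n * phi(4**n * x) for n in range(terms))

def quotient(m):
    # Difference quotient at x = 0 with delta_m = (1/2) 4^{-m}.  Terms with
    # n > m vanish (4^n * delta_m is an even integer), so the truncated sum
    # computes the quotient exactly.
    delta = 0.5 * 4.0 ** (-m)
    return abs(f_partial(delta, m + 5) - f_partial(0.0, m + 5)) / delta

for m in (1, 2, 3, 5):
    print(m, quotient(m), (3**m + 1) / 2)  # quotient >= (3^m + 1)/2
```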

### Subsection 11.2.4 Exercises

#### Exercise 11.2.3.

Suppose $$(X,d)$$ is a compact metric space. Prove that $$\snorm{\cdot}_u$$ is a norm on the vector space of continuous complex-valued functions $$C(X,\C)\text{.}$$

#### Exercise 11.2.4.

1. Prove that $$f_n(x) := 2^{-n} \sin(2^n x)$$ converge uniformly to zero, but there exists a dense set $$D \subset \R$$ such that $$\lim_{n\to\infty} f_n'(x) = 1$$ for all $$x \in D\text{.}$$

2. Prove that $$\sum_{n=1}^\infty 2^{-n} \sin(2^n x)$$ converges uniformly to a continuous function, and there exists a dense set $$D \subset \R$$ where the derivatives of the partial sums do not converge.

#### Exercise 11.2.5.

Suppose $$(X,d)$$ is a compact metric space. Prove that $$\snorm{f}_{C^1} := \snorm{f}_u+\snorm{f'}_u$$ is a norm on the vector space of continuously differentiable complex-valued functions $$C^1(X,\C)\text{.}$$

#### Exercise 11.2.8.

Work through the following counterexample to the converse of the Weierstrass $$M$$-test (Theorem 11.2.4). Define $$f_n \colon [0,1] \to \R$$ by

\begin{equation*} f_n(x) := \begin{cases} \frac{1}{n} & \text{if } \frac{1}{n+1} < x < \frac{1}{n},\\ 0 & \text{else.} \end{cases} \end{equation*}

Prove that $$\sum f_n$$ converges uniformly, but $$\sum \snorm{f_n}_u$$ does not converge.

#### Exercise 11.2.9.

Suppose $$f_n \colon [0,1] \to \R$$ are monotone increasing functions and suppose that $$\sum f_n$$ converges pointwise. Prove that $$\sum f_n$$ converges uniformly.

#### Exercise 11.2.10.

Prove that

\begin{equation*} \sum_{n=1}^\infty e^{-nx} \end{equation*}

converges for all $$x > 0$$ to a differentiable function.

Teiji Takagi (1875–1960) was a Japanese mathematician: https://en.wikipedia.org/wiki/Teiji_Takagi