# Basic Analysis I & II: Introduction to Real Analysis, Volumes I & II

## Section 11.3 Power series and analytic functions

Note: 2–3 lectures

### Subsection 11.3.1 Analytic functions

A (complex) power series is a series of the form
\begin{equation*} \sum_{n=0}^\infty c_n {(z-a)}^n \end{equation*}
for $$c_n, z, a \in \C\text{.}$$ We say the power series is convergent if it converges for some $$z \not= a\text{.}$$
Let $$U \subset \C$$ be an open set and $$f \colon U \to \C$$ a function. Suppose that for every $$a \in U$$ there exists a $$\rho > 0$$ and a power series convergent to the function
\begin{equation*} f(z) = \sum_{n=0}^\infty c_n {(z-a)}^n \end{equation*}
for all $$z \in B(a,\rho)\text{.}$$ Then we say $$f$$ is an analytic function. Similarly, given an interval $$(a,b) \subset \R\text{,}$$ we say that $$f \colon (a,b) \to \C$$ is analytic or perhaps real-analytic if for each point $$c \in (a,b)$$ there is a power series around $$c$$ that converges in some $$(c-\rho,c+\rho)$$ for some $$\rho > 0\text{.}$$ As we will sometimes talk about real and sometimes about complex power series, we will use $$z$$ to denote a complex number and $$x$$ a real number. We will always mention which case we are working with.
An analytic function has different expansions around different points. Moreover, convergence does not automatically happen on the entire domain of the function. For example, if $$\sabs{z} < 1\text{,}$$ then
\begin{equation*} \frac{1}{1-z} = \sum_{n=0}^\infty z^n . \end{equation*}
While the left-hand side is defined for all $$z \not= 1\text{,}$$ the right-hand side converges only if $$\sabs{z} < 1\text{.}$$ See a graph of a small piece of $$\frac{1}{1-z}$$ in Figure 11.5. We cannot graph the function itself; we can only graph its real or imaginary parts for lack of dimensions in our universe.
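As a quick numeric sanity check (a sketch of ours, not part of the text; the helper name is hypothetical), we can sum the geometric series at a point inside the unit disc and at a point outside it:

```python
def geometric_partial_sum(z, N):
    """Partial sum of the geometric series: sum_{n=0}^{N-1} z^n."""
    total, term = 0j, 1 + 0j
    for _ in range(N):
        total += term
        term *= z
    return total

z = 0.3 + 0.4j   # |z| = 0.5 < 1: the series converges
print(abs(geometric_partial_sum(z, 60) - 1 / (1 - z)))   # essentially zero

w = 1 + 1j       # |w| > 1: the terms do not go to zero
print(abs(geometric_partial_sum(w, 60)))                 # very large
```

Inside the disc the partial sums settle down to $$\frac{1}{1-z}$$ quickly; outside, they grow without bound even though $$\frac{1}{1-w}$$ itself is a perfectly good number.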

### Subsection 11.3.2 Convergence of power series

We proved several results for power series of a real variable in Section 2.6. For the most part the convergence properties of power series deal with the series $$\sum_{k=0}^\infty \sabs{c_k} \, \sabs{z-a}^k$$ and so we have already proved many results about complex power series. In particular, we computed the so-called radius of convergence of a power series.
The number $$\rho$$ is the radius of convergence. See Figure 11.6. The radius of convergence gives a disc around $$a$$ where the series converges. A power series is convergent if $$\rho > 0\text{.}$$

#### Proof.

We use the real version of this proposition, Proposition 2.6.10. Let
\begin{equation*} R \coloneqq \limsup_{n\to\infty} \sqrt[n]{\sabs{c_n}} . \end{equation*}
If $$R = 0\text{,}$$ then $$\sum_{n=0}^\infty \sabs{c_n} \, \sabs{z-a}^n$$ converges for all $$z\text{.}$$ If $$R = \infty\text{,}$$ then $$\sum_{n=0}^\infty \sabs{c_n} \, \sabs{z-a}^n$$ converges only at $$z=a\text{.}$$ Otherwise, let $$\rho \coloneqq \nicefrac{1}{R}\text{;}$$ then $$\sum_{n=0}^\infty \sabs{c_n} \, \sabs{z-a}^n$$ converges when $$\sabs{z-a} < \rho\text{,}$$ and diverges (in fact, the terms of the series do not go to zero) when $$\sabs{z-a} > \rho\text{.}$$
To prove the “Furthermore,” suppose $$0 < r < \rho$$ and $$z \in C(a,r)\text{.}$$ Then the partial sums satisfy
\begin{equation*} \abs{\sum_{n=0}^k c_n {(z-a)}^n} \leq \sum_{n=0}^k \sabs{c_n} \sabs{z-a}^n \leq \sum_{n=0}^k \sabs{c_n} r^n . \qedhere \end{equation*}
If $$\sum_{n=0}^\infty c_n {(z-a)}^n$$ converges for some $$z\text{,}$$ then
\begin{equation*} \sum_{n=0}^\infty c_n {(w-a)}^n \end{equation*}
converges absolutely whenever $$\sabs{w-a} < \sabs{z-a}\text{.}$$ Conversely, if the series diverges at $$z\text{,}$$ then it must diverge at $$w$$ whenever $$\sabs{w-a} > \sabs{z-a}\text{.}$$ Hence, to show that the radius of convergence is at least some number, we simply need to show convergence at some point by any method we know.

#### Example 11.3.2.

We list some series we already know:
\begin{equation*} \begin{aligned} & \sum_{n=0}^\infty z^n & & \text{has radius of convergence } 1, \\ & \sum_{n=0}^\infty \frac{1}{n!} z^n & & \text{has radius of convergence } \infty, \\ & \sum_{n=0}^\infty n^n z^n & & \text{has radius of convergence } 0. \end{aligned} \end{equation*}
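These radii can be estimated numerically from the formula $$\rho = 1/\limsup_{n\to\infty} \sabs{c_n}^{1/n}\text{.}$$ A crude sketch of ours (not from the text): we use a single large $$n$$ in place of the limit superior, and work with $$\log \sabs{c_n}$$ to avoid overflow.

```python
import math

def radius_estimate(log_abs_c, n):
    """Crude radius estimate exp(-(1/n) log|c_n|) at a single large n."""
    return math.exp(-log_abs_c(n) / n)

# c_n = 1: the estimate is exactly 1.
print(radius_estimate(lambda n: 0.0, 400))
# c_n = 1/n!: the estimates grow without bound as n grows (radius infinity).
print(radius_estimate(lambda n: -math.lgamma(n + 1), 400))
# c_n = n^n: the estimate equals 1/n, shrinking to zero (radius 0).
print(radius_estimate(lambda n: n * math.log(n), 400))
```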

#### Example 11.3.3.

Note the difference between $$\frac{1}{1-z}$$ and its power series. Let us expand $$\frac{1}{1-z}$$ as power series around a point $$a \not= 1\text{.}$$ Let $$c \coloneqq \frac{1}{1-a}\text{,}$$ then
\begin{equation*} \frac{1}{1-z} = \frac{c}{1-c(z-a)} = c \sum_{n=0}^\infty c^{n} {(z-a)}^n = \sum_{n=0}^\infty \left( \frac{1}{{(1-a)}^{n+1}} \right) {(z-a)}^n . \end{equation*}
The series $$\sum_{n=0}^\infty c^n {(z-a)}^n$$ converges if and only if the series on the right-hand side converges and
\begin{equation*} \limsup_{n\to\infty} \sqrt[n]{\sabs{c^n}} = \sabs{c} = \frac{1}{\sabs{1-a}} . \end{equation*}
The radius of convergence of the power series is $$\sabs{1-a}\text{,}$$ that is the distance from $$1$$ to $$a\text{.}$$ The function $$\frac{1}{1-z}$$ has a power series representation around every $$a\not= 1$$ and so is analytic in $$\C \setminus \{ 1 \}\text{.}$$ The domain of the function is bigger than the region of convergence of the power series representing the function at any point.
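A numeric sketch of ours (not from the text) of this expansion: summing the series $$\sum_{n=0}^\infty \frac{{(z-a)}^n}{{(1-a)}^{n+1}}$$ at a point within distance $$\sabs{1-a}$$ of $$a$$ reproduces $$\frac{1}{1-z}\text{.}$$

```python
def expansion_of_reciprocal(a, z, N=200):
    """Partial sum of sum_n (z-a)^n / (1-a)^{n+1}, the series for 1/(1-z) at a."""
    c = 1 / (1 - a)
    total, term = 0j, c
    for _ in range(N):
        total += term
        term *= c * (z - a)
    return total

a = 3 + 0j       # expansion point; radius of convergence is |1 - a| = 2
z = 3.5 + 1j     # |z - a| is about 1.118 < 2, inside the disc of convergence
print(abs(expansion_of_reciprocal(a, z) - 1 / (1 - z)))   # essentially zero
```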
It turns out that if a function has a power series representation converging to the function on some ball, then it has a power series representation at every point in the ball. We will prove this result later.

### Subsection 11.3.3 Properties of analytic functions

#### Proof.

For $$z_0 \in B(a,\rho)\text{,}$$ pick $$r < \rho$$ such that $$z_0 \in B(a,r)\text{.}$$ On $$B(a,r)$$ the partial sums (which are continuous) converge uniformly, and so the limit $$f|_{B(a,r)}$$ is continuous. Any sequence converging to $$z_0$$ has some tail that is completely in the open ball $$B(a,r)\text{,}$$ hence $$f$$ is continuous at $$z_0\text{.}$$
In Corollary 6.2.13, we proved that we can differentiate real power series term by term. That is, we proved that if
\begin{equation*} f(x) \coloneqq \sum_{n=0}^\infty c_n {(x-a)}^n \end{equation*}
converges for real $$x$$ in an interval around $$a \in \R\text{,}$$ then we can differentiate term by term and obtain a series
\begin{equation*} f'(x) = \sum_{n=1}^\infty n c_n {(x-a)}^{n-1} = \sum_{n=0}^\infty (n+1)c_{n+1} {(x-a)}^{n} \end{equation*}
with the same radius of convergence. We only proved this theorem when $$c_n$$ is real, however, for complex $$c_n\text{,}$$ we write $$c_n = s_n + i t_n\text{,}$$ and as $$x$$ and $$a$$ are real
\begin{equation*} \sum_{n=0}^\infty c_n {(x-a)}^n = \sum_{n=0}^\infty s_n {(x-a)}^n + i \sum_{n=0}^\infty t_n {(x-a)}^n . \end{equation*}
We apply the theorem to the real and imaginary part.
By iterating this theorem, we find that an analytic function is infinitely differentiable:
\begin{equation*} f^{(\ell)}(x) = \sum_{n=\ell}^\infty n(n-1)\cdots(n-\ell+1)c_n {(x-a)}^{n-\ell} = \sum_{n=0}^\infty (n+\ell)(n+\ell-1)\cdots (n+1) c_{n+\ell} {(x-a)}^{n} . \end{equation*}
In particular,
$$f^{(\ell)}(a) = \ell! \, c_\ell .\tag{11.1}$$
The coefficients are uniquely determined by the derivatives of the function, and vice versa.
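A numeric sketch of ours of term-by-term differentiation, using the geometric series $$f(x) = \sum_n x^n = \frac{1}{1-x}$$ with $$c_n = 1\text{:}$$ the differentiated series should sum to $$f'(x) = \frac{1}{{(1-x)}^2}\text{,}$$ and should also agree with a finite-difference derivative of the original sum.

```python
def eval_series(coeff, x, N=200):
    """Partial sum of sum_n coeff(n) * x^n."""
    return sum(coeff(n) * x**n for n in range(N))

c = lambda n: 1.0                    # f(x) = sum x^n = 1/(1-x) for |x| < 1
dc = lambda n: (n + 1) * c(n + 1)    # term-by-term derivative, shifted index

x = 0.3
termwise = eval_series(dc, x)        # should equal f'(x) = 1/(1-x)^2
print(abs(termwise - 1 / (1 - x)**2))        # essentially zero

h = 1e-6                             # compare with a central finite difference
central = (eval_series(c, x + h) - eval_series(c, x - h)) / (2 * h)
print(abs(termwise - central))       # small (finite-difference error only)
```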
On the other hand, just because we have an infinitely differentiable function doesn’t mean that the numbers $$c_n$$ obtained by $$c_n = \frac{f^{(n)}(0)}{n!}$$ give a convergent power series. There is a theorem, which we will not prove, that given an arbitrary sequence $$\{ c_n \}_{n=1}^\infty\text{,}$$ there exists an infinitely differentiable function $$f$$ such that $$c_n = \frac{f^{(n)}(0)}{n!}\text{.}$$ Moreover, even if the obtained series converges, it may not converge to the function we started with. For an example, see Exercise 5.4.11: The function
\begin{equation*} f(x) \coloneqq \begin{cases} e^{-1/x} & \text{if } x > 0,\\ 0 & \text{if } x \leq 0, \end{cases} \end{equation*}
is infinitely differentiable, and all derivatives at the origin are zero. So its series at the origin would be just the zero series, and while that series converges, it does not converge to $$f$$ for $$x > 0\text{.}$$
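A numeric look of ours at this flat function: it is strictly positive for every $$x > 0\text{,}$$ yet it vanishes at the origin faster than any power of $$x\text{,}$$ which is why the zero series cannot catch up with it.

```python
import math

def f(x):
    """The flat function: e^{-1/x} for x > 0, zero for x <= 0."""
    return math.exp(-1.0 / x) if x > 0 else 0.0

# f is positive at every x > 0, yet its Taylor series at 0 is the zero series:
print(f(0.5))                  # e^{-2}, clearly nonzero
# f goes to zero faster than any power of x as x -> 0+:
print(f(0.01) / 0.01**10)      # still essentially zero
```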
We can apply an affine transformation $$z \mapsto z+a$$ that converts a power series at $$a$$ to a series at the origin. That is, if
\begin{equation*} f(z) = \sum_{n=0}^\infty c_n {(z-a)}^n, \qquad \text{we consider} \qquad f(z+a) = \sum_{n=0}^\infty c_n {z}^n. \end{equation*}
Therefore, it is usually sufficient to prove results about power series at the origin. From now on, we often assume $$a=0$$ for simplicity.

### Subsection 11.3.4 Power series as analytic functions

We need a theorem on swapping limits of series, that is, Fubini’s theorem for sums. For real series this was Exercise 2.6.15, but we have a slicker argument now.

#### Proof.

Let $$E$$ be the set $$\{ \nicefrac{1}{n} : n \in \N \} \cup \{ 0 \}\text{,}$$ and treat it as a metric space with the metric inherited from $$\R\text{.}$$ Define the sequence of functions $$f_k \colon E \to \C$$ by
\begin{equation*} f_k(\nicefrac{1}{n}) \coloneqq \sum_{m=1}^n a_{k,m} \qquad \text{and} \qquad f_k(0) \coloneqq \sum_{m=1}^\infty a_{k,m} . \end{equation*}
As the series converges, each $$f_k$$ is continuous at $$0$$ (since 0 is the only cluster point, they are continuous at every point of $$E\text{,}$$ but we don’t need that). For all $$x \in E\text{,}$$ we have
\begin{equation*} \sabs{f_k(x)} \leq \sum_{m=1}^\infty \sabs{a_{k,m}} . \end{equation*}
As $$\sum_k \sum_m \sabs{a_{k,m}}$$ converges (and does not depend on $$x$$), we know that
\begin{equation*} \sum_{k=1}^n f_k(x) \end{equation*}
converges uniformly on $$E\text{.}$$ Define
\begin{equation*} g(x) \coloneqq \sum_{k=1}^\infty f_k(x) , \end{equation*}
which is, therefore, a continuous function at $$0\text{.}$$ So
\begin{equation*} \begin{split} \sum_{k=1}^\infty \left( \sum_{m=1}^\infty a_{k,m} \right) & = \sum_{k=1}^\infty f_k(0) = g(0) = \lim_{n\to\infty} g(\nicefrac{1}{n}) \\ &= \lim_{n\to\infty}\sum_{k=1}^\infty f_k(\nicefrac{1}{n}) = \lim_{n\to\infty}\sum_{k=1}^\infty \sum_{m=1}^n a_{k,m} \\ &= \lim_{n\to\infty}\sum_{m=1}^n \sum_{k=1}^\infty a_{k,m} = \sum_{m=1}^\infty \left( \sum_{k=1}^\infty a_{k,m} \right) . \qedhere \end{split} \end{equation*}
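A numeric sketch of ours of the theorem, on the absolutely convergent double series $$a_{k,m} = 2^{-k} 3^{-m}$$ for $$k, m \geq 1\text{:}$$ both iterated sums give the value $$\bigl(\sum_k 2^{-k}\bigr)\bigl(\sum_m 3^{-m}\bigr) = 1 \cdot \nicefrac{1}{2}\text{.}$$

```python
# Both iterated sums of the absolutely convergent double series
# a_{k,m} = 2^{-k} 3^{-m} (k, m >= 1) give the same value, 1 * (1/2).

N = 60  # truncation level; the neglected tails are geometrically small

row_first = sum(sum(1 / (2**k * 3**m) for m in range(1, N)) for k in range(1, N))
col_first = sum(sum(1 / (2**k * 3**m) for k in range(1, N)) for m in range(1, N))

print(row_first, col_first)   # both very close to 0.5
```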
Now we prove that once we have a series converging to a function in some interval, we can expand the function around every point.
The power series at $$a$$ could of course converge in a larger interval, but the one above is guaranteed. It is the largest symmetric interval about $$a$$ that fits in $$(-\rho,\rho)\text{.}$$

#### Proof.

Given $$a$$ and $$x$$ as in the theorem, write
\begin{equation*} \begin{split} f(x) &= \sum_{k=0}^\infty a_k {\bigl((x-a)+a\bigr)}^k \\ &= \sum_{k=0}^\infty a_k \sum_{m=0}^k \binom{k}{m} a^{k-m} {(x-a)}^m . \end{split} \end{equation*}
Define $$c_{k,m} \coloneqq a_k \binom{k}{m} a^{k-m}$$ if $$m \leq k$$ and $$0$$ if $$m > k\text{.}$$ Then
$$f(x) = \sum_{k=0}^\infty \, \sum_{m=0}^\infty c_{k,m} {(x-a)}^m .\tag{11.2}$$
Let us show that the double sum converges absolutely.
\begin{equation*} \begin{split} \sum_{k=0}^\infty \, \sum_{m=0}^\infty \abs{ c_{k,m} {(x-a)}^m} & = \sum_{k=0}^\infty \, \sum_{m=0}^k \abs{ a_k \binom{k}{m} a^{k-m} {(x-a)}^m } \\ & = \sum_{k=0}^\infty \sabs{a_k} \sum_{m=0}^k \binom{k}{m} \sabs{a}^{k-m} {\sabs{x-a}}^m \\ & = \sum_{k=0}^\infty \sabs{a_k} {\bigl(\sabs{x-a}+\sabs{a}\bigr)}^k , \end{split} \end{equation*}
and this series converges as long as $$(\sabs{x-a}+\sabs{a}) < \rho$$ or in other words if $$\sabs{x-a} < \rho-\sabs{a}\text{.}$$
Using Theorem 11.3.5, swap the order of summation in (11.2), and the following series converges when $$\sabs{x-a} < \rho-\sabs{a}\text{:}$$
\begin{equation*} f(x) = \sum_{k=0}^\infty \, \sum_{m=0}^\infty c_{k,m} {(x-a)}^m = \sum_{m=0}^\infty \left( \sum_{k=0}^\infty c_{k,m} \right) {(x-a)}^m . \end{equation*}
The formula in terms of derivatives at $$a$$ follows by differentiating the series to obtain (11.1).
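The coefficient swap in the proof can be checked numerically. Here is a sketch of ours using $$f(x) = \sum_k x^k\text{,}$$ that is, $$a_k = 1\text{:}$$ after swapping, the coefficient of $${(x-a)}^m$$ is $$\sum_{k \geq m} \binom{k}{m} a^{k-m}\text{,}$$ which should match the coefficient $$\frac{1}{{(1-a)}^{m+1}}$$ of the expansion of $$\frac{1}{1-x}$$ at $$a$$ from Example 11.3.3.

```python
from math import comb

a = 0.25    # re-expansion point; the theorem needs |x - a| < rho - |a|
K = 400     # truncation of the k-sum; the tail is geometrically small

for m in range(5):
    swapped = sum(comb(k, m) * a**(k - m) for k in range(m, K))
    direct = 1 / (1 - a)**(m + 1)
    print(m, swapped, direct)   # the two columns agree
```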
Note that if a series converges for real $$x \in (a-\rho,a+\rho)$$ it also converges for all complex numbers in $$B(a,\rho)\text{.}$$ We have the following corollary, which says that functions defined by power series are analytic.

#### Proof.

Without loss of generality assume that $$a=0\text{.}$$ We can rotate to assume that $$b$$ is real, but since that is harder to picture, let us do it explicitly. Let $$\alpha \coloneqq \frac{\bar{b}}{\sabs{b}}\text{.}$$ Notice that
\begin{equation*} \abs{\nicefrac{1}{\alpha}} = \sabs{\alpha} = 1 . \end{equation*}
Therefore the series $$\sum_{k=0}^\infty c_k {(\nicefrac{z}{\alpha})}^k = \sum_{k=0}^\infty c_k \alpha^{-k} {z}^k$$ converges to $$f(\nicefrac{z}{\alpha})$$ in $$B(0,\rho)\text{.}$$ When $$z=x$$ is real we apply Theorem 11.3.6 at $$\sabs{b}$$ and get a series that converges to $$f(\nicefrac{z}{\alpha})$$ on $$B(\sabs{b},\rho-\sabs{b})\text{.}$$ That is, there is a convergent series
\begin{equation*} f(\nicefrac{z}{\alpha}) = \sum_{k=0}^\infty a_k {\bigl(z - \sabs{b}\bigr)}^k . \end{equation*}
Using $$\alpha b = \sabs{b}\text{,}$$ we find
\begin{equation*} f(z) = f(\nicefrac{\alpha z}{\alpha}) = \sum_{k=0}^\infty a_k {(\alpha z - \sabs{b})}^k = \sum_{k=0}^\infty a_k\alpha^k {\bigl(z - \nicefrac{\sabs{b}}{\alpha}\bigr)}^k = \sum_{k=0}^\infty a_k\alpha^k {(z - b)}^k , \end{equation*}
and this series converges for all $$z$$ such that $$\bigl\lvert \alpha z-\sabs{b}\bigr\rvert < \rho-\sabs{b}$$ or $$\sabs{z - b} < \rho-\sabs{b}\text{.}$$
We proved above that a convergent power series is an analytic function where it converges. We have also shown before that $$\frac{1}{1-z}$$ is analytic outside of $$z=1\text{.}$$
Note that even when a function is real-analytic on the entire real line, it need not have a single power series representation that converges everywhere. For example, the function
\begin{equation*} f(x) = \frac{1}{1+x^2} \end{equation*}
happens to be a real-analytic function on $$\R$$ (exercise). A power series around the origin converging to $$f$$ has a radius of convergence of exactly $$1\text{.}$$ Can you see why? (exercise)
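A numeric sketch of ours of this phenomenon: the partial sums of the series at the origin, $$\sum_n {(-1)}^n x^{2n}\text{,}$$ converge for $$\sabs{x} < 1$$ but blow up for $$\sabs{x} > 1\text{,}$$ even though $$f$$ itself is perfectly well behaved there.

```python
def partial(x, N=400):
    """Partial sum of sum_n (-1)^n x^{2n}, the series for 1/(1+x^2) at 0."""
    return sum((-1)**n * x**(2 * n) for n in range(N))

x = 0.9    # inside the interval of convergence
print(abs(partial(x) - 1 / (1 + x**2)))   # essentially zero

x = 1.1    # outside: the terms grow, even though f(1.1) is a fine number
print(abs(partial(x) - 1 / (1 + x**2)))   # enormous
```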

### Subsection 11.3.5 Identity theorem for analytic functions

#### Proof.

By continuity, $$f(0) = 0\text{,}$$ so $$a_0 = 0\text{.}$$ Suppose some $$a_k$$ is nonzero, and let $$m$$ be the smallest index such that $$a_m \not= 0\text{.}$$ Then
\begin{equation*} f(z) = \sum_{k=m}^\infty a_k z^k = z^m \sum_{k=m}^\infty a_k z^{k-m} = z^m \sum_{k=0}^\infty a_{k+m} z^{k} . \end{equation*}
Write $$g(z) = \sum_{k=0}^\infty a_{k+m} z^{k}$$ (this series converges on the same set as the series for $$f$$). The function $$g$$ is continuous and $$g(0) = a_m \not= 0\text{.}$$ Thus there exists some $$\delta > 0$$ such that $$g(z) \not= 0$$ for all $$z \in B(0,\delta)\text{.}$$ As $$f(z) = z^m g(z)\text{,}$$ the only point in $$B(0,\delta)$$ where $$f(z) = 0$$ is $$z=0\text{,}$$ but this contradicts the assumption that $$f(z_n) = 0$$ for all $$n\text{.}$$
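A numeric sketch of ours of the factoring used in the proof, with $$f(z) = \sum_{k \geq 2} z^k$$ (so $$m = 2$$): the factored series $$g$$ satisfies $$f(z) = z^m g(z)$$ and is nonzero at the origin.

```python
def f_partial(z, N=100):
    """Partial sum of sum_{k>=2} z^k  (so a_0 = a_1 = 0 and m = 2)."""
    return sum(z**k for k in range(2, N))

def g_partial(z, N=100):
    """The factored series g(z) = sum_k a_{k+2} z^k = sum_k z^k."""
    return sum(z**k for k in range(N))

z = 0.4 + 0.3j
print(abs(f_partial(z) - z**2 * g_partial(z)))   # f(z) = z^m g(z), essentially zero
print(g_partial(0))                              # g(0) = a_m = 1, nonzero
```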
Recall that in a metric space $$X\text{,}$$ a cluster point (or sometimes limit point) of a set $$E$$ is a point $$p \in X$$ such that $$B(p,\epsilon) \setminus \{ p \}$$ contains points of $$E$$ for all $$\epsilon > 0\text{.}$$
In most common applications of this theorem $$E$$ is an open set or perhaps a curve.

#### Proof.

Without loss of generality suppose $$E$$ is the set of all points $$z \in U$$ such that $$g(z)=f(z)\text{.}$$ Note that $$E$$ must be closed as $$f$$ and $$g$$ are continuous.
Suppose $$E$$ has a cluster point. Without loss of generality assume that $$0$$ is this cluster point. Near $$0\text{,}$$ we have the expansions
\begin{equation*} f(z) = \sum_{k=0}^\infty a_k {z}^k \qquad \text{and} \qquad g(z) = \sum_{k=0}^\infty b_k {z}^k , \end{equation*}
which converge in some ball $$B(0,\rho)\text{.}$$ Therefore the series
\begin{equation*} 0 = f(z)-g(z) = \sum_{k=0}^\infty (a_k-b_k) z^k \end{equation*}
converges in $$B(0,\rho)\text{.}$$ As $$0$$ is a cluster point of $$E\text{,}$$ there is a sequence of nonzero points $$\{ z_n \}_{n=1}^\infty$$ converging to $$0$$ such that $$f(z_n) - g(z_n) = 0\text{.}$$ Hence, by the lemma above, $$a_k = b_k$$ for all $$k\text{.}$$ Therefore, $$B(0,\rho) \subset E\text{.}$$
Thus the set of cluster points of $$E$$ is open. The set of cluster points of $$E$$ is also closed: A limit of cluster points of $$E$$ is in $$E$$ as it is closed, and it is clearly a cluster point of $$E\text{.}$$ As $$U$$ is connected, the set of cluster points of $$E$$ is equal to $$U\text{,}$$ or in other words $$E = U\text{.}$$
By restricting our attention to real $$x\text{,}$$ we obtain the same theorem for connected open subsets of $$\R\text{,}$$ which are just open intervals.

### Subsection 11.3.6 Exercises

#### Exercise 11.3.1.

Let
\begin{equation*} a_{k,m} \coloneqq \begin{cases} 1 & \text{if } k=m,\\ -2^{k-m} & \text{if } k<m,\\ 0 & \text{if } k>m. \end{cases} \end{equation*}
Compute (or show the limit doesn’t exist):
a) $$\displaystyle \sum_{m=1}^\infty \sabs{a_{k,m}}$$ for all $$k\text{,}$$
b) $$\displaystyle \sum_{k=1}^\infty \sabs{a_{k,m}}$$ for all $$m\text{,}$$
c) $$\displaystyle \sum_{k=1}^\infty \sum_{m=1}^\infty \sabs{a_{k,m}}\text{,}$$
d) $$\displaystyle \sum_{k=1}^\infty \sum_{m=1}^\infty a_{k,m}\text{,}$$
e) $$\displaystyle \sum_{m=1}^\infty \sum_{k=1}^\infty a_{k,m}\text{.}$$
Hint: Fubini for sums does not apply, in fact, answers to d) and e) are different.

#### Exercise 11.3.2.

Let $$f(x) \coloneqq \frac{1}{1+x^2}\text{.}$$ Prove that
1. $$f$$ is an analytic function on all of $$\R$$ by finding a power series for $$f$$ at every $$a \in \R\text{,}$$
2. the radius of convergence of the power series for $$f$$ at the origin is 1.

#### Exercise 11.3.3.

Suppose $$f \colon \C \to \C$$ is analytic and not identically zero. Show that for each $$n\text{,}$$ there are at most finitely many zeros of $$f$$ in $$B(0,n)\text{,}$$ that is, $$f^{-1}(0) \cap B(0,n)$$ is finite for each $$n\text{.}$$

#### Exercise 11.3.4.

Suppose $$U \subset \C$$ is open and connected, $$0 \in U\text{,}$$ and $$f \colon U \to \C$$ is analytic. Treating $$f$$ as a function of a real $$x$$ at the origin, suppose $$f^{(n)}(0) = 0$$ for all $$n\text{.}$$ Show that $$f(z) = 0$$ for all $$z \in U\text{.}$$

#### Exercise 11.3.5.

Suppose $$U \subset \C$$ is open and connected, $$0 \in U\text{,}$$ and $$f \colon U \to \C$$ is analytic. For real $$x$$ and $$y\text{,}$$ let $$h(x) \coloneqq f(x)$$ and $$g(y) \coloneqq -i \, f(iy)\text{.}$$ Show that $$h$$ and $$g$$ are infinitely differentiable at the origin and $$h'(0) = g'(0)\text{.}$$

#### Exercise 11.3.6.

Suppose a function $$f$$ is analytic in some neighborhood of the origin, and that there exists an $$M$$ such that $$\sabs{f^{(n)}(0)} \leq M$$ for all $$n\text{.}$$ Prove that the series of $$f$$ at the origin converges for all $$z \in \C\text{.}$$

#### Exercise 11.3.7.

Suppose $$f(z) \coloneqq \sum_{n=0}^\infty c_n z^n$$ with a radius of convergence 1. Suppose $$f(0) = 0\text{,}$$ but $$f$$ is not the zero function. Show that there exists a $$k \in \N$$ and a convergent power series $$g(z) \coloneqq \sum_{n=0}^\infty d_n z^n$$ with radius of convergence 1 such that $$f(z) = z^k g(z)$$ for all $$z \in B(0,1)\text{,}$$ and $$g(0) \not= 0\text{.}$$

#### Exercise 11.3.8.

Suppose $$U \subset \C$$ is open and connected. Suppose that $$f \colon U \to \C$$ is analytic, $$U \cap \R \not= \emptyset$$ and $$f(x) = 0$$ for all $$x \in U \cap \R\text{.}$$ Show that $$f(z) = 0$$ for all $$z \in U\text{.}$$

#### Exercise 11.3.9.

For $$\alpha \in \C$$ and $$k=0,1,2,3,\ldots\text{,}$$ define
\begin{equation*} \binom{\alpha}{k} \coloneqq \frac{\alpha(\alpha-1)\cdots(\alpha-k+1)}{k!} . \end{equation*}
1. Show that the series
\begin{equation*} f(z) \coloneqq \sum_{k=0}^\infty \binom{\alpha}{k} z^k \end{equation*}
converges whenever $$\abs{z} < 1\text{.}$$ In fact, prove that for $$\alpha = 0,1,2,3,\ldots$$ the radius of convergence is $$\infty\text{,}$$ and for all other $$\alpha$$ the radius of convergence is 1.
2. Show that for $$x \in \R\text{,}$$ $$\abs{x} < 1\text{,}$$ we have
\begin{equation*} (1+x) f'(x) = \alpha f(x) , \end{equation*}
and conclude that $$f(x) = (1+x)^\alpha\text{.}$$

#### Exercise 11.3.10.

Suppose $$f \colon \C \to \C$$ is analytic and suppose that for some open interval $$(a,b) \subset \R\text{,}$$ $$f$$ is real valued on $$(a,b)\text{.}$$ Show that $$f$$ is real-valued on $$\R\text{.}$$

#### Exercise 11.3.11.

Let $$\D \coloneqq B(0,1)$$ be the unit disc. Suppose $$f \colon \D \to \C$$ is analytic with power series $$\sum_{n=0}^\infty c_n z^n\text{.}$$ Suppose $$\sabs{c_n} \leq 1$$ for all $$n\text{.}$$ Prove that for all $$z \in \D\text{,}$$ we have $$\sabs{f(z)} \leq \frac{1}{1-\sabs{z}}\text{.}$$
For a higher quality printout use the PDF versions: https://www.jirka.org/ra/realanal.pdf or https://www.jirka.org/ra/realanal2.pdf