Section 11.3 Power series and analytic functions
Note: 2–3 lectures
Subsection 11.3.1 Analytic functions
A (complex) power series is a series of the form
\begin{equation*} \sum_{n=0}^\infty c_n {(z-a)}^n \end{equation*}
for \(c_n, z, a \in \C\text{.}\) We say the power series is convergent if it converges for some \(z \not= a\text{.}\)
Let \(U \subset \C\) be an open set and \(f \colon U \to \C\) a function. Suppose that for every \(a \in U\) there exists a \(\rho > 0\) and a power series convergent to the function
for all \(z \in B(a,\rho)\text{.}\) Then we say \(f\) is an analytic function.
Similarly, if we have an interval \((a,b) \subset \R\text{,}\) we say that \(f \colon (a,b) \to \C\) is analytic, or perhaps real-analytic, if for each point \(c \in (a,b)\) there is a power series around \(c\) that converges in some \((c-\rho,c+\rho)\) for some \(\rho > 0\text{.}\)
As we will sometimes talk about real and sometimes about complex power series we will use \(z\) to denote a complex number and \(x\) a real number, but we will always mention which case we are working with.
An analytic function has different expansions around different points. Also, the convergence does not automatically happen on the entire domain of the function. For example, if \(\sabs{z} < 1\text{,}\) then
\begin{equation*} \frac{1}{1-z} = \sum_{n=0}^\infty z^n . \end{equation*}
While the left-hand side exists on all of \(z \not= 1\text{,}\) the right-hand side happens to converge only if \(\sabs{z} < 1\text{.}\) See a graph of a small piece of \(\frac{1}{1-z}\) in Figure 11.5. Notice that we cannot graph the function itself; we can only graph its real or imaginary parts, for lack of dimensions in our universe.
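As a quick numerical illustration (this sketch and its function name are my own, not from the text), the partial sums of the geometric series approach \(\frac{1}{1-z}\) at a point inside the unit disk, even though \(\frac{1}{1-z}\) itself is defined for every \(z \not= 1\text{:}\)

```python
# Sketch: partial sums of the geometric series sum z^n compared with
# 1/(1-z) at a point with |z| < 1, where the series converges.

def geometric_partial_sum(z, n_terms):
    """Compute sum_{n=0}^{n_terms-1} z^n directly."""
    total = 0j
    power = 1 + 0j
    for _ in range(n_terms):
        total += power
        power *= z
    return total

z = 0.3 + 0.4j                    # |z| = 0.5 < 1, inside the disk
approx = geometric_partial_sum(z, 60)
exact = 1 / (1 - z)
print(abs(approx - exact))        # very small
```

For \(\sabs{z} > 1\) the terms \(z^n\) grow without bound, so the same partial sums blow up instead.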
Subsection 11.3.2 Convergence of power series
We proved several results for power series of a real variable in Section 2.6. For the most part the convergence properties of power series deal with the series \(\sum \sabs{c_k} \, \sabs{z-a}^k\) and so we have already proved many results about complex power series. In particular, we computed the so-called radius of convergence of a power series.
Proposition 11.3.1.
Let \(\sum_{n=0}^\infty c_n {(z-a)}^n\) be a power series. There exists a \(\rho \in [0,\infty]\) such that
If \(\rho = 0\text{,}\) then the series diverges for every \(z \not= a\text{.}\)
If \(\rho = \infty\text{,}\) then the series converges for all \(z \in \C\text{.}\)
If \(0 < \rho < \infty\text{,}\) then the series converges absolutely on \(B(a,\rho)\text{,}\) and diverges when \(\sabs{z-a} > \rho\text{.}\)
Furthermore, if \(0 < r < \rho\text{,}\) then the series converges uniformly on the closed ball \(C(a,r)\text{.}\)
Proof.
We use the real version of this proposition, Proposition 2.6.10. Let
\begin{equation*} R := \limsup_{n\to\infty} \sabs{c_n}^{1/n} . \end{equation*}
If \(R = 0\text{,}\) then \(\sum_{n=0}^\infty \sabs{c_n} \, \sabs{z-a}^n\) converges for all \(z\text{.}\) If \(R = \infty\text{,}\) then \(\sum_{n=0}^\infty \sabs{c_n} \, \sabs{z-a}^n\) converges only at \(z=a\text{.}\) Otherwise, let \(\rho := \nicefrac{1}{R}\) and \(\sum_{n=0}^\infty \sabs{c_n} \, \sabs{z-a}^n\) converges when \(\sabs{z-a} < \rho\text{,}\) and diverges (in fact the terms of the series do not go to zero) when \(\sabs{z-a} > \rho\text{.}\)
To prove the furthermore suppose \(0 < r < \rho\) and \(z \in C(a,r)\text{.}\) Then consider the partial sums
The number \(\rho\) is called the radius of convergence. See Figure 11.6. The radius of convergence gives us a disk around \(a\) where the series converges. A power series is convergent if \(\rho > 0\text{.}\)
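The root-test computation behind the proposition can be sketched numerically. The following is my own heuristic illustration (not part of the text): for a single large index \(n\text{,}\) \(\sabs{c_n}^{1/n}\) is often already close to its limit superior, so \(1/\sabs{c_n}^{1/n}\) approximates the radius of convergence.

```python
# Heuristic sketch of the root test: estimate rho = 1 / limsup |c_n|^{1/n}
# by evaluating |c_n|^{1/n} at one large index.  This assumes |c_n|^{1/n}
# has settled near its limsup; it is an illustration, not a proof.

def estimate_radius(coeff, n=500):
    """Estimate the radius of convergence from the coefficient c_n."""
    return 1 / abs(coeff(n)) ** (1 / n)

print(estimate_radius(lambda n: 2.0 ** (-n)))  # c_n = 2^{-n}: true radius 2
print(estimate_radius(lambda n: 1.0))          # c_n = 1: true radius 1
```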
If \(\sum c_n {(z-a)}^n\) converges for some \(z\text{,}\) then
converges absolutely whenever \(\sabs{w-a} < \sabs{z-a}\text{.}\) Conversely, if the series diverges at \(z\text{,}\) then it must diverge at \(w\) whenever \(\sabs{w-a} > \sabs{z-a}\text{.}\) This means that to show that the radius of convergence is at least some number, we simply need to show convergence at some point by any method we know.
Example 11.3.2.
Let us list some series we already know:
Example 11.3.3.
Note the difference between \(\frac{1}{1-z}\) and its power series. Let us expand \(\frac{1}{1-z}\) as a power series around a point \(a \not= 1\text{.}\) Let \(c := \frac{1}{1-a}\text{;}\) then
The series \(\sum c^n {(z-a)}^n\) converges if and only if the series on the right-hand side converges and
The radius of convergence of the power series is \(\sabs{1-a}\text{,}\) that is, the distance from \(1\) to \(a\text{.}\) The function \(\frac{1}{1-z}\) has a power series representation around every \(a \not= 1\) and so is analytic in \(\C \setminus \{ 1 \}\text{.}\) The domain of the function is bigger than the region of convergence of the power series representing the function at any point.
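The expansion in this example can be checked numerically. Below is my own sketch (names and the point chosen are assumptions for illustration): with \(c := \frac{1}{1-a}\) the expansion around \(a\) can be written as \(\frac{1}{1-z} = \sum_n c^{n+1} {(z-a)}^n\text{,}\) valid for \(\sabs{z-a} < \sabs{1-a}\text{.}\)

```python
# Sketch: expand 1/(1-z) around a = -1, where c = 1/(1-a) = 1/2 and the
# radius of convergence is |1 - a| = 2.

def expand_around(a, z, n_terms=200):
    """Partial sum of sum_n c^{n+1} (z-a)^n with c = 1/(1-a)."""
    c = 1 / (1 - a)
    total = 0j
    term = c                     # the n = 0 term is c
    for _ in range(n_terms):
        total += term
        term *= c * (z - a)
    return total

a, z = -1.0, -0.5 + 0.2j         # |z - a| is about 0.54 < 2: inside the disk
print(abs(expand_around(a, z) - 1 / (1 - z)))  # very small
```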
It turns out that if a function has a power series representation converging to the function on some ball, then it has a power series representation at every point in the ball. We will prove this result later.
Subsection 11.3.3 Properties of analytic functions
Proposition 11.3.4.
If
\begin{equation*} f(z) = \sum_{n=0}^\infty c_n {(z-a)}^n \end{equation*}
is convergent in \(B(a,\rho)\) for some \(\rho > 0\text{,}\) then \(f \colon B(a,\rho) \to \C\) is continuous. In particular, analytic functions are continuous.
Proof.
For \(z_0 \in B(a,\rho)\text{,}\) pick \(r < \rho\) such that \(z_0 \in B(a,r)\text{.}\) On \(B(a,r)\text{,}\) the partial sums (which are continuous) converge uniformly, and so the restriction \(f|_{B(a,r)}\) is continuous. Any sequence converging to \(z_0\) has some tail that is completely in the open ball \(B(a,r)\text{,}\) hence \(f\) is continuous at \(z_0\text{.}\)
In Corollary 6.2.13 we proved that we can differentiate real power series term by term. That is, we proved that if
converges for real \(x\) in an interval around \(a \in \R\text{,}\) then we can differentiate term by term and obtain a series
with the same radius of convergence. We only proved this theorem when \(c_n\) is real; however, for complex \(c_n\text{,}\) we write \(c_n = s_n + i t_n\text{,}\) and as \(x\) and \(a\) are real
We apply the theorem to the real and imaginary part.
By iterating this theorem, we find that an analytic function is infinitely differentiable:
\begin{equation*} f^{(m)}(x) = \sum_{n=m}^\infty n (n-1) \cdots (n-m+1) \, c_n {(x-a)}^{n-m} . \end{equation*}
In particular,
\begin{equation*} c_n = \frac{f^{(n)}(a)}{n!} . \end{equation*}
So the coefficients are uniquely determined by the derivatives of the function, and vice versa.
On the other hand, just because we have an infinitely differentiable function doesn't mean that the numbers \(c_n\) obtained by \(c_n = \frac{f^{(n)}(0)}{n!}\) give a convergent power series. There is a theorem, which we will not prove, that given an arbitrary sequence \(\{ c_n \}\text{,}\) there exists an infinitely differentiable function \(f\) such that \(c_n = \frac{f^{(n)}(0)}{n!}\text{.}\) Moreover, even if the obtained series converges it may not converge to the function we started with. For an example, see Exercise 5.4.11: The function
is infinitely differentiable, and all derivatives at the origin are zero. So its series at the origin would be just the zero series, and while that series converges, it does not converge to \(f\) for \(x > 0\text{.}\)
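The flat-function example can be probed numerically; here is my own small sketch of it:

```python
# Sketch: f(x) = exp(-1/x) for x > 0 and 0 otherwise is infinitely
# differentiable with every derivative vanishing at 0, so its Taylor
# series at the origin is the zero series.  Yet f is not zero for x > 0.

import math

def f(x):
    return math.exp(-1 / x) if x > 0 else 0.0

print(f(1.0))    # about 0.3679, while the zero series predicts 0
print(f(-1.0))   # 0.0, which does agree with the series for x <= 0
```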
Note that we can always apply an affine transformation \(z \mapsto z+a\) that converts a power series to a series at the origin. That is, if \(f(z) = \sum c_n {(z-a)}^n\text{,}\) then \(g(z) := f(z+a) = \sum c_n z^n\) is a power series at the origin.
Therefore it is usually sufficient to prove results about power series at the origin. From now on, we often assume \(a=0\) for simplicity.
Subsection 11.3.4 Power series as analytic functions
We need a theorem on swapping limits of series, that is, Fubini's theorem for sums.
Theorem 11.3.5. Fubini for sums.
Let \(\{ a_{kj} \}_{k=1,j=1}^\infty\) be a double sequence of complex numbers and suppose that for every \(k\) the series
\begin{equation*} \sum_{j=1}^\infty \sabs{a_{kj}} \end{equation*}
converges, and furthermore that
\begin{equation*} \sum_{k=1}^\infty \sum_{j=1}^\infty \sabs{a_{kj}} \end{equation*}
converges. Then
\begin{equation*} \sum_{k=1}^\infty \sum_{j=1}^\infty a_{kj} = \sum_{j=1}^\infty \sum_{k=1}^\infty a_{kj} , \end{equation*}
where all the series involved converge.
Proof.
Let \(E\) be the set \(\{ \nicefrac{1}{n} : n \in \N \} \cup \{ 0 \}\text{,}\) and treat it as a metric space with the metric inherited from \(\R\text{.}\) Define the sequence of functions \(f_k \colon E \to \C\) by
As the series converges, each \(f_k\) is continuous at \(0\) (since 0 is the only cluster point, they are continuous at every point of \(E\text{,}\) but we don't need that). For all \(x \in E\text{,}\) we have
By knowing that \(\sum_k \sum_j \sabs{a_{kj}}\) converges (and does not depend on \(x\)), we know that
converges uniformly on \(E\text{.}\) Define
which is therefore a continuous function at \(0\text{.}\) So
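Fubini for sums can be illustrated numerically. The sketch below is my own (truncated sums, of course, stand in for the infinite series): for the absolutely convergent double series \(a_{kj} = 2^{-k} 3^{-j}\text{,}\) summing over \(j\) first or over \(k\) first gives the same value, \(1 \cdot \frac{1}{2} = \frac{1}{2}\text{.}\)

```python
# Sketch: both orders of summation for a_{kj} = 2^{-k} 3^{-j} agree,
# as Fubini for sums guarantees for absolutely convergent double series.

def sum_rows_first(N=60):
    # sum over k of (sum over j of a_{kj})
    return sum(
        sum(2.0 ** (-k) * 3.0 ** (-j) for j in range(1, N + 1))
        for k in range(1, N + 1)
    )

def sum_cols_first(N=60):
    # sum over j of (sum over k of a_{kj})
    return sum(
        sum(2.0 ** (-k) * 3.0 ** (-j) for k in range(1, N + 1))
        for j in range(1, N + 1)
    )

print(sum_rows_first(), sum_cols_first())  # both close to 0.5
```

Without absolute convergence the two orders can genuinely differ, which is the point of Exercise 11.3.1.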
Now we prove that once we have a series converging to a function in some interval, we can expand the function around every point.
Theorem 11.3.6. Taylor's theorem for real-analytic functions.
Let
\begin{equation*} f(x) := \sum_{k=0}^\infty a_k x^k \end{equation*}
be a power series converging in \((-\rho,\rho)\) for some \(\rho > 0\text{.}\) Given any \(a \in (-\rho,\rho)\text{,}\) and \(x\) such that \(\sabs{x-a} < \rho-\sabs{a}\text{,}\) we obtain
\begin{equation*} f(x) = \sum_{m=0}^\infty \frac{f^{(m)}(a)}{m!} \, {(x-a)}^m . \end{equation*}
The power series at \(a\) could of course converge in a larger interval, but the one above is guaranteed. It is the largest symmetric interval about \(a\) that fits in \((-\rho,\rho)\text{.}\)
Proof.
Given \(a\) and \(x\) as in the theorem, write
\begin{equation*} f(x) = \sum_{k=0}^\infty a_k {\bigl((x-a)+a\bigr)}^k = \sum_{k=0}^\infty a_k \left( \sum_{m=0}^{k} \binom{k}{m} a^{k-m} {(x-a)}^m \right) . \end{equation*}
Define \(c_{k,m} := a_k \binom{k}{m} a^{k-m}\) if \(m \leq k\) and \(c_{k,m} := 0\) if \(m > k\text{.}\) Then
\begin{equation} f(x) = \sum_{k=0}^\infty \sum_{m=0}^\infty c_{k,m} {(x-a)}^m . \tag{11.2} \end{equation}
Let us show that the double sum converges absolutely:
\begin{equation*} \sum_{k=0}^\infty \sum_{m=0}^{k} \sabs{a_k} \binom{k}{m} \sabs{a}^{k-m} \, \sabs{x-a}^m = \sum_{k=0}^\infty \sabs{a_k} \, {\bigl( \sabs{x-a} + \sabs{a} \bigr)}^k , \end{equation*}
and this series converges as long as \((\sabs{x-a}+\sabs{a}) < \rho\text{,}\) or in other words, if \(\sabs{x-a} < \rho-\sabs{a}\text{.}\)
Using Theorem 11.3.5, swap the order of summation in (11.2), and the following series converges when \(\sabs{x-a} < \rho-\sabs{a}\text{:}\)
\begin{equation*} f(x) = \sum_{m=0}^\infty \left( \sum_{k=m}^\infty a_k \binom{k}{m} a^{k-m} \right) {(x-a)}^m . \end{equation*}
The formula in terms of derivatives at \(a\) follows by differentiating the series to obtain (11.1).
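The re-expansion in the proof can be tried out concretely. The sketch below is my own: it takes the geometric series \(f(x) = \sum_k x^k\) (so \(a_k = 1\) and \(\rho = 1\)), re-expands it around \(a = \nicefrac{1}{2}\) using the coefficients \(\sum_{k \geq m} a_k \binom{k}{m} a^{k-m}\text{,}\) and compares with \(\frac{1}{1-x}\) at a point with \(\sabs{x-a} < \rho - \sabs{a}\text{.}\)

```python
# Sketch: re-expand the geometric series around a = 0.5 using the
# coefficient formula from the proof (with a_k = 1 and the k-sum truncated).

from math import comb

def reexpanded_coeff(m, a=0.5, K=400):
    """Truncated sum_{k=m}^{K} C(k, m) * a^{k-m}, the m-th coefficient
    of the series around a."""
    return sum(comb(k, m) * a ** (k - m) for k in range(m, K + 1))

a, x = 0.5, 0.6                    # |x - a| = 0.1 < rho - |a| = 0.5
approx = sum(reexpanded_coeff(m) * (x - a) ** m for m in range(40))
print(approx, 1 / (1 - x))         # both close to 2.5
```

For this example the re-expanded coefficients come out to \(2^{m+1}\text{,}\) matching the expansion of \(\frac{1}{1-x}\) around \(\nicefrac{1}{2}\) from Example 11.3.3.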
Note that if a series converges for real \(x \in (a-\rho,a+\rho)\) it also converges for all complex numbers in \(B(a,\rho)\text{.}\) We have the following corollary, which says that functions defined by power series are analytic.
Corollary 11.3.7.
For every \(a \in \C\text{,}\) if \(\sum c_k {(z-a)}^k\) converges to \(f(z)\) in \(B(a,\rho)\) and \(b \in B(a,\rho)\text{,}\) then there exists a power series \(\sum d_k {(z-b)}^k\) that converges to \(f(z)\) in \(B(b,\rho-\sabs{b-a})\text{.}\)
Proof.
Without loss of generality assume that \(a=0\text{.}\) We could rotate to assume that \(b\) is real, but since that is harder to picture, let us do it explicitly. Let \(\alpha := \frac{\bar{b}}{\sabs{b}}\text{.}\) Notice that \(\sabs{\alpha} = 1\) and \(\alpha b = \sabs{b}\text{.}\)
Therefore the series \(\sum c_k {(\nicefrac{z}{\alpha})}^k = \sum c_k \alpha^{-k} {z}^k\) converges to \(f(\nicefrac{z}{\alpha})\) in \(B(0,\rho)\text{.}\) When \(z=x\) is real we apply Theorem 11.3.6 at \(\sabs{b}\) and get a series that converges to \(f(\nicefrac{z}{\alpha})\) on \(B(\sabs{b},\rho-\sabs{b})\text{.}\) That is, there is a convergent series
Using \(\alpha b = \sabs{b}\text{,}\) we find
and this series converges for all \(z\) such that \(\bigl\lvert \alpha z - \sabs{b} \bigr\rvert < \rho-\sabs{b}\) or \(\sabs{z - b} < \rho-\sabs{b}\text{.}\)
We proved above that a convergent power series is an analytic function where it converges. We have also shown before that \(\frac{1}{1z}\) is analytic outside of \(z=1\text{.}\)
Note that just because a function is real-analytic on the entire real line, it does not necessarily have a single power series representation that converges everywhere. For example, the function
happens to be a real-analytic function on \(\R\) (exercise). A power series around the origin converging to \(f\) has a radius of convergence of exactly \(1\text{.}\) Can you see why? (exercise)
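Numerically, the limited radius shows up directly in the partial sums. The sketch below is my own, using the series \(\sum_n {(-1)}^n x^{2n}\) of \(\frac{1}{1+x^2}\) at the origin:

```python
# Sketch: partial sums of sum_n (-1)^n x^{2n}, the series of 1/(1+x^2)
# at the origin.  They converge for |x| < 1 but blow up for |x| > 1,
# even though 1/(1+x^2) is perfectly well defined on all of R.

def partial_sum(x, N):
    return sum((-1) ** n * x ** (2 * n) for n in range(N))

inside = partial_sum(0.5, 60)      # close to 1/(1 + 0.25) = 0.8
outside = partial_sum(1.5, 60)     # terms grow without bound
print(inside, abs(outside))
```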
Subsection 11.3.5 Identity theorem for analytic functions
Lemma 11.3.8.
Suppose \(f(z) = \sum a_k z^k\) is a convergent power series and \(\{ z_n \}\) is a sequence of nonzero complex numbers converging to 0, such that \(f(z_n) = 0\) for all \(n\text{.}\) Then \(a_k = 0\) for every \(k\text{.}\)
Proof.
By continuity we know \(f(0) = 0\text{,}\) so \(a_0 = 0\text{.}\) Suppose some coefficient is nonzero, and let \(m\) be the smallest index such that \(a_m \not= 0\text{.}\) Then
\begin{equation*} f(z) = \sum_{k=m}^\infty a_k z^k = z^m \sum_{k=0}^\infty a_{k+m} z^{k} . \end{equation*}
Write \(g(z) := \sum_{k=0}^\infty a_{k+m} z^{k}\) (this series converges on the same set as \(f\)). The function \(g\) is continuous and \(g(0) = a_m \not= 0\text{.}\) Thus there exists some \(\delta > 0\) such that \(g(z) \not= 0\) for all \(z \in B(0,\delta)\text{.}\) As \(f(z) = z^m g(z)\text{,}\) the only point in \(B(0,\delta)\) where \(f(z) = 0\) is \(z=0\text{,}\) but this contradicts the assumption that \(f(z_n) = 0\) for all \(n\text{.}\)
Recall that in a metric space \(X\text{,}\) a cluster point (or sometimes limit point) of a set \(E\) is a point \(p \in X\) such that \(B(p,\epsilon) \setminus \{ p \}\) contains points of \(E\) for all \(\epsilon > 0\text{.}\)
Theorem 11.3.9. Identity theorem.
Let \(U \subset \C\) be open and connected. If \(f \colon U \to \C\) and \(g \colon U \to \C\) are analytic functions that are equal on a set \(E \subset U\text{,}\) and \(E\) has a cluster point in \(U\text{,}\) then \(f(z) = g(z)\) for all \(z \in U\text{.}\)
In most common applications of this theorem \(E\) is an open set or perhaps a curve.
Proof.
Without loss of generality suppose \(E\) is the set of all points \(z \in U\) such that \(g(z)=f(z)\text{.}\) Note that \(E\) must be closed as \(f\) and \(g\) are continuous.
Suppose \(E\) has a cluster point. Without loss of generality assume that \(0\) is this cluster point. Near \(0\text{,}\) we have the expansions
\begin{equation*} f(z) = \sum_{k=0}^\infty a_k z^k \qquad \text{and} \qquad g(z) = \sum_{k=0}^\infty b_k z^k , \end{equation*}
which converge in some ball \(B(0,\rho)\text{.}\) Therefore the series
\begin{equation*} \sum_{k=0}^\infty (a_k - b_k) z^k \end{equation*}
converges in \(B(0,\rho)\text{.}\) As \(0\) is a cluster point of \(E\text{,}\) there is a sequence of nonzero points \(\{ z_n \}\) in \(E\) converging to \(0\text{,}\) and so \(f(z_n) - g(z_n) = 0\) for all \(n\text{.}\) Hence, by the lemma above, \(a_k = b_k\) for all \(k\text{.}\) Therefore, \(B(0,\rho) \subset E\text{.}\)
Thus the set of cluster points of \(E\) is open. The set of cluster points of \(E\) is also closed: A limit of cluster points of \(E\) is in \(E\) as it is closed, and it is clearly a cluster point of \(E\text{.}\) As \(U\) is connected, the set of cluster points of \(E\) is equal to \(U\text{,}\) or in other words \(E = U\text{.}\)
By restricting our attention to real \(x\text{,}\) we obtain the same theorem for connected open subsets of \(\R\text{,}\) which are just open intervals.
Subsection 11.3.6 Exercises
Exercise 11.3.1.
Let
Compute (or show the limit doesn't exist):
a) \(\displaystyle \sum_{j=1}^\infty \sabs{a_{kj}}~\) for every \(k\text{,}\) b) \(\displaystyle \sum_{k=1}^\infty \sabs{a_{kj}}~\) for every \(j\text{,}\) c) \(\displaystyle \sum_{k=1}^\infty \sum_{j=1}^\infty \sabs{a_{kj}}\text{,}\) d) \(\displaystyle \sum_{k=1}^\infty \sum_{j=1}^\infty a_{kj}\text{,}\) e) \(\displaystyle \sum_{j=1}^\infty \sum_{k=1}^\infty a_{kj}\text{.}\)
Hint: Fubini for sums does not apply, in fact, answers to d) and e) are different.
Exercise 11.3.2.
Let \(f(x) := \frac{1}{1+x^2}\text{.}\) Prove that
\(f\) is an analytic function on all of \(\R\) by finding a power series for \(f\) at every \(a \in \R\text{,}\)
the radius of convergence of the power series for \(f\) at the origin is 1.
Exercise 11.3.3.
Suppose \(f \colon \C \to \C\) is analytic. Show that for each \(n\text{,}\) there are at most finitely many zeros of \(f\) in \(B(0,n)\text{,}\) that is, \(f^{-1}(0) \cap B(0,n)\) is finite for each \(n\text{.}\)
Exercise 11.3.4.
Suppose \(U \subset \C\) is open and connected, \(0 \in U\text{,}\) and \(f \colon U \to \C\) is analytic. Treating \(f\) as a function of a real \(x\) at the origin, suppose \(f^{(n)}(0) = 0\) for all \(n\text{.}\) Show that \(f(z) = 0\) for all \(z \in U\text{.}\)
Exercise 11.3.5.
Suppose \(U \subset \C\) is open and connected, \(0 \in U\text{,}\) and \(f \colon U \to \C\) is analytic. For real \(x\) and \(y\text{,}\) let \(h(x) := f(x)\) and \(g(y) := i \, f(iy)\text{.}\) Show that \(h\) and \(g\) are infinitely differentiable at the origin and \(h'(0) = g'(0)\text{.}\)
Exercise 11.3.6.
Suppose a function \(f\) is analytic in some neighborhood of the origin, and that there exists an \(M\) such that \(\sabs{f^{(n)}(0)} \leq M\) for all \(n\text{.}\) Prove that the series of \(f\) at the origin converges for all \(z \in \C\text{.}\)
Exercise 11.3.7.
Suppose \(f(z) := \sum c_n z^n\) with a radius of convergence 1. Suppose \(f(0) = 0\text{,}\) but \(f\) is not the zero function. Show that there exists a \(k \in \N\) and a convergent power series \(g(z) := \sum d_n z^n\) with radius of convergence 1 such that \(f(z) = z^k g(z)\) for all \(z \in B(0,1)\text{,}\) and \(g(0) \not= 0\text{.}\)
Exercise 11.3.8.
Suppose \(U \subset \C\) is open and connected. Suppose that \(f \colon U \to \C\) is analytic, \(U \cap \R \not= \emptyset\) and \(f(x) = 0\) for all \(x \in U \cap \R\text{.}\) Show that \(f(z) = 0\) for all \(z \in U\text{.}\)
Exercise 11.3.9.
For \(\alpha \in \C\) and \(k=0,1,2,3,\ldots\text{,}\) define
\begin{equation*} \binom{\alpha}{k} := \frac{\alpha (\alpha-1) (\alpha - 2) \cdots (\alpha-k+1)}{k!} , \end{equation*}
where the empty product for \(k=0\) gives \(\binom{\alpha}{0} = 1\text{.}\)
Show that the series
\begin{equation*} f(z) := \sum_{k=0}^\infty \binom{\alpha}{k} z^k \end{equation*}converges whenever \(\abs{z} < 1\text{.}\) In fact, prove that for \(\alpha = 0,1,2,3,\ldots\) the radius of convergence is \(\infty\text{,}\) and for all other \(\alpha\) the radius of convergence is 1.

Show that for \(x \in \R\text{,}\) \(\abs{x} < 1\text{,}\) we have
\begin{equation*} (1+x) f'(x) = \alpha f(x) , \end{equation*}meaning that \(f(x) = (1+x)^\alpha\text{.}\)
Exercise 11.3.10.
Suppose \(f \colon \C \to \C\) is analytic and suppose that for some open interval \((a,b) \subset \R\text{,}\) \(f\) is real-valued on \((a,b)\text{.}\) Show that \(f\) is real-valued on \(\R\text{.}\)
Exercise 11.3.11.
Let \(\D := B(0,1)\) be the unit disc. Suppose \(f \colon \D \to \C\) is analytic with power series \(\sum c_n z^n\text{.}\) Suppose \(\sabs{c_n} \leq 1\) for all \(n\text{.}\) Prove that for all \(z \in \D\text{,}\) we have \(\sabs{f(z)} \leq \frac{1}{1-\sabs{z}}\text{.}\)