## Section 9.1 Differentiation under the integral

Note: less than 1 lecture

Let $$f(x,y)$$ be a function of two variables and define

\begin{equation*} g(y) := \int_a^b f(x,y) \,dx . \end{equation*}

If $$f$$ is continuous on the compact rectangle $$[a,b] \times [c,d]\text{,}$$ then Proposition 7.5.12 says that $$g$$ is continuous on $$[c,d]\text{.}$$

Suppose $$f$$ is differentiable in $$y\text{.}$$ The main question we want to ask is when can we “differentiate under the integral,” that is, when is it true that $$g$$ is differentiable and its derivative is

\begin{equation*} g'(y) \overset{?}{=} \int_a^b \frac{\partial f}{\partial y}(x,y) \,dx . \end{equation*}

Differentiation is a limit, so we are really asking when the two limiting operations of integration and differentiation commute. This is not always possible, and some extra hypothesis is necessary. The first issue we would face is the integrability of $$\frac{\partial f}{\partial y}\text{,}$$ but the formula above can fail even if $$\frac{\partial f}{\partial y}$$ is integrable as a function of $$x$$ for every fixed $$y\text{.}$$

We prove a simple, but perhaps the most useful, version of this kind of result.

### Theorem 9.1.1. Leibniz integral rule.

Suppose $$f \colon [a,b] \times [c,d] \to \R$$ is continuous, and suppose $$\frac{\partial f}{\partial y}$$ exists for all $$(x,y) \in [a,b] \times [c,d]$$ and is continuous. Then

\begin{equation*} g(y) := \int_a^b f(x,y) \,dx \end{equation*}

is continuously differentiable on $$[c,d]$$ and

\begin{equation*} g'(y) = \int_a^b \frac{\partial f}{\partial y}(x,y) \,dx . \end{equation*}

The hypotheses on $$f$$ and $$\frac{\partial f}{\partial y}$$ can be weakened, see e.g. Exercise 9.1.8, but not dropped outright. The main point in the proof requires that $$\frac{\partial f}{\partial y}$$ exists and is continuous for all $$x$$ up to the endpoints, but we only need a small interval in the $$y$$ direction. In applications, we often make $$[c,d]$$ a small interval around the point where we need to differentiate.

### Proof.

Fix $$y \in [c,d]$$ and let $$\epsilon > 0$$ be given. As $$\frac{\partial f}{\partial y}$$ is continuous on $$[a,b] \times [c,d]$$ it is uniformly continuous. In particular, there exists $$\delta > 0$$ such that whenever $$y_1 \in [c,d]$$ with $$\abs{y_1-y} < \delta$$ and all $$x \in [a,b]\text{,}$$ we have

\begin{equation*} \abs{\frac{\partial f}{\partial y}(x,y_1)-\frac{\partial f}{\partial y}(x,y)} < \epsilon . \end{equation*}

Suppose $$h$$ is such that $$y+h \in [c,d]$$ and $$\abs{h} < \delta\text{.}$$ Fix $$x$$ for a moment and apply the mean value theorem to find a $$y_1$$ between $$y$$ and $$y+h$$ such that

\begin{equation*} \frac{f(x,y+h)-f(x,y)}{h} = \frac{\partial f}{\partial y}(x,y_1) . \end{equation*}

As $$\abs{y_1-y} \leq \abs{h} < \delta\text{,}$$

\begin{equation*} \abs{ \frac{f(x,y+h)-f(x,y)}{h} - \frac{\partial f}{\partial y}(x,y) } = \abs{ \frac{\partial f}{\partial y}(x,y_1) - \frac{\partial f}{\partial y}(x,y) } < \epsilon . \end{equation*}

The argument worked for every $$x \in [a,b]$$ (a different $$y_1$$ may have been used for each $$x$$). Thus, as a function of $$x\text{,}$$

\begin{equation*} x \mapsto \frac{f(x,y+h)-f(x,y)}{h} \qquad \text{converges uniformly to} \qquad x \mapsto \frac{\partial f}{\partial y}(x,y) \qquad \text{as } h \to 0 . \end{equation*}

We defined uniform convergence only for sequences, but the idea here is the same: Replace $$h$$ with a sequence of nonzero numbers $$\{ h_n \}$$ converging to $$0$$ such that $$y+h_n \in [c,d]\text{,}$$ and let $$n \to \infty\text{.}$$

Consider the difference quotient of $$g\text{,}$$

\begin{equation*} \frac{g(y+h)-g(y)}{h} = \frac{\int_a^b f(x,y+h) \,dx - \int_a^b f(x,y) \,dx }{h} = \int_a^b \frac{f(x,y+h)-f(x,y)}{h} \,dx . \end{equation*}

Uniform convergence implies the limit can be taken underneath the integral. So

\begin{equation*} \lim_{h\to 0} \frac{g(y+h)-g(y)}{h} = \int_a^b \lim_{h\to 0} \frac{f(x,y+h)-f(x,y)}{h} \,dx = \int_a^b \frac{\partial f}{\partial y}(x,y) \,dx . \end{equation*}

As $$\frac{\partial f}{\partial y}$$ is continuous on the rectangle, $$g'$$ is continuous on $$[c,d]$$ by Proposition 7.5.12 mentioned above.

### Example 9.1.2.

Let

\begin{equation*} f(y) = \int_0^1 \sin(x^2-y^2) \,dx . \end{equation*}

Then

\begin{equation*} f'(y) = \int_0^1 -2y\cos(x^2-y^2) \,dx . \end{equation*}
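The conclusion is easy to sanity-check numerically. A minimal sketch (the midpoint quadrature and the step sizes below are ad hoc choices, not part of the example): compare a central difference quotient of $$f$$ against the integral of the $$y$$-derivative.

```python
import math

def midpoint(fn, a, b, n=2000):
    # composite midpoint rule -- accurate enough for a sanity check
    h = (b - a) / n
    return h * sum(fn(a + (i + 0.5) * h) for i in range(n))

def f(y):
    return midpoint(lambda x: math.sin(x**2 - y**2), 0.0, 1.0)

def f_prime(y):
    # differentiating under the integral sign, as in Example 9.1.2
    return midpoint(lambda x: -2 * y * math.cos(x**2 - y**2), 0.0, 1.0)

y, h = 0.7, 1e-5
central = (f(y + h) - f(y - h)) / (2 * h)
print(abs(central - f_prime(y)))  # small: the two computations agree
```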

### Example 9.1.3.

Consider

\begin{equation*} \int_0^{1} \frac{x-1}{\ln(x)} \,dx . \end{equation*}

The function under the integral extends to be continuous on $$[0,1]\text{,}$$ and hence the integral exists; see Exercise 9.1.1. The trouble is finding its value. We introduce a parameter $$y$$ and define a function:

\begin{equation*} g(y) := \int_0^{1} \frac{x^y-1}{\ln(x)} \,dx . \end{equation*}

The function $$\frac{x^y-1}{\ln(x)}$$ also extends to a continuous function of $$x$$ and $$y$$ for $$(x,y) \in [0,1] \times [0,1]$$ (also part of the exercise); see Figure 9.1.

Figure 9.1. The graph $$z= \frac{x^y-1}{\ln(x)}$$ on $$[0,1] \times [0,1]\text{.}$$

Hence, $$g$$ is a continuous function on $$[0,1]$$ and $$g(0) = 0\text{.}$$ For every $$\epsilon > 0\text{,}$$ the $$y$$ derivative of the integrand, $$x^y\text{,}$$ is continuous on $$[0,1] \times [\epsilon,1]\text{.}$$ Therefore, for $$y >0\text{,}$$ we may differentiate under the integral sign,

\begin{equation*} g'(y) = \int_0^{1} \frac{\ln(x) x^y}{\ln(x)} \,dx = \int_0^{1} x^y \,dx = \frac{1}{y+1} . \end{equation*}

We need to figure out $$g(1)$$ given that $$g'(y) = \frac{1}{y+1}$$ and $$g(0) = 0\text{.}$$ Elementary calculus says that $$g(1) = \int_0^1 g'(y)\,dy = \ln(2)\text{.}$$ Thus,

\begin{equation*} \int_0^{1} \frac{x-1}{\ln(x)} \,dx = \ln(2). \end{equation*}
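This value is easy to check numerically (an ad hoc sketch, not part of the text): the midpoint rule never evaluates the integrand at the endpoints $$x=0$$ and $$x=1\text{,}$$ where the formula needs its continuous extension.

```python
import math

def midpoint(fn, a, b, n=4000):
    # composite midpoint rule; avoids evaluating at a and b
    h = (b - a) / n
    return h * sum(fn(a + (i + 0.5) * h) for i in range(n))

value = midpoint(lambda x: (x - 1) / math.log(x), 0.0, 1.0)
print(value - math.log(2))  # close to zero
```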

### Subsection 9.1.1 Exercises

#### Exercise 9.1.1.

Prove the two statements that were asserted in Example 9.1.3:

1. Prove $$\frac{x-1}{\ln(x)}$$ extends to a continuous function of $$[0,1]\text{.}$$ That is, there exists a continuous function on $$[0,1]$$ that equals $$\frac{x-1}{\ln(x)}$$ on $$(0,1)\text{.}$$

2. Prove $$\frac{x^y-1}{\ln(x)}$$ extends to a continuous function on $$[0,1] \times [0,1]\text{.}$$

#### Exercise 9.1.2.

Suppose $$h \colon \R \to \R$$ is continuous and $$g \colon \R \to \R$$ is continuously differentiable and compactly supported. That is, there exists some $$M > 0\text{,}$$ such that $$g(x) = 0$$ whenever $$\abs{x} \geq M\text{.}$$ Define

\begin{equation*} f(x) := \int_{-\infty}^\infty h(y)g(x-y)\,dy . \end{equation*}

Show that $$f$$ is differentiable.
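One concrete instance of this exercise can be checked numerically. A sketch with ad hoc choices (not part of the exercise): take $$h(y) = \abs{y}\text{,}$$ which is continuous but not differentiable, and the continuously differentiable bump $$g(y) = {(1-y^2)}^2$$ for $$\abs{y} < 1$$ and $$g(y) = 0$$ otherwise, so $$M = 1\text{.}$$ The derivative lands on $$g\text{,}$$ giving $$f'(x) = \int h(y) g'(x-y)\,dy\text{.}$$

```python
def midpoint(fn, a, b, n=4000):
    # composite midpoint rule over a fixed window containing the support of g(x - .)
    h = (b - a) / n
    return h * sum(fn(a + (i + 0.5) * h) for i in range(n))

def h(y):
    return abs(y)  # continuous, but not differentiable at 0

def g(y):
    # continuously differentiable, supported in [-1, 1] (so M = 1)
    return (1 - y**2)**2 if abs(y) < 1 else 0.0

def g_prime(y):
    return -4 * y * (1 - y**2) if abs(y) < 1 else 0.0

def f(x):
    return midpoint(lambda y: h(y) * g(x - y), -2.0, 2.0)

def f_prime(x):
    # differentiation under the integral falls on g, not on the rough factor h
    return midpoint(lambda y: h(y) * g_prime(x - y), -2.0, 2.0)

x, d = 0.3, 1e-5
central = (f(x + d) - f(x - d)) / (2 * d)
print(abs(central - f_prime(x)))  # small
```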

#### Exercise 9.1.3.

Suppose $$f \colon \R \to \R$$ is infinitely differentiable (all derivatives exist) such that $$f(0) = 0\text{.}$$ Then show that there exists an infinitely differentiable function $$g \colon \R \to \R$$ such that $$f(x) = x\,g(x)\text{.}$$ Show also that if $$f'(0) \not= 0\text{,}$$ then $$g(0) \not= 0\text{.}$$
Hint: First write $$f(x) = \int_0^x f'(s) \,ds$$ and then rewrite the integral to go from $$0$$ to $$1\text{.}$$

#### Exercise 9.1.4.

Compute $$\int_0^1 e^{tx} \,dx\text{.}$$ Derive the formula for $$\int_0^1 x^n e^{x} \,dx$$ not using integration by parts, but by differentiation underneath the integral.

#### Exercise 9.1.5.

Let $$U \subset \R^n$$ be an open set and suppose $$f(x,y_1,y_2,\ldots,y_n)$$ is a continuous function defined on $$[0,1] \times U \subset \R^{n+1}\text{.}$$ Suppose $$\frac{\partial f}{\partial y_1}, \frac{\partial f}{\partial y_2},\ldots, \frac{\partial f}{\partial y_n}$$ exist and are continuous on $$[0,1] \times U\text{.}$$ Then prove that $$F \colon U \to \R$$ defined by

\begin{equation*} F(y_1,y_2,\ldots,y_n) := \int_0^1 f(x,y_1,y_2,\ldots,y_n) \, dx \end{equation*}

is continuously differentiable.

#### Exercise 9.1.6.

Work out the following counterexample: Let

\begin{equation*} f(x,y) := \begin{cases} \frac{xy^3}{{(x^2+y^2)}^2} & \text{if } x\not=0 \text{ or } y\not= 0, \\ 0 & \text{if } x=0 \text{ and } y=0. \end{cases} \end{equation*}
1. Prove that for every fixed $$y\text{,}$$ the function $$x \mapsto f(x,y)$$ is Riemann integrable on $$[0,1]\text{,}$$ and

\begin{equation*} g(y) := \int_0^1 f(x,y) \, dx = \frac{y}{2y^2+2} . \end{equation*}

Therefore, $$g'$$ exists and is the continuous function

\begin{equation*} g'(y) = \frac{d}{dy} \int_0^1 f(x,y) \, dx = \frac{1-y^2}{2{(y^2+1)}^2} . \end{equation*}
2. Prove $$\frac{\partial f}{\partial y}$$ exists at all $$x$$ and $$y$$ and compute it.

3. Show that for all $$y$$

\begin{equation*} \int_0^1 \frac{\partial f}{\partial y} (x,y) \, dx \end{equation*}

exists, but

\begin{equation*} g'(0) \not= \int_0^1 \frac{\partial f}{\partial y} (x,0) \, dx . \end{equation*}
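The failure in part 3 can be observed numerically. An ad hoc sketch, not part of the exercise (and note that part 1's formula gives $$g'(0) = \nicefrac{1}{2}$$): the difference quotient of $$g$$ at $$0$$ approaches $$\nicefrac{1}{2}\text{,}$$ while for each fixed $$x$$ the difference quotient of $$f$$ in $$y$$ at $$y=0$$ tends to $$0\text{,}$$ so the integral on the right-hand side is $$0\text{.}$$

```python
def f(x, y):
    # the counterexample of Exercise 9.1.6
    if x == 0 and y == 0:
        return 0.0
    return x * y**3 / (x**2 + y**2)**2

def midpoint(fn, a, b, n=20000):
    h = (b - a) / n
    return h * sum(fn(a + (i + 0.5) * h) for i in range(n))

def g(y):
    return midpoint(lambda x: f(x, y), 0.0, 1.0)

k = 1e-3
diff_quotient = (g(k) - g(0.0)) / k  # tends to g'(0) = 1/2 as k -> 0
# For each fixed x, f(x,y)/y = x y^2 / (x^2+y^2)^2 -> 0 as y -> 0, so the
# partial derivative in y at y = 0 vanishes, and its integral over [0,1] is 0.
print(diff_quotient)  # roughly 0.5, not 0
```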

#### Exercise 9.1.7.

Work out the following counterexample: Let

\begin{equation*} f(x,y) := \begin{cases} x \,\sin \left(\frac{y}{x^2+y^2}\right) & \text{if } (x,y) \not= (0,0),\\ 0 & \text{if } (x,y)=(0,0). \end{cases} \end{equation*}
1. Prove $$f$$ is continuous on all of $$\R^2\text{.}$$ Therefore the following function is well-defined for every $$y \in \R\text{:}$$

\begin{equation*} g(y) := \int_0^1 f(x,y) \, dx . \end{equation*}
2. Prove $$\frac{\partial f}{\partial y}$$ exists for all $$(x,y)\text{,}$$ but is not continuous at $$(0,0)\text{.}$$

3. Show that $$\int_0^1 \frac{\partial f}{\partial y}(x,0) \, dx$$ does not exist even if we take improper integrals, that is, that the limit $$\lim\limits_{h \to 0^+} \int_h^1 \frac{\partial f}{\partial y}(x,0) \, dx$$ does not exist.

Note: Feel free to use what you know about sine and cosine from calculus.
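The divergence in part 3 can also be seen numerically. An ad hoc sketch (the inner difference quotient approximates $$\frac{\partial f}{\partial y}(x,0)$$ without using the formula that part 2 asks for): the truncated integrals grow without bound as $$h \to 0^+\text{.}$$

```python
import math

def f(x, y):
    # the counterexample of Exercise 9.1.7
    if (x, y) == (0.0, 0.0):
        return 0.0
    return x * math.sin(y / (x**2 + y**2))

def partial_y_at_0(x, k=1e-8):
    # central difference quotient approximating df/dy (x, 0) for x away from 0
    return (f(x, k) - f(x, -k)) / (2 * k)

def midpoint(fn, a, b, n=10000):
    h = (b - a) / n
    return h * sum(fn(a + (i + 0.5) * h) for i in range(n))

values = {}
for h in (1e-1, 1e-2, 1e-3):
    values[h] = midpoint(partial_y_at_0, h, 1.0)
    print(h, values[h])  # grows without bound as h -> 0+
```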

#### Exercise 9.1.8.

Strengthen the Leibniz integral rule in the following way. Suppose $$f \colon (a,b) \times (c,d) \to \R$$ is a bounded continuous function, such that $$\frac{\partial f}{\partial y}$$ exists for all $$(x,y) \in (a,b) \times (c,d)$$ and is continuous and bounded. Define

\begin{equation*} g(y) := \int_a^b f(x,y) \,dx . \end{equation*}

Then $$g \colon (c,d) \to \R$$ is continuously differentiable and

\begin{equation*} g'(y) = \int_a^b \frac{\partial f}{\partial y}(x,y) \,dx . \end{equation*}

Hint: See also Exercise 7.5.18 and Theorem 6.2.10.
