## Section 8.6 Higher order derivatives

Note: less than 1 lecture, partly depends on the optional Section 4.3

Let $$U \subset \R^n$$ be an open set and $$f \colon U \to \R$$ a function. Denote by $$x = (x_1,x_2,\ldots,x_n) \in \R^n$$ our coordinates. If $$\frac{\partial f}{\partial x_j}$$ exists everywhere in $$U\text{,}$$ then it is also a function $$\frac{\partial f}{\partial x_j} \colon U \to \R\text{.}$$ Therefore, it makes sense to talk about its partial derivatives. We denote the partial derivative of $$\frac{\partial f}{\partial x_j}$$ with respect to $$x_k$$ by

\begin{equation*} \frac{\partial^2 f}{\partial x_k \partial x_j} := \frac{\partial \bigl( \frac{\partial f}{\partial x_j} \bigr)}{\partial x_k} . \end{equation*}

If $$k=j\text{,}$$ then we write $$\frac{\partial^2 f}{\partial x_j^2}$$ for simplicity.
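Numerically, a second order partial derivative can be approximated by iterating first order difference quotients, mirroring the definition above. A minimal Python sketch; the function, point, and step sizes are illustrative choices, not part of the text:

```python
def partial(f, p, j, h=1e-5):
    """Central difference quotient approximating df/dx_j at the point p."""
    pp, pm = list(p), list(p)
    pp[j] += h
    pm[j] -= h
    return (f(pp) - f(pm)) / (2 * h)

def second_partial(f, p, k, j, h=1e-4):
    """Approximate the partial derivative of df/dx_j with respect to x_k."""
    return partial(lambda q: partial(f, q, j), p, k, h)

# Illustrative example: f(x, y) = x^2 * y.  Differentiating first in x and
# then in y gives 2x, so the value at (1, 2) should be close to 2.
f = lambda p: p[0] ** 2 * p[1]
approx = second_partial(f, [1.0, 2.0], k=1, j=0)
```

Note that the step sizes cannot be taken arbitrarily small in floating point; the limit in the definition is a statement about exact arithmetic.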

We define higher order derivatives inductively. Suppose $$j_1,j_2,\ldots,j_\ell$$ are integers between $$1$$ and $$n\text{,}$$ and suppose

\begin{equation*} \frac{\partial^{\ell-1} f}{\partial x_{j_{\ell-1}} \partial x_{j_{\ell-2}} \cdots \partial x_{j_1}} \end{equation*}

exists and is differentiable in the variable $$x_{j_{\ell}}\text{,}$$ then the partial derivative with respect to that variable is denoted by

\begin{equation*} \frac{\partial^{\ell} f}{\partial x_{j_{\ell}} \partial x_{j_{\ell-1}} \cdots \partial x_{j_1}} := \frac{\partial \bigl( \frac{\partial^{\ell-1} f}{\partial x_{j_{\ell-1}} \partial x_{j_{\ell-2}} \cdots \partial x_{j_1}} \bigr)}{\partial x_{j_{\ell}}} . \end{equation*}

Such a derivative is called a partial derivative of order $$\ell$$.

Sometimes the notation $$f_{x_j x_k}$$ is used for $$\frac{\partial^2 f}{\partial x_k \partial x_j}\text{.}$$ This notation swaps the order in which we write the derivatives, which may be important.

### Definition 8.6.1.

Suppose $$U \subset \R^n$$ is an open set and $$f \colon U \to \R$$ is a function. We say $$f$$ is a $$k$$-times continuously differentiable function, or a $$C^k$$ function, if all partial derivatives of all orders up to and including order $$k$$ exist and are continuous.

So a continuously differentiable, or $$C^1\text{,}$$ function is one where all partial derivatives exist and are continuous, which agrees with our previous definition due to Proposition 8.4.6. We could have required only that the $$k$$th order partial derivatives exist and are continuous, as the existence of lower order derivatives is clearly necessary to even define $$k$$th order partial derivatives, and these lower order derivatives are continuous as they are differentiable functions.

When the partial derivatives are continuous, we can swap their order.

### Proposition 8.6.2.

Suppose $$U \subset \R^n$$ is an open set, $$f \colon U \to \R$$ is a $$C^2$$ function, and $$j$$ and $$k$$ are integers between $$1$$ and $$n\text{.}$$ Then

\begin{equation*} \frac{\partial^2 f}{\partial x_k \partial x_j} = \frac{\partial^2 f}{\partial x_j \partial x_k} . \end{equation*}

### Proof.

Fix a $$p \in U\text{,}$$ and let $$e_j$$ and $$e_k$$ be the standard basis vectors. Pick two positive numbers $$s$$ and $$t$$ small enough so that $$p+s_0e_j +t_0e_k \in U$$ whenever $$0 < s_0 \leq s$$ and $$0 < t_0 \leq t\text{.}$$ This can be done as $$U$$ is open and so contains a small open ball (or a box if you wish) around $$p\text{.}$$

Use the mean value theorem on the function

\begin{equation*} \tau \mapsto f(p+se_j + \tau e_k)-f(p + \tau e_k) , \end{equation*}

on the interval $$[0,t]$$ to find a $$t_0 \in (0,t)$$ such that

\begin{equation*} \frac{f(p+se_j + te_k)- f(p+t e_k) - f(p+s e_j)+f(p)}{t} = \frac{\partial f}{\partial x_k}(p + s e_j + t_0 e_k) - \frac{\partial f}{\partial x_k}(p + t_0 e_k) . \end{equation*}

Next, apply the mean value theorem to the function $$\sigma \mapsto \frac{\partial f}{\partial x_k}(p + \sigma e_j + t_0 e_k)$$ on the interval $$[0,s]$$ to find a number $$s_0 \in (0,s)$$ such that

\begin{equation*} \frac{\frac{\partial f}{\partial x_k}(p + s e_j + t_0 e_k) - \frac{\partial f}{\partial x_k}(p + t_0 e_k)}{s} = \frac{\partial^2 f}{\partial x_j \partial x_k}(p + s_0 e_j + t_0 e_k) . \end{equation*}

In other words,

\begin{equation*} g(s,t) := \frac{f(p+se_j + te_k)- f(p+t e_k) - f(p+s e_j)+f(p)}{st} = \frac{\partial^2 f}{\partial x_j \partial x_k}(p + s_0 e_j + t_0 e_k) . \end{equation*}

Figure 8.13. Using the mean value theorem to estimate a second order partial derivative by a certain difference quotient.

See Figure 8.13. The $$s_0$$ and $$t_0$$ depend on $$s$$ and $$t\text{,}$$ but $$0 < s_0 < s$$ and $$0 < t_0 < t\text{.}$$ Denote by $$\R_+^2$$ the set of $$(s,t)$$ where $$s > 0$$ and $$t > 0\text{.}$$ The set $$\R_+^2$$ is the domain of $$g\text{,}$$ and $$(0,0)$$ is a cluster point of $$\R_+^2\text{.}$$ As $$(s,t) \in \R_+^2$$ goes to $$(0,0)\text{,}$$ $$(s_0,t_0) \in \R_+^2$$ also goes to $$(0,0)\text{.}$$ By continuity of the second partial derivatives,

\begin{equation*} \lim_{(s,t) \to (0,0)} g(s,t) = \frac{\partial^2 f}{\partial x_j \partial x_k}(p) . \end{equation*}

Now reverse the ordering. Start with the function $$\sigma \mapsto f(p+\sigma e_j + te_k)-f(p + \sigma e_j)\text{,}$$ and use the mean value theorem on the interval $$[0,s]$$ to find an $$s_1 \in (0,s)$$ such that

\begin{equation*} \frac{f(p+te_k + se_j)- f(p+s e_j) - f(p+t e_k)+f(p)}{s} = \frac{\partial f}{\partial x_j}(p + t e_k + s_1 e_j) - \frac{\partial f}{\partial x_j}(p + s_1 e_j) . \end{equation*}

Find a $$t_1 \in (0,t)$$ such that

\begin{equation*} \frac{\frac{\partial f}{\partial x_j}(p + t e_k + s_1 e_j) - \frac{\partial f}{\partial x_j}(p + s_1 e_j)}{t} = \frac{\partial^2 f}{\partial x_k \partial x_j}(p + t_1 e_k + s_1 e_j) . \end{equation*}

So $$g(s,t) = \frac{\partial^2 f}{\partial x_k \partial x_j}(p + t_1 e_k + s_1 e_j)$$ for the same $$g$$ as above. And as before

\begin{equation*} \lim_{(s,t) \to (0,0)} g(s,t) = \frac{\partial^2 f}{\partial x_k \partial x_j}(p) . \end{equation*}

Therefore the two partial derivatives are equal.
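The convergence of the difference quotient $$g(s,t)$$ from the proof can be observed numerically. A small Python experiment; the smooth function and the point below are illustrative choices, not from the text:

```python
import math

def f(x, y):
    return math.sin(x) * math.cos(y)

px, py = 0.5, 0.3  # the point p, an arbitrary choice

def g(s, t):
    """The second order difference quotient g(s, t) used in the proof."""
    return (f(px + s, py + t) - f(px, py + t)
            - f(px + s, py) + f(px, py)) / (s * t)

# The exact mixed partial of sin(x)cos(y) is -cos(x)sin(y).
exact = -math.cos(px) * math.sin(py)
approx = g(1e-4, 1e-4)
```

Shrinking $$s$$ and $$t$$ much further eventually makes rounding error dominate; the limit in the proof concerns exact arithmetic.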

The proposition does not hold if the derivatives are not continuous; see Exercise 8.6.2. Notice also that we did not really need a $$C^2$$ function; we only needed the two second order partial derivatives involved to be continuous functions.

### Subsection 8.6.1 Exercises

#### Exercise 8.6.1.

Suppose $$f \colon U \to \R$$ is a $$C^2$$ function for some open $$U \subset \R^n$$ and $$p \in U\text{.}$$ Use the proof of Proposition 8.6.2 to find an expression in terms of just the values of $$f$$ (analogue of the difference quotient for the first derivative), whose limit is $$\frac{\partial^2 f}{ \partial x_j \partial x_k}(p)\text{.}$$

#### Exercise 8.6.2.

Define

\begin{equation*} f(x,y) := \begin{cases} \frac{xy(x^2-y^2)}{x^2+y^2} & \text{if } (x,y) \not= (0,0),\\ 0 & \text{if } (x,y) = (0,0). \end{cases} \end{equation*}

Show that

1. The first order partial derivatives exist and are continuous.

2. The partial derivatives $$\frac{\partial^2 f}{\partial x \partial y}$$ and $$\frac{\partial^2 f}{\partial y \partial x}$$ exist, but are not continuous at the origin, and $$\frac{\partial^2 f}{\partial x \partial y}(0,0) \not= \frac{\partial^2 f}{\partial y \partial x}(0,0)\text{.}$$
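A numerical experiment consistent with part 2 (an illustration only; it does not replace the proofs the exercise asks for). With the convention $$\frac{\partial^2 f}{\partial x \partial y} = \frac{\partial}{\partial x}\bigl(\frac{\partial f}{\partial y}\bigr)\text{,}$$ nested difference quotients suggest the two mixed partials at the origin differ:

```python
def f(x, y):
    if x == 0.0 and y == 0.0:
        return 0.0
    return x * y * (x * x - y * y) / (x * x + y * y)

h = 1e-6  # step for the inner (first order) quotients
s = 1e-3  # step for the outer (second order) quotients

def fy(x):
    """df/dy at (x, 0), by a central difference quotient."""
    return (f(x, h) - f(x, -h)) / (2 * h)

def fx(y):
    """df/dx at (0, y), by a central difference quotient."""
    return (f(h, y) - f(-h, y)) / (2 * h)

mixed_xy = (fy(s) - fy(-s)) / (2 * s)  # d/dx of df/dy at the origin
mixed_yx = (fx(s) - fx(-s)) / (2 * s)  # d/dy of df/dx at the origin
```

The two computed values differ, in line with part 2.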

#### Exercise 8.6.3.

Suppose $$f \colon U \to \R$$ is a $$C^k$$ function for some open $$U \subset \R^n$$ and $$p \in U\text{.}$$ Suppose $$j_1,j_2,\ldots,j_k$$ are integers between $$1$$ and $$n\text{,}$$ and suppose $$\sigma=(\sigma_1,\sigma_2,\ldots,\sigma_k)$$ is a permutation of $$(1,2,\ldots,k)\text{.}$$ Prove

\begin{equation*} \frac{\partial^{k} f}{\partial x_{j_{k}} \partial x_{j_{k-1}} \cdots \partial x_{j_1}} (p) = \frac{\partial^{k} f}{\partial x_{j_{\sigma_k}} \partial x_{j_{\sigma_{k-1}}} \cdots \partial x_{j_{\sigma_1}}} (p) . \end{equation*}

#### Exercise 8.6.4.

Suppose $$\varphi \colon \R^2 \to \R$$ is a $$C^k$$ function such that $$\varphi(0,\theta) = \varphi(0,\psi)$$ for all $$\theta,\psi \in \R$$ and $$\varphi(r,\theta) = \varphi(r,\theta+2\pi)$$ for all $$r,\theta \in \R\text{.}$$ Let $$F(r,\theta) := \bigl(r \cos(\theta), r \sin(\theta) \bigr)$$ be the mapping from Exercise 8.5.8. Show that the function $$g \colon \R^2 \to \R$$ given by $$g(x,y) := \varphi \bigl(F^{-1}(x,y)\bigr)$$ is well-defined (notice that $$F^{-1}(x,y)$$ can only be defined locally), and that when restricted to $$\R^2 \setminus \{ 0 \}$$ it is a $$C^k$$ function.
Note: Feel free to use what you know about sine and cosine from calculus.

#### Exercise 8.6.5.

Suppose $$f \colon \R^2 \to \R$$ is a $$C^2$$ function. For all $$(x,y) \in \R^2\text{,}$$ compute

\begin{equation*} \lim_{t \to 0} \frac{f(x+t,y)+f(x-t,y)+f(x,y+t)+f(x,y-t) - 4f(x,y)}{t^2} \end{equation*}

in terms of the partial derivatives of $$f\text{.}$$
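To sanity-check a candidate answer numerically, one can evaluate the quotient at small $$t$$ for functions whose second order partials are known. The test functions and points below are illustrative choices, not part of the exercise:

```python
def quotient(f, x, y, t):
    """The difference quotient from the exercise, at (x, y) with step t."""
    return (f(x + t, y) + f(x - t, y) + f(x, y + t) + f(x, y - t)
            - 4 * f(x, y)) / (t * t)

# For f(x, y) = x^2 + y^2 the quotient equals 4 at every point and every t.
q1 = quotient(lambda x, y: x * x + y * y, 1.0, 2.0, 1e-4)

# For f(x, y) = x^2 - y^2 it equals 0 at every point and every t.
q2 = quotient(lambda x, y: x * x - y * y, 1.0, 2.0, 1e-4)
```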

#### Exercise 8.6.6.

Suppose $$f \colon \R^2 \to \R$$ is a function such that all first and second order partial derivatives exist. Furthermore, suppose that all second order partial derivatives are bounded functions. Prove that $$f$$ is continuously differentiable.

#### Exercise 8.6.7.

Follow the strategy below to prove the following simple version of the second derivative test for functions defined on $$\R^2$$ (using $$(x,y)$$ as coordinates): Suppose $$f \colon \R^2 \to \R$$ is a twice continuously differentiable function with a critical point at the origin, $$f'(0,0) = 0\text{.}$$ If

\begin{equation*} \frac{\partial^2 f}{\partial x^2} (0,0) > 0 \qquad \text{and} \qquad \frac{\partial^2 f}{\partial x^2} (0,0) \frac{\partial^2 f}{\partial y^2} (0,0) - {\left(\frac{\partial^2 f}{\partial x \partial y} (0,0) \right)}^2 > 0 , \end{equation*}
then $$f$$ has a (strict) local minimum at $$(0,0)\text{.}$$ Use the following technique: First suppose without loss of generality that $$f(0,0) = 0\text{.}$$ Then prove:

1. There exists an $$A \in L(\R^2)$$ such that $$g = f \circ A$$ satisfies $$\frac{\partial^2 g}{\partial x \partial y} (0,0) = 0$$ and $$\frac{\partial^2 g}{\partial x^2} (0,0) = \frac{\partial^2 g}{\partial y^2} (0,0) = 1\text{.}$$

2. For every $$\epsilon > 0\text{,}$$ there exists a $$\delta > 0$$ such that $$\abs{g(x,y) - x^2 - y^2} < \epsilon (x^2+y^2)$$ for all $$(x,y) \in B\bigl((0,0),\delta\bigr)\text{.}$$
Hint: You can use Taylor's theorem in one variable.

3. This means that $$g\text{,}$$ and therefore $$f\text{,}$$ has a strict local minimum at $$(0,0)\text{.}$$

Note: You must avoid the temptation to just apply the one variable second derivative test along lines through the origin; see Exercise 8.3.11.
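As a numerical sanity check of the statement (not a substitute for the requested proof), take the illustrative function $$f(x,y) = x^2 + xy + y^2\text{,}$$ which satisfies the hypotheses with $$\frac{\partial^2 f}{\partial x^2}(0,0) = 2 > 0$$ and $$2 \cdot 2 - 1^2 = 3 > 0\text{:}$$

```python
import math

def f(x, y):
    return x * x + x * y + y * y  # fxx = 2, fyy = 2, fxy = 1

# The conclusion to check: f > 0 = f(0, 0) on a small circle around the
# origin (sampled at 100 angles; a proof needs the exercise's steps).
r = 1e-3
angles = [2 * math.pi * k / 100 for k in range(100)]
strict_min = all(f(r * math.cos(a), r * math.sin(a)) > 0 for a in angles)
```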
