
Section 5.5 Improper integrals

Note: 2–3 lectures (optional section, can safely be skipped, requires the optional Section 3.5)

Often it is necessary to integrate over the entire real line, or an unbounded interval of the form \([a,\infty)\) or \((-\infty,b]\text{.}\) We may also wish to integrate unbounded functions defined on an open bounded interval \((a,b)\text{.}\) For such intervals or functions, the Riemann integral is not defined, but we will write down the integral anyway in the spirit of Lemma 5.2.8. These integrals are called improper integrals and are limits of integrals rather than integrals themselves.

Definition 5.5.1.

Suppose \(f \colon [a,b) \to \R\) is a function (not necessarily bounded) that is Riemann integrable on \([a,c]\) for all \(c < b\text{.}\) We define

\begin{equation*} \int_a^b f := \lim_{c \to b^-} \int_a^{c} f \end{equation*}

if the limit exists.

Suppose \(f \colon [a,\infty) \to \R\) is a function that is Riemann integrable on \([a,c]\) for all \(c \geq a\text{.}\) We define

\begin{equation*} \int_a^\infty f := \lim_{c \to \infty} \int_a^c f \end{equation*}

if the limit exists.

If the limit exists, we say the improper integral converges. If the limit does not exist, we say the improper integral diverges.

We similarly define improper integrals for the left-hand endpoint; we leave the details to the reader.
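For a concrete illustration (a standard calculus example, not from the text): \(f(x) = \frac{1}{\sqrt{1-x}}\) is unbounded on \([0,1)\) but Riemann integrable on \([0,c]\) for every \(c < 1\text{,}\) and the fundamental theorem gives \(\int_0^c f = 2 - 2\sqrt{1-c}\text{.}\) A minimal numerical sketch of the limit in the definition:

```python
import math

# f(x) = 1/sqrt(1 - x) is unbounded on [0, 1) but Riemann integrable
# on [0, c] for every c < 1; by the fundamental theorem,
#   int_0^c f = 2 - 2*sqrt(1 - c).
def integral_up_to(c):
    return 2.0 - 2.0 * math.sqrt(1.0 - c)

# Taking c -> 1- as in the definition, the values approach 2:
for c in (0.9, 0.999, 0.999999999):
    print(c, integral_up_to(c))
```

The values approach 2, so \(\int_0^1 f = 2\) even though \(f\) is unbounded.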

For a finite endpoint \(b\text{,}\) if \(f\) is bounded, then Lemma 5.2.8 says that we defined nothing new. What is new is that we can apply this definition to unbounded functions. The following set of examples is so useful that we state it as a proposition.


Proposition 5.5.2.

The improper integral

\begin{equation*} \int_1^\infty \frac{1}{x^p} \,dx \end{equation*}

converges to \(\frac{1}{p-1}\) if \(p > 1\) and diverges if \(0 < p \leq 1\text{.}\) The improper integral

\begin{equation*} \int_0^1 \frac{1}{x^p} \,dx \end{equation*}

converges to \(\frac{1}{1-p}\) if \(0 < p < 1\) and diverges if \(p \geq 1\text{.}\)

The proof follows by application of the fundamental theorem of calculus. Let us do the proof for \(p > 1\) for the infinite right endpoint and leave the rest to the reader. Hint: You should handle \(p=1\) separately.

Suppose \(p > 1\text{.}\) Then using the fundamental theorem,

\begin{equation*} \int_1^b \frac{1}{x^p} \,dx = \int_1^b x^{-p} \,dx = \frac{b^{-p+1}}{-p+1} - \frac{1^{-p+1}}{-p+1} = \frac{-1}{(p-1)b^{p-1}} + \frac{1}{p-1} . \end{equation*}

As \(p > 1\text{,}\) then \(p-1 > 0\text{.}\) Take the limit as \(b \to \infty\) to obtain that \(\frac{1}{b^{p-1}}\) goes to 0. The result follows.
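The computation above can be checked numerically; here is a small sketch assuming only the closed form derived in the proof:

```python
import math

# Closed form from the fundamental theorem, for p != 1:
#   int_1^b x^(-p) dx = 1/(p-1) - 1/((p-1) * b^(p-1))
def integral_1_to_b(p, b):
    if p == 1:
        return math.log(b)          # the p = 1 case, handled separately
    return 1.0 / (p - 1) - 1.0 / ((p - 1) * b ** (p - 1))

# For p > 1 the values approach 1/(p-1) as b grows:
for p in (2, 3):
    print(p, integral_1_to_b(p, 1e12))

# For p = 1 the integral grows without bound, like log b:
print(integral_1_to_b(1, 1e12))
```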

We state the following proposition on “tails” for just one type of improper integral; the proof is straightforward and essentially the same for the other types of improper integrals.


Proposition 5.5.3.

Suppose \(f \colon [a,\infty) \to \R\) is Riemann integrable on \([a,c]\) for all \(c \geq a\text{,}\) and let \(b \geq a\text{.}\) Then \(\int_a^\infty f\) converges if and only if \(\int_b^\infty f\) converges, in which case

\begin{equation*} \int_a^\infty f = \int_a^b f + \int_b^\infty f . \end{equation*}

Let \(c > b\text{.}\) Then

\begin{equation*} \int_a^c f = \int_a^b f + \int_b^c f . \end{equation*}

Taking the limit \(c \to \infty\) finishes the proof.

Nonnegative functions are easier to work with as the following proposition demonstrates. The exercises will show that this proposition holds only for nonnegative functions. Analogues of this proposition exist for all the other types of improper limits and are left to the student.

In the first item we allow for the value \(\infty\) in the supremum, indicating that the integral diverges to infinity.


Proposition 5.5.4.

Suppose \(f \colon [a,\infty) \to \R\) is nonnegative and Riemann integrable on \([a,c]\) for every \(c \geq a\text{.}\)

  1. We have

    \begin{equation*} \int_a^\infty f = \sup \left\{ \int_a^x f : x \geq a \right\} . \end{equation*}

  2. Suppose \(\{ x_n \}\) is a sequence with \(\lim\limits_{n\to\infty} x_n = \infty\text{.}\) Then \(\int_a^\infty f\) converges if and only if \(\lim\limits_{n\to\infty} \int_a^{x_n} f\) exists, in which case the two are equal.

We start with the first item. As \(f\) is nonnegative, \(\int_a^x f\) is increasing as a function of \(x\text{.}\) If the supremum is infinite, then for every \(M \in \R\) we find \(N\) such that \(\int_a^N f \geq M\text{.}\) As \(\int_a^x f\) is increasing, \(\int_a^x f \geq M\) for all \(x \geq N\text{.}\) So \(\int_a^\infty f\) diverges to infinity.

Next suppose the supremum is finite, say \(A := \sup \left\{ \int_a^x f : x \geq a \right\}\text{.}\) For every \(\epsilon > 0\text{,}\) we find an \(N\) such that \(A - \int_a^N f < \epsilon\text{.}\) As \(\int_a^x f\) is increasing, then \(A - \int_a^x f < \epsilon\) for all \(x \geq N\) and hence \(\int_a^\infty f\) converges to \(A\text{.}\)

Let us look at the second item. If \(\int_a^\infty f\) converges, then \(\lim_{n\to\infty} \int_a^{x_n} f = \int_a^\infty f\) for every sequence \(\{ x_n \}\) going to infinity, by the sequential characterization of the limit. The trick is proving the other direction. Suppose \(\{ x_n \}\) is such that \(\lim\, x_n = \infty\) and

\begin{equation*} \lim_{n\to\infty} \int_a^{x_n} f = A \end{equation*}

converges. Given \(\epsilon > 0\text{,}\) pick \(N\) such that for all \(n \geq N\text{,}\) we have \(A - \epsilon < \int_a^{x_n} f < A + \epsilon\text{.}\) Because \(\int_a^x f\) is increasing as a function of \(x\text{,}\) we have that for all \(x \geq x_N\)

\begin{equation*} A - \epsilon < \int_a^{x_N} f \leq \int_a^x f . \end{equation*}

As \(\{ x_n \}\) goes to \(\infty\text{,}\) then for any given \(x\text{,}\) there is an \(x_m\) such that \(m \geq N\) and \(x \leq x_m\text{.}\) Then

\begin{equation*} \int_a^{x} f \leq \int_a^{x_m} f < A + \epsilon . \end{equation*}

In particular, for all \(x \geq x_N\text{,}\) we have \(\abs{\int_a^{x} f - A} < \epsilon\text{.}\) That is, \(\int_a^\infty f\) converges to \(A\text{.}\)


Proposition 5.5.5.

Let \(f \colon [a,\infty) \to \R\) and \(g \colon [a,\infty) \to \R\) be functions that are Riemann integrable on \([a,c]\) for every \(c \geq a\text{,}\) and suppose \(\abs{f(x)} \leq g(x)\) for all \(x \geq a\text{.}\)

  1. If \(\int_a^\infty g\) converges, then \(\int_a^\infty f\) converges, and \(\abs{\int_a^\infty f} \leq \int_a^\infty g\text{.}\)

  2. If \(\int_a^\infty f\) diverges, then \(\int_a^\infty g\) diverges.

We start with the first item. For every \(b\) and \(c\) such that \(a \leq b \leq c\text{,}\) we have \(-g(x) \leq f(x) \leq g(x)\text{,}\) and so

\begin{equation*} \int_b^c -g \leq \int_b^c f \leq \int_b^c g . \end{equation*}

In other words, \(\abs{\int_b^c f} \leq \int_b^c g\text{.}\)

Let \(\epsilon > 0\) be given. Because of Proposition 5.5.3,

\begin{equation*} \int_a^\infty g = \int_a^b g + \int_b^\infty g . \end{equation*}

As \(\int_a^b g\) goes to \(\int_a^\infty g\) as \(b\) goes to infinity, \(\int_b^\infty g\) goes to 0 as \(b\) goes to infinity. Choose \(B\) such that

\begin{equation*} \int_B^\infty g < \epsilon . \end{equation*}

As \(g\) is nonnegative, if \(B \leq b < c\text{,}\) then \(\int_b^c g < \epsilon\) as well. Let \(\{ x_n \}\) be a sequence going to infinity. Let \(M\) be such that \(x_n \geq B\) for all \(n \geq M\text{.}\) Take \(n, m \geq M\text{,}\) with \(x_n \leq x_m\text{,}\)

\begin{equation*} \abs{\int_a^{x_m} f - \int_a^{x_n} f} = \abs{\int_{x_n}^{x_m} f} \leq \int_{x_n}^{x_m} g < \epsilon . \end{equation*}

Therefore, the sequence \(\{ \int_a^{x_n} f \}_{n=1}^\infty\) is Cauchy and hence converges.

We need to show that the limit is unique. Suppose \(\{ x_n \}\) is a sequence converging to infinity such that \(\{ \int_a^{x_n} f \}\) converges to \(L_1\text{,}\) and \(\{ y_n \}\) is a sequence converging to infinity is such that \(\{ \int_a^{y_n} f \}\) converges to \(L_2\text{.}\) Then there must be some \(n\) such that \(\abs{\int_a^{x_n} f - L_1} < \epsilon\) and \(\abs{\int_a^{y_n} f - L_2} < \epsilon\text{.}\) We can also suppose \(x_n \geq B\) and \(y_n \geq B\text{.}\) Then

\begin{equation*} \abs{L_1 - L_2} \leq \abs{L_1 - \int_a^{x_n} f} + \abs{\int_a^{x_n} f- \int_a^{y_n} f} + \abs{\int_a^{y_n} f - L_2} < \epsilon + \abs{\int_{x_n}^{y_n} f} + \epsilon < 3 \epsilon. \end{equation*}

As \(\epsilon > 0\) was arbitrary, \(L_1 = L_2\text{,}\) and hence \(\int_a^\infty f\) converges. Above we have shown that \(\abs{\int_a^c f} \leq \int_a^c g\) for all \(c > a\text{.}\) By taking the limit \(c \to \infty\text{,}\) the first item is proved.

The second item is simply the contrapositive of the first item.

Example 5.5.6.

The improper integral

\begin{equation*} \int_0^\infty \frac{\sin(x^2)(x+2)}{x^3+1} \,dx \end{equation*}

converges.

Proof: The integrand is continuous on \([0,1]\text{,}\) so we simply need to show that the integral converges when going from 1 to infinity. For \(x \geq 1\) we obtain

\begin{equation*} \abs{\frac{\sin(x^2)(x+2)}{x^3+1}} \leq \frac{x+2}{x^3+1} \leq \frac{x+2}{x^3} \leq \frac{x+2x}{x^3} \leq \frac{3}{x^2} . \end{equation*}


Hence, by Proposition 5.5.2,

\begin{equation*} \int_1^\infty \frac{3}{x^2}\,dx = 3 \int_1^\infty \frac{1}{x^2}\,dx = 3 . \end{equation*}

So using the comparison test and the tail test, the original integral converges.
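As a numerical sanity check (a sketch, not part of the text; the Simpson rule and the cutoffs 30 and 60 are arbitrary choices), the tail estimate \(\abs{\int_b^c f} \leq \int_b^c \frac{3}{x^2}\,dx\) predicts that the partial integrals change by less than \(\nicefrac{3}{30} = 0.1\) between \(b = 30\) and \(b = 60\text{:}\)

```python
import math

def f(x):
    return math.sin(x * x) * (x + 2) / (x ** 3 + 1)

# Composite Simpson rule (n must be even) -- a simple stand-in for a
# library integrator, just to watch the partial integrals settle down.
def simpson(g, a, b, n=200000):
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += g(a + i * h) * (4 if i % 2 else 2)
    return s * h / 3.0

I30 = simpson(f, 1.0, 30.0)
I60 = simpson(f, 1.0, 60.0)
print(I30, I60)
# The comparison test promises |I60 - I30| <= int_30^60 3/x^2 dx < 0.1.
```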

Example 5.5.7.

You should be careful when doing formal manipulations with improper integrals. The integral

\begin{equation*} \int_2^\infty \frac{2}{x^2-1}\,dx \end{equation*}

converges via the comparison test using \(\nicefrac{1}{x^2}\) again. However, if you succumb to the temptation to write

\begin{equation*} \frac{2}{x^2-1} = \frac{1}{x-1} - \frac{1}{x+1} \end{equation*}

and try to integrate each part separately, you will not succeed. It is not true that you can split the improper integral in two; you cannot split the limit.

\begin{equation*} \begin{split} \int_2^\infty \frac{2}{x^2-1} \,dx &= \lim_{b\to \infty} \int_2^b \frac{2}{x^2-1} \,dx \\ &= \lim_{b\to \infty} \left( \int_2^b \frac{1}{x-1}\,dx - \int_2^b \frac{1}{x+1}\,dx \right) \\ &\not= \int_2^\infty \frac{1}{x-1}\,dx - \int_2^\infty \frac{1}{x+1}\,dx . \end{split} \end{equation*}

The last line in the computation does not even make sense. Both of the integrals there diverge to infinity, since we can apply the comparison test appropriately with \(\nicefrac{1}{x}\text{.}\) We get \(\infty - \infty\text{.}\)
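Numerically (a sketch using the exact antiderivatives; the cutoffs are arbitrary), the combined integral settles at \(\ln 3\text{,}\) while each piece grows like \(\ln b\text{:}\)

```python
import math

# Exact antiderivatives: the combined integral from 2 to b is
#   log((b-1)/(b+1)) - log(1/3),
# while each split piece grows like log b and diverges on its own.
def combined(b):
    return math.log((b - 1) / (b + 1)) - math.log(1.0 / 3.0)

def piece1(b):   # int_2^b 1/(x-1) dx
    return math.log(b - 1)

def piece2(b):   # int_2^b 1/(x+1) dx
    return math.log(b + 1) - math.log(3.0)

for b in (1e3, 1e6, 1e9):
    print(b, combined(b), piece1(b), piece2(b))
# combined -> log 3; the two pieces -> infinity (their difference stays finite).
```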

Now suppose we need to take limits at both endpoints.

Definition 5.5.8.

Suppose \(f \colon (a,b) \to \R\) is a function that is Riemann integrable on \([c,d]\) for all \(c\text{,}\) \(d\) such that \(a < c < d < b\text{.}\) We define

\begin{equation*} \int_a^b f := \lim_{c \to a^+} \, \lim_{d \to b^-} \, \int_{c}^{d} f \end{equation*}

if the limits exist.

Suppose \(f \colon \R \to \R\) is a function such that \(f\) is Riemann integrable on all bounded intervals \([a,b]\text{.}\) Then we define

\begin{equation*} \int_{-\infty}^\infty f := \lim_{c \to -\infty} \, \lim_{d \to \infty} \, \int_c^d f \end{equation*}

if the limits exist.

We similarly define improper integrals with one infinite and one finite improper endpoint; we leave this to the reader.

One ought to always be careful about double limits. The definition given above says that we first take the limit as \(d\) goes to \(b\) or \(\infty\) for a fixed \(c\text{,}\) and then we take the limit in \(c\text{.}\) We will have to prove that in this case it does not matter which limit we compute first.

Example 5.5.9.

Let us see an example:

\begin{equation*} \int_{-\infty}^\infty \frac{1}{1+x^2} \, dx = \lim_{a \to -\infty} \, \lim_{b \to \infty} \, \int_{a}^b \frac{1}{1+x^2} \, dx = \lim_{a \to -\infty} \, \lim_{b \to \infty} \bigl( \arctan(b) - \arctan(a) \bigr) = \pi . \end{equation*}
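A quick numerical check of this computation, with large finite cutoffs standing in for the limits:

```python
import math

# int_a^b 1/(1+x^2) dx = arctan(b) - arctan(a).  Taking a -> -inf and
# b -> inf, the value tends to pi/2 - (-pi/2) = pi.
def integral(a, b):
    return math.atan(b) - math.atan(a)

print(integral(-1e8, 1e8))   # close to pi
```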

In the definition the order of the limits can always be switched if they exist. Let us prove this fact only for the infinite limits.


Proposition 5.5.10.

Suppose \(f \colon \R \to \R\) is Riemann integrable on every bounded interval \([a,b]\text{.}\) Then \(\lim\limits_{a \to -\infty} \, \lim\limits_{b \to \infty} \, \int_a^b f\) converges if and only if \(\lim\limits_{b \to \infty} \, \lim\limits_{a \to -\infty} \, \int_a^b f\) converges, in which case the two expressions are equal, \(\int_{-\infty}^\infty f\) converges, and

\begin{equation*} \int_{-\infty}^\infty f = \lim_{a \to \infty} \int_{-a}^a f . \end{equation*}

Without loss of generality, assume \(a < 0\) and \(b > 0\text{.}\) Suppose the first expression converges. Then

\begin{equation*} \begin{split} \lim_{a \to -\infty} \, \lim_{b \to \infty} \, \int_a^b f & = \lim_{a \to -\infty} \, \lim_{b \to \infty} \left( \int_a^0 f + \int_0^b f \right) = \left( \lim_{a \to -\infty} \int_a^0 f \right) + \left( \lim_{b \to \infty} \int_0^b f \right) \\ & = \lim_{b \to \infty} \left( \left( \lim_{a \to -\infty} \int_a^0 f \right) + \int_0^b f \right) = \lim_{b \to \infty} \, \lim_{a \to -\infty} \left( \int_a^0 f + \int_0^b f \right) . \end{split} \end{equation*}

A similar computation shows the other direction. Therefore, if either expression converges, then the improper integral converges and

\begin{equation*} \begin{split} \int_{-\infty}^\infty f = \lim_{a \to -\infty} \, \lim_{b \to \infty} \, \int_a^b f & = \left( \lim_{a \to -\infty} \int_a^0 f \right) + \left( \lim_{b \to \infty} \int_0^b f \right) \\ & = \left( \lim_{a \to \infty} \int_{-a}^0 f \right) + \left( \lim_{a \to \infty} \int_0^a f \right) = \lim_{a \to \infty} \left( \int_{-a}^0 f + \int_0^a f \right) = \lim_{a \to \infty} \int_{-a}^a f . \end{split} \end{equation*}

Example 5.5.11.

On the other hand, you must be careful to take the limits independently before you know convergence. Let \(f(x) = \frac{x}{\abs{x}}\) for \(x \not= 0\) and \(f(0) = 0\text{.}\) If \(a < 0\) and \(b > 0\text{,}\) then

\begin{equation*} \int_{a}^b f = \int_{a}^0 f + \int_{0}^b f = a+b . \end{equation*}

For every fixed \(a < 0\text{,}\) the limit as \(b \to \infty\) is infinite. So even the first limit does not exist, and the improper integral \(\int_{-\infty}^\infty f\) does not converge. On the other hand, if \(a > 0\text{,}\) then

\begin{equation*} \int_{-a}^{a} f = (-a)+a = 0 . \end{equation*}


Therefore,

\begin{equation*} \lim_{a\to\infty} \int_{-a}^{a} f = 0 . \end{equation*}
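Since \(\int_a^b f = a + b\) exactly, the two behaviors are easy to tabulate (a small sketch, not from the text):

```python
# f(x) = x/|x| for x != 0 and f(0) = 0, so int_a^b f = a + b when a < 0 < b.
def integral(a, b):
    assert a < 0 < b
    return a + b

# The symmetric limits vanish for every a ...
print(integral(-5.0, 5.0), integral(-1e9, 1e9))   # both 0.0
# ... yet for fixed a the inner limit in b already blows up,
# so the improper integral itself does not converge:
print(integral(-5.0, 1e9))
```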

Example 5.5.12.

An example to keep in mind for improper integrals is the so-called sinc function (shortened from Latin: sinus cardinalis). This function comes up quite often in both pure and applied mathematics. Define

\begin{equation*} \operatorname{sinc}(x) = \begin{cases} \frac{\sin(x)}{x} & \text{if } x \not= 0 , \\ 1 & \text{if } x = 0 . \end{cases} \end{equation*}

Figure 5.6. The sinc function.

It is not difficult to show that the sinc function is continuous at zero, but that is not important right now. What is important is that

\begin{equation*} \int_{-\infty}^\infty \operatorname{sinc}(x) \,dx = \pi , \qquad \text{while} \qquad \int_{-\infty}^\infty \abs{\operatorname{sinc}(x)} \,dx = \infty . \end{equation*}

The integral of the sinc function is a continuous analogue of the alternating harmonic series \(\sum \nicefrac{{(-1)}^n}{n}\text{,}\) while the absolute value is like the regular harmonic series \(\sum \nicefrac{1}{n}\text{.}\) In particular, the fact that the integral converges must be proved directly rather than by using the comparison test.

We will not prove the first statement exactly. Let us simply prove that the integral of the sinc function converges, but we will not worry about the exact limit. Because \(\frac{\sin(-x)}{-x} = \frac{\sin(x)}{x}\text{,}\) it is enough to show that

\begin{equation*} \int_{2\pi}^\infty \frac{\sin(x)}{x}\,dx \end{equation*}

converges. We also avoid \(x=0\) this way to make our life simpler.

For every \(n \in \N\text{,}\) we have that for \(x \in [\pi 2n, \pi (2n+1)]\text{,}\)

\begin{equation*} \frac{\sin(x)}{\pi (2n+1)} \leq \frac{\sin(x)}{x} \leq \frac{\sin(x)}{\pi 2n} , \end{equation*}

as \(\sin(x) \geq 0\text{.}\) For \(x \in [\pi (2n+1), \pi (2n+2)]\text{,}\)

\begin{equation*} \frac{\sin(x)}{\pi (2n+1)} \leq \frac{\sin(x)}{x} \leq \frac{\sin(x)}{\pi (2n+2)} , \end{equation*}

as \(\sin(x) \leq 0\text{.}\)

Via the fundamental theorem of calculus,

\begin{equation*} \frac{2}{\pi (2n+1)} = \int_{\pi 2n}^{\pi (2n+1)} \frac{\sin(x)}{\pi (2n+1)} \,dx \leq \int_{\pi 2n}^{\pi (2n+1)} \frac{\sin(x)}{x} \,dx \leq \int_{\pi 2n}^{\pi (2n+1)} \frac{\sin(x)}{\pi 2n} \,dx = \frac{1}{\pi n} . \end{equation*}


Similarly,

\begin{equation*} \frac{-2}{\pi (2n+1)} \leq \int_{\pi (2n+1)}^{\pi (2n+2)} \frac{\sin(x)}{x} \,dx \leq \frac{-1}{\pi (n+1)} . \end{equation*}

Adding the two together we find

\begin{equation*} 0 = \frac{2}{\pi (2n+1)} + \frac{-2}{\pi (2n+1)} \leq \int_{2\pi n}^{2\pi (n+1)} \frac{\sin(x)}{x} \,dx \leq \frac{1}{\pi n} + \frac{-1}{\pi (n+1)} = \frac{1}{\pi n(n+1)} . \end{equation*}

See Figure 5.7.

Figure 5.7. Bound of \(\int_{2\pi n}^{2\pi (n+1)} \frac{\sin(x)}{x} \,dx\) using the shaded integral (signed area \(\frac{1}{\pi n} + \frac{-1}{\pi (n+1)}\)).

For \(k \in \N\text{,}\)

\begin{equation*} \int_{2\pi}^{2k\pi} \frac{\sin(x)}{x} \,dx = \sum_{n=1}^{k-1} \int_{2\pi n}^{2\pi (n+1)} \frac{\sin(x)}{x} \,dx \leq \sum_{n=1}^{k-1} \frac{1}{\pi n(n+1)} . \end{equation*}

These are partial sums of a series of nonnegative terms, bounded above by the convergent series \(\sum \frac{1}{\pi n (n+1)}\text{.}\) An increasing bounded sequence converges. Thus, as a sequence,

\begin{equation*} \lim_{k\to \infty} \int_{2\pi}^{2k\pi} \frac{\sin(x)}{x} \,dx =L \leq \sum_{n=1}^{\infty} \frac{1}{\pi n(n+1)} < \infty . \end{equation*}

Let \(M > 2\pi\) be arbitrary, and let \(k \in \N\) be the largest integer such that \(2k\pi \leq M\text{.}\) For \(x \in [2k\pi,M]\text{,}\) we have \(\frac{-1}{2k\pi} \leq \frac{\sin(x)}{x} \leq \frac{1}{2k\pi}\text{,}\) and so

\begin{equation*} \abs{\int_{2k\pi}^{M} \frac{\sin(x)}{x} \,dx } \leq \frac{M-2k\pi}{2k\pi} \leq \frac{1}{k} . \end{equation*}

Since \(k\) is the largest integer such that \(2k\pi \leq M\text{,}\) as \(M \in \R\) goes to infinity, so does \(k \in \N\text{.}\)


Write

\begin{equation*} \int_{2\pi}^M \frac{\sin(x)}{x}\,dx = \int_{2\pi}^{2k\pi} \frac{\sin(x)}{x} \,dx + \int_{2k\pi}^{M} \frac{\sin(x)}{x} \,dx . \end{equation*}

As \(M\) goes to infinity, the first term on the right-hand side goes to \(L\text{,}\) and the second term on the right-hand side goes to zero. Hence

\begin{equation*} \int_{2\pi}^\infty \frac{\sin(x)}{x} \,dx = L . \end{equation*}

The double-sided integral of sinc also exists as noted above. We leave the other statement—that the integral of the absolute value of the sinc function diverges—as an exercise.
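The bounds above can be checked numerically (a sketch; the Simpson rule and the values of \(k\) are arbitrary choices, and \(\sum_n \frac{1}{\pi n(n+1)} = \nicefrac{1}{\pi}\) by telescoping):

```python
import math

def sinc(x):
    return math.sin(x) / x if x != 0 else 1.0

# Composite Simpson rule (n even), enough accuracy for a sanity check.
def simpson(g, a, b, n):
    h = (b - a) / n
    s = g(a) + g(b)
    for i in range(1, n):
        s += g(a + i * h) * (4 if i % 2 else 2)
    return s * h / 3.0

# Partial integrals over whole periods, int_{2 pi}^{2 k pi} sinc:
# by the estimates above they are nondecreasing and bounded above by
# sum_n 1/(pi n (n+1)) = 1/pi.
partials = [simpson(sinc, 2 * math.pi, 2 * k * math.pi, 4000 * k)
            for k in (2, 5, 20, 50)]
print(partials)
```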

Subsection 5.5.1 Integral test for series

The fundamental theorem of calculus can be used to prove that a series is summable and to estimate its sum.

Proposition 5.5.13.

Suppose \(f \colon [k,\infty) \to \R\text{,}\) where \(k \in \Z\text{,}\) is a decreasing nonnegative function. Then \(\sum\limits_{n=k}^\infty f(n)\) converges if and only if \(\int_k^\infty f\) converges, in which case

\begin{equation*} \int_k^\infty f \leq \sum_{n=k}^\infty f(n) \leq f(k) + \int_k^\infty f . \end{equation*}

See Figure 5.8 for an illustration with \(k=1\text{.}\) By Proposition 5.2.11, \(f\) is integrable on every interval \([k,b]\) for all \(b > k\text{,}\) so the statement of the theorem makes sense without additional hypotheses of integrability.

Figure 5.8. The area under the curve, \(\int_1^\infty f\text{,}\) is bounded below by the area of the shaded rectangles, \(f(2)+f(3)+f(4)+\cdots\text{,}\) and bounded above by the area of the entire rectangles, \(f(1)+f(2)+f(3)+\cdots\text{.}\)


Let \(\ell, m \in \Z\) be such that \(m > \ell \geq k\text{.}\) Because \(f\) is decreasing, we have \(\int_{n}^{n+1} f \leq f(n) \leq \int_{n-1}^{n} f\text{.}\) Therefore,

\begin{equation} \int_\ell^m f = \sum_{n=\ell}^{m-1} \int_{n}^{n+1} f \leq \sum_{n=\ell}^{m-1} f(n) \leq f(\ell) + \sum_{n=\ell+1}^{m-1} \int_{n-1}^{n} f \leq f(\ell)+ \int_\ell^{m-1} f .\tag{5.3} \end{equation}

Suppose first that \(\int_k^\infty f\) converges and let \(\epsilon > 0\) be given. As before, since \(f\) is nonnegative, there exists an \(L \in \N\) such that if \(\ell \geq L\text{,}\) then \(\int_\ell^{m} f < \nicefrac{\epsilon}{2}\) for all \(m \geq \ell\text{.}\) The function \(f\) must decrease to zero (why?), so make \(L\) large enough so that for \(\ell \geq L\text{,}\) we have \(f(\ell) < \nicefrac{\epsilon}{2}\text{.}\) Thus, for \(m > \ell \geq L\text{,}\) we have via (5.3),

\begin{equation*} \sum_{n=\ell}^{m} f(n) \leq f(\ell)+ \int_\ell^{m} f < \nicefrac{\epsilon}{2} + \nicefrac{\epsilon}{2} = \epsilon . \end{equation*}

The series is therefore Cauchy and thus converges. The estimate in the proposition is obtained by letting \(m\) go to infinity in (5.3) with \(\ell = k\text{.}\)

Conversely, suppose \(\int_k^\infty f\) diverges. As \(f\) is positive, then by Proposition 5.5.4, the sequence \(\{ \int_k^m f \}_{m=k}^\infty\) diverges to infinity. Using (5.3) with \(\ell = k\text{,}\) we find

\begin{equation*} \int_k^m f \leq \sum_{n=k}^{m-1} f(n) . \end{equation*}

As the left-hand side goes to infinity as \(m \to \infty\text{,}\) so does the right-hand side.
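For instance (a standard example, not worked in the text), taking \(f(x) = \nicefrac{1}{x}\) and \(\ell = k = 1\) in (5.3) gives \(\ln m \leq \sum_{n=1}^{m-1} \nicefrac{1}{n}\text{,}\) so the harmonic series diverges along with \(\int_1^\infty \nicefrac{1}{x}\,dx\text{:}\)

```python
import math

# f(x) = 1/x is decreasing and nonnegative; the left inequality of (5.3)
# with l = 1 reads  log m = int_1^m 1/x dx <= sum_{n=1}^{m-1} 1/n,
# so the partial sums of the harmonic series are unbounded.
def harmonic_partial(m):
    return sum(1.0 / n for n in range(1, m))

for m in (10, 1000, 100000):
    print(math.log(m), harmonic_partial(m))
```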

Example 5.5.14.

The integral test can be used not only to show that a series converges, but to estimate its sum to arbitrary precision. Let us show \(\sum_{n=1}^\infty \frac{1}{n^2}\) exists and estimate its sum to within 0.01. As this series is the \(p\)-series for \(p=2\text{,}\) we already proved it converges (let us pretend we do not know that), but we only roughly estimated its sum.

The fundamental theorem of calculus says that for all \(k \in \N\text{,}\)

\begin{equation*} \int_{k}^\infty \frac{1}{x^2}\,dx = \frac{1}{k} . \end{equation*}

In particular, the series must converge. But we also have

\begin{equation*} \frac{1}{k} = \int_k^\infty \frac{1}{x^2}\,dx \leq \sum_{n=k}^\infty \frac{1}{n^2} \leq \frac{1}{k^2} + \int_k^\infty \frac{1}{x^2}\,dx = \frac{1}{k^2} + \frac{1}{k} . \end{equation*}

Adding the partial sum up to \(k-1\) we get

\begin{equation*} \frac{1}{k} + \sum_{n=1}^{k-1} \frac{1}{n^2} \leq \sum_{n=1}^\infty \frac{1}{n^2} \leq \frac{1}{k^2} + \frac{1}{k} + \sum_{n=1}^{k-1} \frac{1}{n^2} . \end{equation*}

In other words, \(\nicefrac{1}{k} + \sum_{n=1}^{k-1} \nicefrac{1}{n^2}\) is an estimate for the sum to within \(\nicefrac{1}{k^2}\text{.}\) Therefore, if we wish to find the sum to within 0.01, we note \(\nicefrac{1}{{10}^2} = 0.01\text{.}\) We obtain

\begin{equation*} 1.6397\ldots \approx \frac{1}{10} + \sum_{n=1}^{9} \frac{1}{n^2} \leq \sum_{n=1}^\infty \frac{1}{n^2} \leq \frac{1}{100} + \frac{1}{10} + \sum_{n=1}^{9} \frac{1}{n^2} \approx 1.6497\ldots . \end{equation*}

The actual sum is \(\nicefrac{\pi^2}{6} \approx 1.6449\ldots\text{.}\)
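The arithmetic of the example is easy to verify numerically (a sketch, with \(k = 10\) as in the text):

```python
import math

# The estimate from the example with k = 10:
#   1/k + sum_{n=1}^{k-1} 1/n^2 <= sum_{n=1}^inf 1/n^2
#                               <= 1/k^2 + 1/k + sum_{n=1}^{k-1} 1/n^2
k = 10
partial = sum(1.0 / n ** 2 for n in range(1, k))
lower = 1.0 / k + partial
upper = 1.0 / k ** 2 + lower
print(lower, upper)          # about 1.6397... and 1.6497...
print(math.pi ** 2 / 6)      # about 1.6449..., inside the bracket
```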

Subsection 5.5.2 Exercises

Exercise 5.5.2.

Determine for which \(a \in \R\) the series \(\sum\limits_{n=1}^\infty e^{an}\) converges. When the series converges, find an upper bound for the sum.

Exercise 5.5.3.

  1. Estimate \(\sum\limits_{n=1}^\infty \frac{1}{n(n+1)}\) correct to within 0.01 using the integral test.

  2. Compute the limit of the series exactly and compare. Hint: The sum telescopes.

Exercise 5.5.4.


Show that

\begin{equation*} \int_{-\infty}^\infty \abs{\operatorname{sinc}(x)}\,dx = \infty . \end{equation*}

Hint: Again, it is enough to show this on just one side.

Exercise 5.5.5.

Can you interpret

\begin{equation*} \int_{-1}^1 \frac{1}{\sqrt{\abs{x}}}\,dx \end{equation*}

as an improper integral? If so, compute its value.

Exercise 5.5.6.

Take \(f \colon [0,\infty) \to \R\text{,}\) Riemann integrable on every interval \([0,b]\text{,}\) and such that there exist \(M\text{,}\) \(a\text{,}\) and \(T\text{,}\) such that \(\abs{f(t)} \leq M e^{at}\) for all \(t \geq T\text{.}\) Show that the Laplace transform of \(f\) exists. That is, for every \(s > a\) the following integral converges:

\begin{equation*} F(s) := \int_{0}^\infty f(t) e^{-st} \,dt . \end{equation*}

Exercise 5.5.7.

Let \(f \colon \R \to \R\) be a Riemann integrable function on every interval \([a,b]\text{,}\) and such that \(\int_{-\infty}^\infty \abs{f(x)}\,dx < \infty\text{.}\) Show that the Fourier sine and cosine transforms exist. That is, for every \(\omega \geq 0\) the following integrals converge

\begin{equation*} F^s(\omega) := \frac{1}{\pi} \int_{-\infty}^\infty f(t) \sin(\omega t) \,dt , \qquad F^c(\omega) := \frac{1}{\pi} \int_{-\infty}^\infty f(t) \cos(\omega t) \,dt . \end{equation*}

Furthermore, show that \(F^s\) and \(F^c\) are bounded functions.

Exercise 5.5.8.

Suppose \(f \colon [0,\infty) \to \R\) is Riemann integrable on every interval \([0,b]\text{.}\) Show that \(\int_0^\infty f\) converges if and only if for every \(\epsilon > 0\) there exists an \(M\) such that if \(M \leq a < b\text{,}\) then \(\babs{\int_a^b f} < \epsilon\text{.}\)

Exercise 5.5.9.

Suppose \(f \colon [0,\infty) \to \R\) is nonnegative and decreasing. Prove:

  1. If \(\int_0^\infty f < \infty\text{,}\) then \(\lim\limits_{x\to\infty} f(x) = 0\text{.}\)

  2. The converse does not hold.

Exercise 5.5.10.

Find an example of an unbounded continuous function \(f \colon [0,\infty) \to \R\) that is nonnegative and such that \(\int_0^\infty f < \infty\text{.}\) Note that \(\lim_{x\to\infty} f(x)\) will not exist; compare previous exercise. Hint: On each interval \([k,k+1]\text{,}\) \(k \in \N\text{,}\) define a function whose integral over this interval is less than say \(2^{-k}\text{.}\)

Exercise 5.5.11.

(More challenging)   Find an example of a function \(f \colon [0,\infty) \to \R\) integrable on all intervals such that \(\lim_{n\to\infty} \int_0^n f\) converges as a limit of a sequence (so \(n \in \N\)), but such that \(\int_0^\infty f\) does not exist. Hint: For all \(n\in \N\text{,}\) divide \([n,n+1]\) into two halves. On one half make the function negative, on the other make the function positive.

Exercise 5.5.12.

Suppose \(f \colon [1,\infty) \to \R\) is such that \(g(x) := x^2 f(x)\) is a bounded function. Prove that \(\int_1^\infty f\) converges.

It is sometimes desirable to assign a value to integrals that normally cannot be interpreted even as improper integrals, e.g. \(\int_{-1}^1 \nicefrac{1}{x}\,dx\text{.}\) Suppose \(f \colon [a,b] \to \R\) is a function and \(a < c < b\text{,}\) where \(f\) is Riemann integrable on the intervals \([a,c-\epsilon]\) and \([c+\epsilon,b]\) for all \(\epsilon > 0\text{.}\) Define the Cauchy principal value of \(\int_a^b f\) as

\begin{equation*} p.v.\!\int_a^b f := \lim_{\epsilon\to 0^+} \left( \int_a^{c-\epsilon} f + \int_{c+\epsilon}^b f \right) , \end{equation*}

if the limit exists.

Exercise 5.5.13.

  1. Compute \(p.v.\!\int_{-1}^1 \nicefrac{1}{x}\,dx\text{.}\)

  2. Compute \(\lim_{\epsilon\to 0^+} ( \int_{-1}^{-\epsilon} \nicefrac{1}{x}\,dx + \int_{2\epsilon}^1 \nicefrac{1}{x}\,dx )\) and show it is not equal to the principal value.

  3. Show that if \(f\) is integrable on \([a,b]\text{,}\) then \(p.v.\!\int_a^b f = \int_a^b f\) (for an arbitrary \(c \in (a,b)\)).

  4. Suppose \(f \colon [-1,1] \to \R\) is an odd function (\(f(-x)=-f(x)\)) that is integrable on \([-1,-\epsilon]\) and \([\epsilon,1]\) for all \(\epsilon >0\text{.}\) Prove that \(p.v.\!\int_{-1}^1 f = 0\text{.}\)

  5. Suppose \(f \colon [-1,1] \to \R\) is continuous and differentiable at 0. Show that \(p.v.\!\int_{-1}^1 \frac{f(x)}{x}\,dx\) exists.

Exercise 5.5.14.

Let \(f \colon \R \to \R\) and \(g \colon \R \to \R\) be continuous functions, where \(g(x) = 0\) for all \(x \notin [a,b]\) for some interval \([a,b]\text{.}\)

  1. Show that the convolution

    \begin{equation*} (g * f)(x) := \int_{-\infty}^\infty f(t)g(x-t)\,dt \end{equation*}

    is well-defined for all \(x \in \R\text{.}\)

  2. Suppose \(\int_{-\infty}^\infty \abs{f(x)}\,dx < \infty\text{.}\) Prove that

    \begin{equation*} \lim_{x \to -\infty} (g * f)(x) = 0, \qquad \text{and} \qquad \lim_{x \to \infty} (g * f)(x) = 0 . \end{equation*}