
\(\require{cancel}\newcommand{\nicefrac}[2]{{{}^{#1}}\!/\!{{}_{#2}}}
\newcommand{\unitfrac}[3][\!\!]{#1 \,\, {{}^{#2}}\!/\!{{}_{#3}}}
\newcommand{\unit}[2][\!\!]{#1 \,\, #2}
\newcommand{\noalign}[1]{}
\newcommand{\qed}{\qquad \Box}
\newcommand{\lt}{<}
\newcommand{\gt}{>}
\newcommand{\amp}{&}
\)

*less than 1 lecture, first part of §3.1 in [EP], parts of §3.1 and §3.2 in [BD]*

Let us consider the general *second order linear differential equation*

\begin{equation*}
A(x) y'' + B(x)y' + C(x)y = F(x) .
\end{equation*}

We usually divide through by \(A(x)\) to get

\begin{equation}
y'' + p(x)y' + q(x)y = f(x) ,\label{sol_eqlin}\tag{1}
\end{equation}

where \(p(x) = \nicefrac{B(x)}{A(x)}\text{,}\) \(q(x) = \nicefrac{C(x)}{A(x)}\text{,}\) and \(f(x) = \nicefrac{F(x)}{A(x)}\text{.}\) The word *linear* means that the equation contains no powers nor other functions of \(y\text{,}\) \(y'\text{,}\) and \(y''\text{.}\)

In the special case when \(f(x) = 0\text{,}\) we have a so-called *homogeneous* equation

\begin{equation}
y'' + p(x)y' + q(x)y = 0 .\label{sol_eqlinhom}\tag{2}
\end{equation}

We have already seen some second order linear homogeneous equations.

\begin{equation*}
\begin{aligned}
\qquad y'' + k^2 y & = 0 &
& \text{Two solutions are:} \quad y_1 = \cos (kx), \quad y_2 = \sin(kx) . \qquad \\
\qquad y'' - k^2 y & = 0 &
& \text{Two solutions are:} \quad y_1 = e^{kx}, \quad y_2 = e^{-kx} . \qquad
\end{aligned}
\end{equation*}
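Let us verify the first pair directly. If \(y_1 = \cos(kx)\text{,}\) then \(y_1' = -k \sin(kx)\) and \(y_1'' = -k^2 \cos(kx)\text{,}\) so

\begin{equation*}
y_1'' + k^2 y_1 = -k^2 \cos(kx) + k^2 \cos(kx) = 0 .
\end{equation*}

The checks for \(y_2 = \sin(kx)\) and for the two exponential solutions are just as quick.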

If we know two solutions of a linear homogeneous equation, we know many more of them.

###### Theorem 2.1.1 Superposition

Suppose \(y_1\) and \(y_2\) are two solutions of the homogeneous equation (2). Then

\begin{equation*}
y(x) = C_1 y_1(x) + C_2 y_2(x) ,
\end{equation*}

also solves (2) for arbitrary constants \(C_1\) and \(C_2\text{.}\)

That is, we can add solutions together and multiply them by constants to obtain new and different solutions. We call the expression \(C_1 y_1 + C_2 y_2\) a *linear combination* of \(y_1\) and \(y_2\text{.}\) Let us prove this theorem; the proof is very enlightening and illustrates how linear equations work.

Proof: Let \(y = C_1 y_1 + C_2 y_2\text{.}\) Then

\begin{equation*}
\begin{split}
y'' + py' + qy & =
(C_1 y_1 + C_2 y_2)'' + p(C_1 y_1 + C_2 y_2)' + q(C_1 y_1 + C_2 y_2) \\
& = C_1 y_1'' + C_2 y_2'' + C_1 p y_1' + C_2 p y_2' + C_1 q y_1 + C_2 q y_2 \\
& = C_1 ( y_1'' + p y_1' + q y_1 ) + C_2 ( y_2'' + p y_2' + q y_2 ) \\
& = C_1 \cdot 0 + C_2 \cdot 0 = 0 . \qed
\end{split}
\end{equation*}

The proof becomes even simpler to state if we use the operator notation. An *operator* is an object that eats functions and spits out functions (kind of like what a function is, but a function eats numbers and spits out numbers). Define the operator \(L\) by

\begin{equation*}
Ly = y'' + py' + qy .
\end{equation*}

The differential equation now becomes \(Ly=0\text{.}\) The operator (and the equation) \(L\) being *linear* means that \(L(C_1y_1 + C_2y_2) =
C_1 Ly_1 + C_2 Ly_2\text{.}\) The proof above becomes

\begin{equation*}
Ly = L(C_1y_1 + C_2y_2) =
C_1 Ly_1 + C_2 Ly_2 = C_1 \cdot 0 + C_2 \cdot 0 = 0 .
\end{equation*}

Two different solutions to the second equation \(y'' - k^2y = 0\) are \(y_1 = \cosh (kx)\) and \(y_2 = \sinh (kx)\text{.}\) Let us remind ourselves of the definitions, \(\cosh x = \frac{e^x + e^{-x}}{2}\) and \(\sinh x = \frac{e^x - e^{-x}}{2}\text{.}\) By superposition, these are solutions, as they are linear combinations of the two exponential solutions.
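Explicitly, with \(y_1 = e^{kx}\) and \(y_2 = e^{-kx}\text{,}\)

\begin{equation*}
\cosh (kx) = \tfrac{1}{2}\, y_1 + \tfrac{1}{2}\, y_2 , \qquad
\sinh (kx) = \tfrac{1}{2}\, y_1 - \tfrac{1}{2}\, y_2 .
\end{equation*}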

The functions \(\sinh\) and \(\cosh\) are sometimes more convenient to use than the exponential. Let us review some of their properties.

\begin{equation*}
\begin{aligned}
& \cosh 0 = 1 & & \sinh 0 = 0 \\
& \frac{d}{dx} \cosh x = \sinh x & & \frac{d}{dx} \sinh x = \cosh x \\
& \cosh^2 x - \sinh^2 x = 1
\end{aligned}
\end{equation*}

Derive these properties using the definitions of \(\sinh\) and \(\cosh\) in terms of exponentials.

Linear equations have nice and simple answers to the existence and uniqueness question.

Suppose \(p, q, f\) are continuous functions on some interval \(I\text{,}\) \(a\) is a number in \(I\text{,}\) and \(b_0, b_1\) are constants. The equation

\begin{equation*}
y'' + p(x) y' + q(x) y = f(x) ,
\end{equation*}

has exactly one solution \(y(x)\) defined on the same interval \(I\) satisfying the initial conditions

\begin{equation*}
y(a) = b_0 , \qquad y'(a) = b_1 .
\end{equation*}

For example, the equation \(y'' + k^2 y = 0\) with \(y(0) = b_0\) and \(y'(0) = b_1\) has the solution

\begin{equation*}
y(x) = b_0 \cos (kx) + \frac{b_1}{k} \sin (kx) .
\end{equation*}
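Let us verify the initial conditions. First, \(y(0) = b_0 \cos 0 + \frac{b_1}{k} \sin 0 = b_0\text{.}\) Next,

\begin{equation*}
y'(x) = -b_0 k \sin (kx) + b_1 \cos (kx) ,
\end{equation*}

so \(y'(0) = b_1\) as required.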

The equation \(y'' - k^2 y = 0\) with \(y(0) = b_0\) and \(y'(0) = b_1\) has the solution

\begin{equation*}
y(x) = b_0 \cosh (kx) + \frac{b_1}{k} \sinh (kx) .
\end{equation*}

Using \(\cosh\) and \(\sinh\) in this solution allows us to solve for the initial conditions in a cleaner way than if we had used the exponentials.
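To see why, write the same solution as \(y = C_1 e^{kx} + C_2 e^{-kx}\text{.}\) The initial conditions give the system \(C_1 + C_2 = b_0\) and \(k C_1 - k C_2 = b_1\text{,}\) which we must solve to find

\begin{equation*}
C_1 = \frac{1}{2} \left( b_0 + \frac{b_1}{k} \right) , \qquad
C_2 = \frac{1}{2} \left( b_0 - \frac{b_1}{k} \right) .
\end{equation*}

With \(\cosh\) and \(\sinh\text{,}\) the constants \(b_0\) and \(\nicefrac{b_1}{k}\) can be read off directly.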

The initial conditions for a second order ODE consist of two equations. Common sense tells us that if we have two arbitrary constants and two equations, then we should be able to solve for the constants and find a solution to the differential equation satisfying the initial conditions.

Question: Suppose we find two different solutions \(y_1\) and \(y_2\) to the homogeneous equation (2). Can every solution be written (using superposition) in the form \(y = C_1 y_1 + C_2 y_2\text{?}\)

The answer is affirmative, provided that \(y_1\) and \(y_2\) are different enough in the following sense. We say \(y_1\) and \(y_2\) are *linearly independent* if one is not a constant multiple of the other.

Let \(p, q\) be continuous functions. Let \(y_1\) and \(y_2\) be two linearly independent solutions to the homogeneous equation (2). Then every other solution is of the form

\begin{equation*}
y = C_1 y_1 + C_2 y_2 .
\end{equation*}

That is, \(y = C_1 y_1 + C_2 y_2\) is the general solution.

For example, we found the solutions \(y_1 = \sin x\) and \(y_2 = \cos x\) for the equation \(y'' + y = 0\text{.}\) It is not hard to see that sine and cosine are not constant multiples of each other. If \(\sin x = A \cos x\) for some constant \(A\text{,}\) we let \(x=0\) and this would imply \(A = 0\text{.}\) But then \(\sin x = 0\) for all \(x\text{,}\) which is preposterous. So \(y_1\) and \(y_2\) are linearly independent. Hence

\begin{equation*}
y = C_1 \cos x + C_2 \sin x
\end{equation*}

is the general solution to \(y'' + y = 0\text{.}\)

We will study the solution of nonhomogeneous equations in Section 2.5. We will first focus on finding general solutions to homogeneous equations.

Show that \(y=e^x\) and \(y=e^{2x}\) are linearly independent.

Take \(y'' + 5 y = 10 x + 5\text{.}\) Find (guess!) a solution.

Prove the superposition principle for nonhomogeneous equations. Suppose that \(y_1\) is a solution to \(L y_1 = f(x)\) and \(y_2\) is a solution to \(L y_2 = g(x)\) (same linear operator \(L\)). Show that \(y = y_1+y_2\) solves \(Ly = f(x) + g(x)\text{.}\)

For the equation \(x^2 y'' - x y' = 0\text{,}\) find two solutions, show that they are linearly independent, and find the general solution. Hint: Try \(y = x^r\text{.}\)

Equations of the form \(a x^2 y'' + b x y' + c y = 0\) are called *Euler's equations* or *Cauchy-Euler equations*. They are solved by trying \(y=x^r\) and solving for \(r\) (assume that \(x \geq 0\) for simplicity).
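To see where the equation for \(r\) comes from, plug \(y = x^r\text{,}\) \(y' = r x^{r-1}\text{,}\) \(y'' = r(r-1) x^{r-2}\) into the equation:

\begin{equation*}
a r(r-1) x^r + b r x^r + c x^r = \bigl( a r^2 + (b-a) r + c \bigr) x^r = 0 ,
\end{equation*}

so \(r\) must satisfy \(a r^2 + (b-a) r + c = 0\text{.}\) The quantity \({(b-a)}^2-4ac\) appearing below is the discriminant of this quadratic.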

Suppose that \({(b-a)}^2-4ac > 0\text{.}\) a) Find a formula for the general solution of \(a x^2 y'' + b x y' + c y = 0\text{.}\) Hint: Try \(y=x^r\) and find a formula for \(r\text{.}\) b) What happens when \({(b-a)}^2-4ac = 0\) or \({(b-a)}^2-4ac < 0\text{?}\)

We will revisit the case when \({(b-a)}^2-4ac < 0\) later.

Same equation as in Exercise 2.1.6. Suppose \({(b-a)}^2-4ac = 0\text{.}\) Find a formula for the general solution of \(a x^2 y'' + b x y' + c y = 0\text{.}\) Hint: Try \(y=x^r \ln x\) for the second solution.

If you have one solution to a second order linear homogeneous equation you can find another one. This is the *reduction of order method*.

Suppose \(y_1\) is a solution to \(y'' + p(x) y' + q(x) y = 0\text{.}\) Show that

\begin{equation*}
y_2(x) = y_1(x) \int \frac{e^{-\int p(x)\,dx}}{{\bigl(y_1(x)\bigr)}^2} ~dx
\end{equation*}

is also a solution.

Note: If you wish to come up with the formula for reduction of order yourself, start by trying \(y_2(x) = y_1(x) v(x)\text{.}\) Then plug \(y_2\) into the equation, use the fact that \(y_1\) is a solution, substitute \(w = v'\text{,}\) and you have a first order linear equation in \(w\text{.}\) Solve for \(w\) and then for \(v\text{.}\) When solving for \(w\text{,}\) make sure to include a constant of integration. Let us solve some famous equations using the method.

Take \((1-x^2)y''-xy' + y = 0\text{.}\) a) Show that \(y=x\) is a solution. b) Use reduction of order to find a second linearly independent solution. c) Write down the general solution.

Take \(y''-2xy' + 4y = 0\text{.}\) a) Show that \(y=1-2x^2\) is a solution. b) Use reduction of order to find a second linearly independent solution. c) Write down the general solution.

Are \(\sin(x)\) and \(e^x\) linearly independent? Justify.

Answer

Yes. To justify, try to find a constant \(A\) such that \(\sin(x) = A e^x\) for all \(x\text{,}\) and show that no such \(A\) exists.

Are \(e^x\) and \(e^{x+2}\) linearly independent? Justify.

Answer

No. \(e^{x+2} = e^2 e^x\text{.}\)

Guess a solution to \(y'' + y' + y= 5\text{.}\)

Answer

\(y=5\)

Find the general solution to \(x y'' + y' = 0\text{.}\) Hint: Notice that it is a first order ODE in \(y'\text{.}\)

Answer

\(y=C_1 \ln(x) + C_2\)

Write down an equation (guess) for which we have the solutions \(e^x\) and \(e^{2x}\text{.}\) Hint: Try an equation of the form \(y''+Ay'+By = 0\) for constants \(A\) and \(B\text{,}\) plug in both \(e^x\) and \(e^{2x}\) and solve for \(A\) and \(B\text{.}\)

Answer

\(y''-3y'+2y = 0\)