

Note: 2 lectures, §5.2 in [EP], part of §7.3, §7.5, and §7.6 in [BD]

In this section we will learn how to solve linear homogeneous constant coefficient systems of ODEs by the eigenvalue method. Suppose we have such a system

\[ \vec{x}\,' = P \vec{x} , \]

where $P$ is a constant square matrix. We wish to adapt the method for the single constant coefficient equation by trying the function $e^{\lambda t}$. However, $\vec{x}$ is a vector. So we try $\vec{x} = \vec{v} e^{\lambda t}$, where $\vec{v}$ is an arbitrary constant vector. We plug this into the equation to get

\[ \underbrace{\lambda \vec{v} e^{\lambda t}}_{\vec{x}\,'} = \underbrace{P \vec{v} e^{\lambda t}}_{P \vec{x}} . \]

We divide by $e^{\lambda t}$ and notice that we are looking for a scalar $\lambda$ and a vector $\vec{v}$ that satisfy the equation

\[ \lambda \vec{v} = P \vec{v} . \]

To solve this equation we need a little bit more linear algebra, which we now review.

Let $A$ be a constant square matrix. Suppose there is a scalar $\lambda$ and a nonzero vector $\vec{v}$ such that

\[ A \vec{v} = \lambda \vec{v} . \]

We then call $\lambda$ an eigenvalue of $A$ and $\vec{v}$ is said to be a corresponding eigenvector.

Let us see how to compute eigenvalues for any matrix. Rewrite the equation for an eigenvalue as

\[ (A - \lambda I)\vec{v} = \vec{0} . \]

This equation has a nonzero solution $\vec{v}$ only if $A - \lambda I$ is not invertible. Were it invertible, we could write $\vec{v} = (A - \lambda I)^{-1}\vec{0}$, which implies $\vec{v} = \vec{0}$. Therefore, $A$ has the eigenvalue $\lambda$ if and only if $\lambda$ solves the equation

\[ \det(A - \lambda I) = 0 . \]

Consequently, we will be able to find an eigenvalue of $A$ without finding a corresponding eigenvector. An eigenvector will have to be found later, once $\lambda$ is known.

For an $n \times n$ matrix, the polynomial we get by computing $\det(A - \lambda I)$ is of degree $n$, and hence in general, we have $n$ eigenvalues. Some may be repeated, some may be complex.
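For a $2 \times 2$ matrix the characteristic polynomial is simply $\lambda^2 - (\operatorname{tr} A)\lambda + \det A$, so its roots can be found with the quadratic formula. A minimal numerical sketch (the helper `eigenvalues_2x2` is our own, not from the text; `cmath.sqrt` is used so that complex eigenvalues come out correctly):

```python
import cmath

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]], i.e. the roots of the
    characteristic polynomial lambda^2 - (a + d)*lambda + (a*d - b*c)."""
    trace = a + d
    det = a * d - b * c
    disc = cmath.sqrt(trace**2 - 4 * det)   # may be complex
    return (trace + disc) / 2, (trace - disc) / 2

# The polynomial has degree 2, so we always get two eigenvalues
# (possibly repeated, possibly complex).
print(eigenvalues_2x2(2, 1, 1, 2))   # real eigenvalues 3 and 1
print(eigenvalues_2x2(1, 1, -1, 1))  # complex eigenvalues 1+i and 1-i
```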

To find an eigenvector $\vec{v}$ corresponding to an eigenvalue $\lambda$, we write

\[ (A - \lambda I)\vec{v} = \vec{0} , \]

and solve for a nontrivial (nonzero) vector $\vec{v}$. If $\lambda$ is an eigenvalue, there will be at least one free variable, and so for each distinct eigenvalue $\lambda$, we can always find an eigenvector.

Example 3.4.3: Find an eigenvector of $\begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 0 \\ 0 & 0 & 2 \end{bmatrix}$ corresponding to the eigenvalue $\lambda = 3$.

We write

\[ (A - 3I)\vec{v} = \begin{bmatrix} -1 & 1 & 1 \\ 1 & -1 & 0 \\ 0 & 0 & -1 \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix} = \vec{0} . \]

It is easy to solve this system of linear equations. We write down the augmented matrix

\[ \left[ \begin{array}{ccc|c} -1 & 1 & 1 & 0 \\ 1 & -1 & 0 & 0 \\ 0 & 0 & -1 & 0 \end{array} \right] , \]

and perform row operations (exercise: which ones?) until we get:

\[ \left[ \begin{array}{ccc|c} 1 & -1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 \end{array} \right] . \]

The entries of $\vec{v}$ have to satisfy the equations $v_1 - v_2 = 0$, $v_3 = 0$, and $v_2$ is a free variable. We can pick $v_2$ to be arbitrary (but nonzero), let $v_1 = v_2$, and of course $v_3 = 0$. For example, if we pick $v_2 = 1$, then $\vec{v} = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$. Let us verify that $\vec{v}$ really is an eigenvector corresponding to $\lambda = 3$:

\[ \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 0 \\ 0 & 0 & 2 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 3 \\ 3 \\ 0 \end{bmatrix} = 3 \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} . \]

Yay! It worked.
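The verification step is just a matrix–vector multiplication. A short Python sketch of that check (the helper `matvec` is our own, not from the text):

```python
def matvec(A, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 1, 1],
     [1, 2, 0],
     [0, 0, 2]]
v = [1, 1, 0]   # candidate eigenvector
lam = 3         # candidate eigenvalue

# v is an eigenvector for lam exactly when A v equals lam v
print(matvec(A, v))           # [3, 3, 0]
print([lam * x for x in v])   # [3, 3, 0]
```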

Exercise 3.4.1 (easy): Are eigenvectors unique? Can you find a different eigenvector for $\lambda = 3$ in the example above? How are the two eigenvectors related?

Exercise 3.4.2: When the matrix is $2 \times 2$ you do not need to write down the augmented matrix and do row operations when computing eigenvectors (if you have computed the eigenvalues correctly). Can you see why? Explain. Try it for the matrix $\begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}$.

OK. We have the system of equations

\[ \vec{x}\,' = P \vec{x} . \]

We find the eigenvalues $\lambda_1$, $\lambda_2$, …, $\lambda_n$ of the matrix $P$, and corresponding eigenvectors $\vec{v}_1$, $\vec{v}_2$, …, $\vec{v}_n$. Now we notice that the functions $\vec{v}_1 e^{\lambda_1 t}$, $\vec{v}_2 e^{\lambda_2 t}$, …, $\vec{v}_n e^{\lambda_n t}$ are solutions of the system of equations and hence $\vec{x} = c_1 \vec{v}_1 e^{\lambda_1 t} + c_2 \vec{v}_2 e^{\lambda_2 t} + \cdots + c_n \vec{v}_n e^{\lambda_n t}$ is a solution.

Theorem 3.4.1. Take $\vec{x}\,' = P \vec{x}$. If $P$ is an $n \times n$ constant matrix that has $n$ distinct real eigenvalues $\lambda_1$, $\lambda_2$, …, $\lambda_n$, then there exist $n$ linearly independent corresponding eigenvectors $\vec{v}_1$, $\vec{v}_2$, …, $\vec{v}_n$, and the general solution to $\vec{x}\,' = P \vec{x}$ can be written as

\[ \vec{x} = c_1 \vec{v}_1 e^{\lambda_1 t} + c_2 \vec{v}_2 e^{\lambda_2 t} + \cdots + c_n \vec{v}_n e^{\lambda_n t} . \]

The corresponding fundamental matrix solution is $X(t) = \bigl[\, \vec{v}_1 e^{\lambda_1 t} \;\; \vec{v}_2 e^{\lambda_2 t} \;\; \cdots \;\; \vec{v}_n e^{\lambda_n t} \,\bigr]$. That is, $X(t)$ is the matrix whose $j$th column is $\vec{v}_j e^{\lambda_j t}$.

Example 3.4.4: Consider the system

\[ \vec{x}\,' = \begin{bmatrix} 2 & 1 & 1 \\ 1 & 2 & 0 \\ 0 & 0 & 2 \end{bmatrix} \vec{x} . \]

Find the general solution.

Earlier, we found the eigenvalues are $1, 2, 3$. We found the eigenvector $\begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix}$ for the eigenvalue 3. Similarly we find the eigenvector $\begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix}$ for the eigenvalue 1, and $\begin{bmatrix} 0 \\ 1 \\ -1 \end{bmatrix}$ for the eigenvalue 2 (exercise: check). Hence our general solution is

\[ \vec{x} = c_1 \begin{bmatrix} 1 \\ -1 \\ 0 \end{bmatrix} e^{t} + c_2 \begin{bmatrix} 0 \\ 1 \\ -1 \end{bmatrix} e^{2t} + c_3 \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} e^{3t} = \begin{bmatrix} c_1 e^{t} + c_3 e^{3t} \\ -c_1 e^{t} + c_2 e^{2t} + c_3 e^{3t} \\ -c_2 e^{2t} \end{bmatrix} . \]

In terms of a fundamental matrix solution,

\[ \vec{x} = X(t)\,\vec{c} = \begin{bmatrix} e^{t} & 0 & e^{3t} \\ -e^{t} & e^{2t} & e^{3t} \\ 0 & -e^{2t} & 0 \end{bmatrix} \begin{bmatrix} c_1 \\ c_2 \\ c_3 \end{bmatrix} . \]

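Each building block of such a general solution can be sanity-checked numerically: $\vec{v} e^{\lambda t}$ solves $\vec{x}\,' = P\vec{x}$ exactly when $P\vec{v} = \lambda\vec{v}$, since differentiating only brings down a factor of $\lambda$. A minimal sketch with the eigenpairs from the example (the helper `matvec` is our own):

```python
import math

def matvec(A, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

P = [[2, 1, 1],
     [1, 2, 0],
     [0, 0, 2]]

# eigenpairs (lambda, v) from the example
pairs = [(1, [1, -1, 0]),
         (2, [0, 1, -1]),
         (3, [1, 1, 0])]

# x(t) = v e^{lam t} has derivative lam v e^{lam t}, so it solves
# x' = P x exactly when P v = lam v; check all three pairs
for lam, v in pairs:
    assert matvec(P, v) == [lam * x for x in v]

# sample one solution x(t) = v e^{3t} at t = 0.5
lam, v = pairs[2]
x = [xi * math.exp(lam * 0.5) for xi in v]
print(x)
```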
Note: If we write a homogeneous linear constant coefficient $n$th order equation as a first order system (as we did in § 3.1), then the eigenvalue equation

\[ \det(P - \lambda I) = 0 \]

is essentially the same as the characteristic equation we got in § 2.2 and § 2.3.

A matrix might very well have complex eigenvalues even if all the entries are real. For example, suppose that we have the system

\[ \vec{x}\,' = \begin{bmatrix} 1 & 1 \\ -1 & 1 \end{bmatrix} \vec{x} . \]

Let us compute the eigenvalues of the matrix $P = \begin{bmatrix} 1 & 1 \\ -1 & 1 \end{bmatrix}$:

\[ \det(P - \lambda I) = \det\left( \begin{bmatrix} 1-\lambda & 1 \\ -1 & 1-\lambda \end{bmatrix} \right) = (1-\lambda)^2 + 1 = \lambda^2 - 2\lambda + 2 = 0 . \]

Thus $\lambda = 1 \pm i$. Corresponding eigenvectors are also complex. First take $\lambda = 1 - i$,

\[ \bigl(P - (1-i)I\bigr)\vec{v} = \vec{0} , \qquad \begin{bmatrix} i & 1 \\ -1 & i \end{bmatrix} \vec{v} = \vec{0} . \]

The equations $i v_1 + v_2 = 0$ and $-v_1 + i v_2 = 0$ are multiples of each other. So we only need to consider one of them. After picking $v_2 = 1$, for example, we have an eigenvector $\vec{v} = \begin{bmatrix} i \\ 1 \end{bmatrix}$. In similar fashion we find that $\begin{bmatrix} -i \\ 1 \end{bmatrix}$ is an eigenvector corresponding to the eigenvalue $1 + i$. We could write the solution as

\[ \vec{x} = c_1 \begin{bmatrix} i \\ 1 \end{bmatrix} e^{(1-i)t} + c_2 \begin{bmatrix} -i \\ 1 \end{bmatrix} e^{(1+i)t} . \]

We would then need to look for complex values $c_1$ and $c_2$ to solve any initial conditions. It is perhaps not completely clear that we get a real solution. We could use Euler's formula and do the whole song and dance we did before, but we will not. We will do something a bit smarter first.

We claim that we did not have to look for a second eigenvector (nor for the second eigenvalue). All complex eigenvalues come in pairs (because the matrix is real).

First a small side note. The real part of a complex number $z$ can be computed as $\frac{z + \bar{z}}{2}$, where the bar above $z$ means $\overline{a + ib} = a - ib$. This operation is called the complex conjugate. If $a$ is a real number, then $\bar{a} = a$. Similarly we can bar whole vectors or matrices by taking the complex conjugate of every entry. If a matrix $P$ is real, then $\bar{P} = P$. We note that $\overline{P\vec{x}} = \bar{P}\,\bar{\vec{x}} = P\bar{\vec{x}}$. Therefore,

\[ \overline{(P - \lambda I)\vec{v}} = (P - \bar{\lambda} I)\bar{\vec{v}} . \]

So if $\lambda$ is an eigenvalue, then so is $\bar{\lambda}$. And if $\vec{v}$ is an eigenvector corresponding to the eigenvalue $\lambda$, then $\bar{\vec{v}}$ is an eigenvector corresponding to the eigenvalue $\bar{\lambda}$.
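This conjugation argument is easy to check numerically for the $2 \times 2$ example above: conjugating the eigenpair for $1 - i$ gives the eigenpair for $1 + i$. A quick sketch (the helper `matvec` is our own):

```python
def matvec(A, v):
    """Multiply a matrix (given as a list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

P = [[1, 1],
     [-1, 1]]      # the real matrix from the example
lam = 1 - 1j
v = [1j, 1]        # eigenvector for eigenvalue 1 - i

# P v = lam v ...
assert matvec(P, v) == [lam * x for x in v]

# ... and conjugating both lam and v gives the other eigenpair
lam_bar = lam.conjugate()
v_bar = [x.conjugate() for x in v]
assert matvec(P, v_bar) == [lam_bar * x for x in v_bar]
print(lam_bar, v_bar)   # (1+1j) [-1j, 1]
```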

Suppose $a + ib$ is a complex eigenvalue of $P$, and $\vec{v}$ is a corresponding eigenvector. Then

\[ \vec{x}_1 = \vec{v}\, e^{(a+ib)t} \]

is a solution (complex-valued) of $\vec{x}\,' = P\vec{x}$. Euler's formula shows that $\overline{e^{(a+ib)t}} = e^{(a-ib)t}$, and so

\[ \vec{x}_2 = \overline{\vec{x}_1} = \bar{\vec{v}}\, e^{(a-ib)t} \]

is also a solution. As $\vec{x}_1$ and $\vec{x}_2$ are solutions, the function

\[ \vec{x}_3 = \operatorname{Re} \vec{x}_1 = \frac{\vec{x}_1 + \overline{\vec{x}_1}}{2} = \frac{\vec{x}_1 + \vec{x}_2}{2} \]

is also a solution. And $\vec{x}_3$ is real-valued! Similarly as $\operatorname{Im} z = \frac{z - \bar{z}}{2i}$ is the imaginary part, we find that

\[ \vec{x}_4 = \operatorname{Im} \vec{x}_1 = \frac{\vec{x}_1 - \overline{\vec{x}_1}}{2i} = \frac{\vec{x}_1 - \vec{x}_2}{2i} \]

is also a real-valued solution. It turns out that $\vec{x}_3$ and $\vec{x}_4$ are linearly independent. We will use Euler's formula to separate out the real and imaginary part.

Returning to our problem,

\[ \vec{x}_1 = \begin{bmatrix} i \\ 1 \end{bmatrix} e^{(1-i)t} = \begin{bmatrix} i \\ 1 \end{bmatrix} e^{t} \bigl( \cos t - i \sin t \bigr) = \begin{bmatrix} e^{t} \sin t \\ e^{t} \cos t \end{bmatrix} + i \begin{bmatrix} e^{t} \cos t \\ -e^{t} \sin t \end{bmatrix} . \]

Then

\[ \operatorname{Re} \vec{x}_1 = \begin{bmatrix} e^{t} \sin t \\ e^{t} \cos t \end{bmatrix} \qquad \text{and} \qquad \operatorname{Im} \vec{x}_1 = \begin{bmatrix} e^{t} \cos t \\ -e^{t} \sin t \end{bmatrix} \]

are the two real-valued linearly independent solutions we seek. The general solution is

\[ \vec{x} = c_1 \begin{bmatrix} e^{t} \sin t \\ e^{t} \cos t \end{bmatrix} + c_2 \begin{bmatrix} e^{t} \cos t \\ -e^{t} \sin t \end{bmatrix} = \begin{bmatrix} c_1 e^{t} \sin t + c_2 e^{t} \cos t \\ c_1 e^{t} \cos t - c_2 e^{t} \sin t \end{bmatrix} . \]

This solution is real-valued for real $c_1$ and $c_2$. Now we can solve for any initial conditions that we may have.
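As a numerical check (not part of the derivation), we can compare a finite-difference derivative of the real solution $\operatorname{Re}\vec{x}_1 = e^{t}\begin{bmatrix}\sin t\\ \cos t\end{bmatrix}$ against $P$ applied to it; the two should agree up to finite-difference error. The helper names below are our own:

```python
import math

def x3(t):
    """The real-valued solution Re x1 = e^t [sin t, cos t]."""
    return [math.exp(t) * math.sin(t), math.exp(t) * math.cos(t)]

def rhs(t):
    """Right-hand side P x3(t) for P = [[1, 1], [-1, 1]]."""
    a, b = x3(t)
    return [a + b, -a + b]

def deriv(t, h=1e-6):
    """Centered finite-difference approximation of x3'(t)."""
    ahead, behind = x3(t + h), x3(t - h)
    return [(p - q) / (2 * h) for p, q in zip(ahead, behind)]

# x3' should agree with P x3 at any t, up to O(h^2) error
t = 0.7
print(deriv(t))
print(rhs(t))
assert all(abs(d - r) < 1e-5 for d, r in zip(deriv(t), rhs(t)))
```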

Let us summarize as a theorem.

Theorem 3.4.2. Let $P$ be a real-valued constant matrix. If $P$ has a complex eigenvalue $a + ib$ and a corresponding eigenvector $\vec{v}$, then $P$ also has a complex eigenvalue $a - ib$ with a corresponding eigenvector $\bar{\vec{v}}$. Furthermore, $\vec{x}\,' = P\vec{x}$ has two linearly independent real-valued solutions

\[ \vec{x}_1 = \operatorname{Re} \vec{v}\, e^{(a+ib)t} \qquad \text{and} \qquad \vec{x}_2 = \operatorname{Im} \vec{v}\, e^{(a+ib)t} . \]

For each pair of complex eigenvalues $a + ib$ and $a - ib$, we get two real-valued linearly independent solutions. We then go on to the next eigenvalue, which is either a real eigenvalue or another complex eigenvalue pair. If we have $n$ distinct eigenvalues (real or complex), then we end up with $n$ linearly independent solutions.
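One concrete way to confirm that $n$ solutions are linearly independent is to evaluate the fundamental matrix at $t = 0$ and compute its determinant: a nonzero determinant means the columns (hence the solutions) are independent. A sketch for the $3 \times 3$ example from earlier in this section (the helper `det3` is our own):

```python
def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    (a, b, c), (d, e, f), (g, h, i) = M
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# X(0) for the 3x3 example: at t = 0 each exponential is 1, so the
# columns are just the eigenvectors [1,-1,0], [0,1,-1], [1,1,0]
X0 = [[1, 0, 1],
      [-1, 1, 1],
      [0, -1, 0]]

# nonzero determinant: the three solutions are linearly independent
print(det3(X0))   # 2
```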

We can now find a real-valued general solution to any homogeneous system where the matrix has distinct eigenvalues. When we have repeated eigenvalues, matters get a bit more complicated and we will look at that situation in § 3.7.

Exercise 3.4.5 (easy): Let $A$ be a $3 \times 3$ matrix with an eigenvalue of 3 and a corresponding eigenvector $\vec{v} = \begin{bmatrix} 1 \\ -1 \\ 3 \end{bmatrix}$. Find $A\vec{v}$.

Exercise 3.4.6: a) Find the general solution of $x_1' = 2x_1$, $x_2' = 3x_2$ using the eigenvalue method (first write the system in the form $\vec{x}\,' = A\vec{x}$). b) Solve the system by solving each equation separately and verify you get the same general solution.