Section 3.3 Linear systems of ODEs

Note: less than 1 lecture, second part of §5.1 in [EP], §7.4 in [BD]
First let us talk about matrix- or vector-valued functions. Such a function is just a matrix or vector whose entries depend on some variable. If \(t\) is the independent variable, we write a vector-valued function \(\vec{x}(t)\) as
\begin{equation*} \vec{x}(t) = \begin{bmatrix} x_1(t) \\ x_2(t) \\ \vdots \\ x_n(t) \end{bmatrix} . \end{equation*}
Similarly a matrix-valued function \(A(t)\) is
\begin{equation*} A(t) = \begin{bmatrix} a_{11}(t) & a_{12}(t) & \cdots & a_{1n}(t) \\ a_{21}(t) & a_{22}(t) & \cdots & a_{2n}(t) \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1}(t) & a_{n2}(t) & \cdots & a_{nn}(t) \end{bmatrix} . \end{equation*}
The derivative \(A'(t)\) or \(\frac{dA}{dt}\) is just the matrix-valued function whose \(ij^{\text{th}}\) entry is \(a_{ij}'(t)\text{.}\)
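For example, if
\begin{equation*} A(t) = \begin{bmatrix} t^2 & \sin t \\ e^t & 3 \end{bmatrix} , \qquad \text{then} \qquad A'(t) = \begin{bmatrix} 2t & \cos t \\ e^t & 0 \end{bmatrix} . \end{equation*}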
Rules of differentiation of matrix-valued functions are similar to the rules for normal functions. Let \(A(t)\) and \(B(t)\) be matrix-valued functions. Let \(c\) be a scalar and let \(C\) be a constant matrix. Then
\begin{equation*} \begin{aligned} \bigl(A(t)+B(t)\bigr)' & = A'(t) + B'(t), \\ \bigl(A(t)B(t)\bigr)' & = A'(t)B(t) + A(t)B'(t), \\ \bigl(cA(t)\bigr)' & = cA'(t), \\ \bigl(CA(t)\bigr)' & = CA'(t), \\ \bigl(A(t)\,C\bigr)' & = A'(t)\,C . \end{aligned} \end{equation*}
Note the order of the multiplication in the last two expressions.
A first order linear system of ODEs is a system that can be written as the vector equation
\begin{equation*} {\vec{x}}'(t) = P(t)\vec{x}(t) + \vec{f}(t), \end{equation*}
where \(P(t)\) is a matrix-valued function, and \(\vec{x}(t)\) and \(\vec{f}(t)\) are vector-valued functions. We will often suppress the dependence on \(t\) and only write \({\vec{x}}' = P\vec{x} + \vec{f}\text{.}\) A solution of the system is a vector-valued function \(\vec{x}\) satisfying the vector equation.
For example, the equations
\begin{equation*} \begin{aligned} x_1' &= 2t x_1 + e^t x_2 + t^2 , \\ x_2' &= \frac{x_1}{t} -x_2 + e^t , \end{aligned} \end{equation*}
can be written as
\begin{equation*} {\vec{x}}' = \begin{bmatrix} 2t & e^t \\ \nicefrac{1}{t} & -1 \end{bmatrix} \vec{x} + \begin{bmatrix} t^2 \\ e^t \end{bmatrix} . \end{equation*}
We will mostly concentrate on equations that are not just linear, but are in fact constant coefficient equations. That is, the matrix \(P\) will be constant; it will not depend on \(t\text{.}\)
When \(\vec{f} = \vec{0}\) (the zero vector), then we say the system is homogeneous. For homogeneous linear systems we have the principle of superposition, just like for single homogeneous equations.
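That is, if \(\vec{x}_1\) and \(\vec{x}_2\) are solutions of the homogeneous system \({\vec{x}}' = P\vec{x}\text{,}\) then so is any linear combination, since
\begin{equation*} {\bigl(c_1 \vec{x}_1 + c_2 \vec{x}_2\bigr)}' = c_1 {\vec{x}_1}' + c_2 {\vec{x}_2}' = c_1 P \vec{x}_1 + c_2 P \vec{x}_2 = P \bigl( c_1 \vec{x}_1 + c_2 \vec{x}_2 \bigr) . \end{equation*}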
Linear independence for vector-valued functions is the same idea as for normal functions. The vector-valued functions \(\vec{x}_1,\vec{x}_2,\ldots,\vec{x}_n\) are linearly independent when
\begin{equation*} c_1 \vec{x}_1 + c_2 \vec{x}_2 + \cdots + c_n \vec{x}_n = \vec{0} \end{equation*}
has only the solution \(c_1 = c_2 = \cdots = c_n = 0\text{,}\) where the equation must hold for all \(t\text{.}\)

Example 3.3.1.

\(\vec{x}_1 = \Bigl[ \begin{smallmatrix} t^2 \\ t \end{smallmatrix} \Bigr]\text{,}\) \(\vec{x}_2 = \Bigl[ \begin{smallmatrix} 0 \\ 1+t \end{smallmatrix} \Bigr]\text{,}\) \(\vec{x}_3 = \Bigl[ \begin{smallmatrix} -t^2 \\ 1 \end{smallmatrix} \Bigr]\) are linearly dependent because \(\vec{x}_1 + \vec{x}_3 = \vec{x}_2\text{,}\) and this holds for all \(t\text{.}\) So \(c_1 = 1\text{,}\) \(c_2 = -1\text{,}\) and \(c_3 = 1\) above will work.
On the other hand, if we change the example just slightly to \(\vec{x}_1 = \Bigl[ \begin{smallmatrix} t^2 \\ t \end{smallmatrix} \Bigr]\text{,}\) \(\vec{x}_2 = \Bigl[ \begin{smallmatrix} 0 \\ t \end{smallmatrix} \Bigr]\text{,}\) \(\vec{x}_3 = \Bigl[ \begin{smallmatrix} -t^2 \\ 1 \end{smallmatrix} \Bigr]\text{,}\) then the functions are linearly independent. First write \(c_1 \vec{x}_1 + c_2 \vec{x}_2 + c_3 \vec{x}_3 = \vec{0}\) and note that it has to hold for all \(t\text{.}\) We get that
\begin{equation*} c_1 \vec{x}_1 + c_2 \vec{x}_2 + c_3 \vec{x}_3 = \begin{bmatrix} c_1 t^2 - c_3 t^2 \\ c_1 t + c_2 t + c_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} . \end{equation*}
In other words \(c_1 t^2 - c_3 t^2 = 0\) and \(c_1 t + c_2 t + c_3 = 0\text{.}\) If we set \(t = 0\text{,}\) then the second equation becomes \(c_3 = 0\text{.}\) But then the first equation becomes \(c_1 t^2 = 0\) for all \(t\) and so \(c_1 = 0\text{.}\) Thus the second equation is just \(c_2 t = 0\text{,}\) which means \(c_2 = 0\text{.}\) So \(c_1 = c_2 = c_3 = 0\) is the only solution and \(\vec{x}_1\text{,}\) \(\vec{x}_2\text{,}\) and \(\vec{x}_3\) are linearly independent.
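Notice the trick used in this example: if plugging in a single value \(t_0\) produces linearly independent constant vectors, then the vector-valued functions are linearly independent. The converse direction requires care: at \(t = 0\) the three vectors above are \(\Bigl[ \begin{smallmatrix} 0 \\ 0 \end{smallmatrix} \Bigr]\text{,}\) \(\Bigl[ \begin{smallmatrix} 0 \\ 0 \end{smallmatrix} \Bigr]\text{,}\) \(\Bigl[ \begin{smallmatrix} 0 \\ 1 \end{smallmatrix} \Bigr]\text{,}\) which are linearly dependent as constant vectors, even though the functions are linearly independent.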
The linear combination \(c_1 \vec{x}_1 + c_2 \vec{x}_2 + \cdots + c_n \vec{x}_n\) can always be written as
\begin{equation*} X(t)\,\vec{c} , \end{equation*}
where \(X(t)\) is the matrix with columns \(\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_n\text{,}\) and \(\vec{c}\) is the column vector with entries \(c_1, c_2, \ldots, c_n\text{.}\) Assuming that \(\vec{x}_1,\vec{x}_2,\ldots,\vec{x}_n\) are linearly independent solutions of the homogeneous system, the matrix-valued function \(X(t)\) is called a fundamental matrix, or a fundamental matrix solution.
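Because each column of a fundamental matrix solution solves \({\vec{x}}' = P\vec{x}\text{,}\) the matrix itself satisfies the matrix-valued equation
\begin{equation*} X'(t) = P(t)\,X(t) , \end{equation*}
and the general solution of the homogeneous system is \(\vec{x} = X(t)\,\vec{c}\) for an arbitrary constant vector \(\vec{c}\text{.}\)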
To solve a nonhomogeneous first order linear system, we use the same technique as for a single linear nonhomogeneous equation: we find a particular solution to the nonhomogeneous equation, find the general solution to the associated homogeneous equation, and add the two together.
Alright, suppose you have found the general solution of \({\vec{x}}' = P\vec{x} + \vec{f}\text{.}\) Next suppose you are given an initial condition of the form
\begin{equation*} \vec{x}(t_0) = \vec{b} \end{equation*}
for some fixed \(t_0\) and a constant vector \(\vec{b}\text{.}\) Let \(X(t)\) be a fundamental matrix solution of the associated homogeneous equation (i.e. columns of \(X(t)\) are solutions). The general solution can be written as
\begin{equation*} \vec{x}(t) = X(t)\,\vec{c} + \vec{x}_p(t). \end{equation*}
We are seeking a vector \(\vec{c}\) such that
\begin{equation*} \vec{b} = \vec{x}(t_0) = X(t_0)\,\vec{c} + \vec{x}_p(t_0). \end{equation*}
In other words, we are solving for \(\vec{c}\) in the nonhomogeneous system of linear equations
\begin{equation*} X(t_0)\,\vec{c} = \vec{b} - \vec{x}_p(t_0) . \end{equation*}
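Since the columns of \(X(t)\) are linearly independent solutions, the matrix \(X(t_0)\) is invertible (linearly independent solutions of a linear system remain linearly independent at every fixed \(t\)), so we can write the answer explicitly as
\begin{equation*} \vec{c} = {X(t_0)}^{-1} \bigl( \vec{b} - \vec{x}_p(t_0) \bigr) . \end{equation*}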

Example 3.3.2.

In Section 3.1 we solved the system
\begin{equation*} \begin{aligned} x_1' & = x_1 , \\ x_2' & = x_1 - x_2 , \end{aligned} \end{equation*}
with initial conditions \(x_1(0) = 1\text{,}\) \(x_2(0) = 2\text{.}\) Let us consider this problem in the language of this section.
The system is homogeneous, so \(\vec{f}(t) = \vec{0}\text{.}\) We write the system and the initial conditions as
\begin{equation*} {\vec{x}}' = \begin{bmatrix} 1 & 0 \\ 1 & -1 \end{bmatrix} \vec{x} , \qquad \vec{x}(0) = \begin{bmatrix} 1 \\ 2 \end{bmatrix} . \end{equation*}
We found the general solution is \(x_1 = c_1 e^t \) and \(x_2 = \frac{c_1}{2}e^{t} + c_2e^{-t}\text{.}\) Letting \(c_1=1\) and \(c_2=0\text{,}\) we obtain the solution \(\left[ \begin{smallmatrix} e^t \\ (1/2) e^t \end{smallmatrix} \right]\text{.}\) Letting \(c_1=0\) and \(c_2=1\text{,}\) we obtain \(\left[ \begin{smallmatrix} 0 \\ e^{-t} \end{smallmatrix} \right]\text{.}\) These two solutions are linearly independent, as can be seen by setting \(t=0\text{,}\) and noting that the resulting constant vectors are linearly independent. In matrix notation, a fundamental matrix solution is, therefore,
\begin{equation*} X(t) = \begin{bmatrix} e^t & 0 \\ \frac{1}{2} e^t & e^{-t} \end{bmatrix} . \end{equation*}
To solve the initial value problem we solve for \(\vec{c}\) in the equation
\begin{equation*} X(0)\,\vec{c} = \vec{b} , \end{equation*}
or in other words,
\begin{equation*} \begin{bmatrix} 1 & 0 \\ \frac{1}{2} & 1 \end{bmatrix} \vec{c} = \begin{bmatrix} 1 \\ 2 \end{bmatrix} . \end{equation*}
A single elementary row operation shows \(\vec{c} = \left[ \begin{smallmatrix} 1 \\ 3/2 \end{smallmatrix} \right]\text{.}\) Our solution is
\begin{equation*} \vec{x}(t) = X(t)\,\vec{c} = \begin{bmatrix} e^t & 0 \\ \frac{1}{2} e^t & e^{-t} \end{bmatrix} \begin{bmatrix} 1 \\ \frac{3}{2} \end{bmatrix} = \begin{bmatrix} e^t \\ \frac{1}{2} e^t + \frac{3}{2} e^{-t} \end{bmatrix} . \end{equation*}
This new solution agrees with our previous solution from Section 3.1.
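A short Python sketch, assuming NumPy and SciPy are available (an illustration added here, not part of the text's method), can confirm this numerically by integrating the system with scipy.integrate.solve_ivp and comparing against the closed-form solution:

import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, x):
    # Right-hand side of x' = P x with P = [[1, 0], [1, -1]].
    return [x[0], x[0] - x[1]]

# Integrate from the initial condition x(0) = (1, 2).
t_eval = np.linspace(0, 2, 9)
sol = solve_ivp(rhs, (0, 2), [1, 2], t_eval=t_eval, rtol=1e-8, atol=1e-10)

# Closed-form solution found above: x1 = e^t, x2 = (1/2) e^t + (3/2) e^{-t}.
exact = np.vstack([np.exp(t_eval),
                   0.5 * np.exp(t_eval) + 1.5 * np.exp(-t_eval)])
print(np.max(np.abs(sol.y - exact)))  # a small number, on the order of the integrator tolerance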

Subsection 3.3.1 Exercises

Exercise 3.3.1.

Write the system \(x_1' = 2 x_1 - 3t x_2 + \sin t\text{,}\) \(x_2' = e^t x_1 + 3 x_2 + \cos t\) in the form \({\vec{x}}' = P(t) \vec{x} + \vec{f}(t)\text{.}\)

Exercise 3.3.2.

  1. Verify that the system \({\vec{x}}' = \left[ \begin{smallmatrix} 1 & 3 \\ 3 & 1 \end{smallmatrix} \right] \vec{x}\) has the two solutions \(\left[ \begin{smallmatrix} 1 \\ 1 \end{smallmatrix} \right] e^{4t}\) and \(\left[ \begin{smallmatrix} 1 \\ -1 \end{smallmatrix} \right] e^{-2t}\text{.}\)
  2. Write down the general solution.
  3. Write down the general solution in the form \(x_1 = ?\text{,}\) \(x_2 = ?\) (i.e. write down a formula for each element of the solution).

Exercise 3.3.3.

Verify that \(\left[ \begin{smallmatrix} 1 \\ 1 \end{smallmatrix} \right] e^{t}\) and \(\left[ \begin{smallmatrix} 1 \\ -1 \end{smallmatrix} \right] e^{t}\) are linearly independent. Hint: Just plug in \(t=0\text{.}\)

Exercise 3.3.4.

Verify that \(\left[ \begin{smallmatrix} 1 \\ 1 \\ 0 \end{smallmatrix} \right] e^{t}\) and \(\left[ \begin{smallmatrix} 1 \\ -1 \\ 1 \end{smallmatrix} \right] e^{t}\) and \(\left[ \begin{smallmatrix} 1 \\ -1 \\ 1 \end{smallmatrix} \right] e^{2t}\) are linearly independent. Hint: You must be a bit more tricky than in the previous exercise.

Exercise 3.3.5.

Verify that \(\left[ \begin{smallmatrix} t \\ t^2 \end{smallmatrix} \right]\) and \(\left[ \begin{smallmatrix} t^3 \\ t^4 \end{smallmatrix} \right]\) are linearly independent.

Exercise 3.3.6.

Take the system \(x_1' + x_2' = x_1\text{,}\) \(x_1' - x_2' = x_2\text{.}\)
  1. Write it in the form \(A {\vec{x}}' = B \vec{x}\) for matrices \(A\) and \(B\text{.}\)
  2. Compute \(A^{-1}\) and use that to write the system in the form \({\vec{x}}' = P \vec{x}\text{.}\)

Exercise 3.3.101.

Are \(\left[ \begin{smallmatrix} e^{2t} \\ e^t \end{smallmatrix}\right]\) and \(\left[ \begin{smallmatrix} e^{t} \\ e^{2t} \end{smallmatrix}\right]\) linearly independent? Justify.
Answer.
Yes.

Exercise 3.3.102.

Are \(\left[ \begin{smallmatrix} \cosh(t) \\ 1 \end{smallmatrix}\right]\text{,}\) \(\left[ \begin{smallmatrix} e^{t} \\ 1 \end{smallmatrix}\right]\text{,}\) and \(\left[ \begin{smallmatrix} e^{-t} \\ 1 \end{smallmatrix}\right]\) linearly independent? Justify.
Answer.
No. \(2 \left[ \begin{smallmatrix} \cosh(t) \\ 1 \end{smallmatrix}\right] - \left[ \begin{smallmatrix} e^{t} \\ 1 \end{smallmatrix}\right] - \left[ \begin{smallmatrix} e^{-t} \\ 1 \end{smallmatrix}\right] = \vec{0}\)

Exercise 3.3.103.

Write \(x'=3x-y+e^t\text{,}\) \(y'=tx\) in matrix notation.
Answer.
\(\left[ \begin{smallmatrix} x \\ y \end{smallmatrix}\right] ' = \left[ \begin{smallmatrix} 3 & -1 \\ t & 0 \end{smallmatrix}\right] \left[ \begin{smallmatrix} x \\ y \end{smallmatrix}\right] + \left[ \begin{smallmatrix} e^{t} \\ 0 \end{smallmatrix}\right]\)

Exercise 3.3.104.

  1. Write \(x_1'=2tx_2\text{,}\) \(x_2'=2tx_2\) in matrix notation.
  2. Solve and write the solution in matrix notation.
Answer.
a) \(\vec{x}\,' = \left[ \begin{smallmatrix} 0 & 2t \\ 0 & 2t \end{smallmatrix}\right] \vec{x}\)     b) \(\vec{x} = \left[ \begin{smallmatrix} C_2 e^{t^2} + C_1 \\ C_2 e^{t^2} \end{smallmatrix}\right]\)