-
1. Introduction
In this unit we look at calculating eigenvalues and eigenvectors of $2 \times 2$ and $3 \times 3$ matrices before investigating their role in the diagonalisation of square matrices. The diagonalisation process is then applied to demonstrate how algebraic techniques are used to solve linear systems of ordinary differential equations.
-
2. Eigenvalues and eigenvectors of a $2 \times 2$ matrix
Eigenvalues and eigenvectors have many important applications in science and engineering including solving systems of differential equations, stability analysis, vibration analysis and modelling population dynamics.
Let $A$ be an $n \times n$ matrix. An eigenvalue of $A$ is a scalar $\lambda$ (real or complex) such that
$A\boldsymbol{x} = \lambda \boldsymbol{x}$(I)
for some non-zero vector $\boldsymbol{x}$. In this case, we call the vector $\boldsymbol{x}$ an eigenvector of $A$ corresponding to $\lambda$. Geometrically, Eq. (I) means that the vectors $A\boldsymbol{x}$ and $\boldsymbol{x}$ are parallel. The value of $\lambda$ determines what happens to $\boldsymbol{x}$ when it is multiplied by $A$, i.e. whether it is stretched or shrunk, and whether its direction is preserved or reversed.
Example 1
If $A = \pmatrix{1 & 3 \\ 4 & 2}$ and $\boldsymbol{x} = \pmatrix{3 \\ 4}$, then $A\boldsymbol{x} = \pmatrix{15 \\ 20}$ $= 5\pmatrix{3 \\ 4}$.
Here we have that $A\boldsymbol{x} = 5 \boldsymbol{x}$ and so we say that $\boldsymbol{x} = \pmatrix{3 \\ 4}$ is an eigenvector of $A$ corresponding to the eigenvalue $\lambda = 5$ .
The geometric effect in this example is that the vector $\boldsymbol{x}$ has been stretched by a factor of 5 but its direction remains unchanged as $\lambda = 5$.
Note that any scalar multiple of the vector $\boldsymbol{x} = \pmatrix{3 \\ 4}$ is an eigenvector corresponding to the eigenvalue $\lambda = 5$.
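The check in Example 1 is easy to reproduce in a few lines of Python (a minimal sketch using plain lists, no libraries):

```python
A = [[1, 3],
     [4, 2]]
x = [3, 4]

# Hand-rolled 2x2 matrix-vector product
Ax = [A[0][0]*x[0] + A[0][1]*x[1],
      A[1][0]*x[0] + A[1][1]*x[1]]

assert Ax == [15, 20]          # A x
assert Ax == [5*c for c in x]  # A x = 5 x, so x is an eigenvector with eigenvalue 5
```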
End of Example 1
What are eigenvalues and eigenvectors?
📹 What are Eigenvalues and Eigenvectors?
📹 Introduction to Eigenvalues and Eigenvectors - Part 1
2.1. Calculation of eigenvalues
If $A$ is a $2 \times 2$ matrix it is relatively straightforward to calculate its eigenvalues and eigenvectors by hand. So, how do we calculate them?
We know that $\boldsymbol{x} = I \boldsymbol{x}$, where $I$ is the identity matrix, so we can rewrite Eq. (I) as
$A\boldsymbol{x} = \lambda I \boldsymbol{x}$
$A\boldsymbol{x} - \lambda I \boldsymbol{x} = \boldsymbol{0}$
$(A - \lambda I)\boldsymbol{x} = \boldsymbol{0}$
If the matrix $(A - \lambda I)$ is invertible, i.e. $\text{det}(A - \lambda I) \neq 0$, then the only solution to the above equation is the zero vector, i.e. $\boldsymbol{x} = \boldsymbol{0}$. We are not interested in this case as an eigenvector must be non-zero.
The equation $(A - \lambda I)\boldsymbol{x} = \boldsymbol{0}$ can only hold for a non-zero vector $\boldsymbol{x}$ if the matrix $(A - \lambda I)$ is singular (does not have an inverse). Hence, the eigenvalues of $A$ are the numbers $\lambda$ for which the matrix $(A - \lambda I)$ does not have an inverse.
In other words the numbers $\lambda$ satisfy the equation
$\text{det}(A - \lambda I) = 0$ (II)
and they can be real or complex.
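For a $2 \times 2$ matrix, expanding $\text{det}(A - \lambda I)$ always gives $\lambda^2 - \text{tr}(A)\lambda + \det(A)$. The helper below (our own sketch, not from the text; it handles the real-eigenvalue case only) solves this quadratic directly:

```python
import math

def eig2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] from the characteristic equation
    det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A) = 0
    (real-root case only)."""
    tr = a + d            # trace
    det = a*d - b*c       # determinant
    disc = math.sqrt(tr*tr - 4*det)   # assumes a non-negative discriminant
    return (tr - disc) / 2, (tr + disc) / 2

print(eig2(1, 3, 4, 2))   # (-2.0, 5.0) -- the matrix from Example 1
```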
2.1.1. Real distinct eigenvalues
We firstly look at the case where an $n \times n$ matrix has $n$ distinct eigenvalues.
Example 2
Find the eigenvalues of the following matrices:
(i). $A = \pmatrix{5 & -2 \\ 7 & -4}$ (ii). $B = \pmatrix{13 & -4 \\ -4 & \;\;\;7}$ (iii). $C = \pmatrix{0 & 2 \\ 2 & 0}$
Solution
(i). $A - \lambda I = \pmatrix{5 & -2 \\ 7 & -4} - \lambda \pmatrix{1 & 0 \\ 0 & 1} = \pmatrix{5 - \lambda & -2 \\ \;\;7 & -4-\lambda}$.
Hence, $\text{det}(A - \lambda I) = \bigg | \matrix{5 - \lambda & -2 \\ \;\;7 & -4-\lambda} \bigg | = (5 - \lambda)(-4-\lambda) - (-2)(7) = \lambda^2 - \lambda - 6$.
We call $\lambda^2 - \lambda - 6$ the characteristic polynomial of the matrix $A$.
The eigenvalues of $A$ are the roots of the characteristic equation $\text{det}(A - \lambda I) = 0$, i.e.
$\lambda^2 - \lambda - 6 = 0 \implies (\lambda - 3)(\lambda + 2) = 0 \implies \lambda = -2$ and $\lambda = 3$.
Hence, $\lambda_1 = -2$ and $\lambda_2 = 3$ are the eigenvalues of the matrix $A$.
(ii). $B - \lambda I = \pmatrix{\;13 & -4 \\ -4 & \;\;\;7 } - \lambda \pmatrix{1 & 0 \\ 0 & 1} = \pmatrix{13-\lambda & -4 \\ -4 & 7-\lambda}$
Hence, $\text{det}(B - \lambda I) = \bigg | \matrix{13-\lambda & -4 \\ -4 & \;7-\lambda} \bigg | = (13-\lambda)(7-\lambda) - (-4)(-4) = \lambda^2 - 20 \lambda + 75$.
Now solve $\det(B - \lambda I) = 0$ to find the eigenvalues of $B$, i.e.
$\lambda^2 - 20 \lambda + 75 = 0 \implies (\lambda - 5)(\lambda - 15) = 0 \implies \lambda = 5$ and $\lambda = 15$.
Hence, $\lambda_1 = 5$ and $\lambda_2 = 15$ are the eigenvalues of the matrix $B$.
(iii). $C - \lambda I = \pmatrix{0 & 2 \\2 & 0} - \lambda \pmatrix{1 & 0 \\ 0 & 1} = \pmatrix{-\lambda & \;\;2 \\ \;\;2 & -\lambda}$
Hence, $\text{det}(C - \lambda I) = \bigg | \matrix{-\lambda & \;\;2 \\ \;\;2 & -\lambda} \bigg | = (-\lambda)(-\lambda) - (2)(2) = \lambda^2 - 4$.
The eigenvalues of $C$ satisfy $\text{det}(C - \lambda I) = 0$, i.e. $\lambda^2 - 4 = 0 \implies \lambda = \pm 2$.
Hence, $\lambda_1 = -2$ and $\lambda_2 = 2$ are the eigenvalues of the matrix $C$.
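These hand calculations can be cross-checked with NumPy (assuming it is available); `np.linalg.eigvals` returns the roots of $\text{det}(M - \lambda I) = 0$:

```python
import numpy as np

A = np.array([[5, -2], [7, -4]])
B = np.array([[13, -4], [-4, 7]])
C = np.array([[0, 2], [2, 0]])

# Expected eigenvalues from Example 2, in ascending order
expected = {"A": [-2, 3], "B": [5, 15], "C": [-2, 2]}
for name, M in [("A", A), ("B", B), ("C", C)]:
    assert np.allclose(sorted(np.linalg.eigvals(M).real), expected[name])
```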
End of Example 2
The following example demonstrates a short-cut approach that can be adopted when calculating the eigenvalues of specific types of matrices.
Example 3
Find the eigenvalues of the following matrices:
(i). $A = \pmatrix{5 & 0 \\ 0 & 8}$ (ii). $B = \pmatrix{3 & \;\;7 \\ 0 & -4}$ (iii). $C = \pmatrix{1 & 0 \\ 3 & 2}$.
In this example we note that:
- matrix $A$ is a diagonal matrix (see Section 4.4) and has the property that all of its entries not on the main diagonal are $0$.
- matrix $B$ is an upper-triangular matrix (see Section 4.5) and has the property that all of its entries below the main diagonal are $0$.
- matrix $C$ is a lower-triangular matrix (see Section 4.5) and has the property that all of its entries above the main diagonal are $0$.
Note that, in each case, some of the entries on the main diagonal can be zero.
Solution
In all three cases (diagonal, upper-triangular and lower-triangular) the eigenvalues are simply the entries on the main diagonal and so we can just read them off without the need for any calculations. Hence,
- The eigenvalues of matrix $A$ are: $\lambda_1 = 5,\; \lambda_2 = 8$.
- The eigenvalues of matrix $B$ are: $\lambda_1 = 3,\; \lambda_2 = -4$.
- The eigenvalues of matrix $C$ are: $\lambda_1 = 1, \; \lambda_2 = 2$.
We verify our answers using the method described earlier.
(i). Solving $\text{det}(A - \lambda I) = 0$ gives $\bigg | \matrix{5-\lambda & \;\,0 \\ \;\,0 & 8-\lambda} \bigg| = 0 \implies (5-\lambda)(8-\lambda) = 0 \implies \lambda_1 = 5, \lambda_2 = 8$.
(ii). Solving $\text{det}(B - \lambda I) = 0$ gives $\bigg | \matrix{3-\lambda & \;\,7 \\ \;\,0 & -4-\lambda} \bigg| = 0 \implies (3-\lambda)(-4-\lambda) = 0 \implies \lambda_1 = 3, \lambda_2 = -4$.
(iii). Solving $\text{det}(C - \lambda I) = 0$ gives $\bigg | \matrix{1-\lambda & \;\,0 \\ \;\,3 & 2-\lambda} \bigg| = 0 \implies (1-\lambda)(2-\lambda) = 0 \implies \lambda_1 = 1, \lambda_2 = 2$.
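The short-cut generalises: for any triangular (or diagonal) matrix the eigenvalues are exactly the main-diagonal entries. A quick confirmation, assuming NumPy is available:

```python
import numpy as np

matrices = [np.array([[5, 0], [0, 8]]),    # diagonal (matrix A)
            np.array([[3, 7], [0, -4]]),   # upper-triangular (matrix B)
            np.array([[1, 0], [3, 2]])]    # lower-triangular (matrix C)

for M in matrices:
    # Eigenvalues match the diagonal entries, up to ordering
    assert np.allclose(sorted(np.linalg.eigvals(M).real),
                       sorted(np.diag(M)))
```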
End of Example 3
2.1.2. Repeated eigenvalues
In the examples presented up to now the eigenvalues have been distinct but it is possible for a matrix to have repeated eigenvalues.
Example 4
Find the eigenvalues of the matrix, $A = \pmatrix{3 & -9 \\ 1 & \;\;\;9}$ .
Solution
To find the eigenvalues we solve
$\text{det}(A - \lambda I) = \bigg | \matrix{3-\lambda & -9 \\ \;\;1 & 9-\lambda} \bigg | = 0$
$(3 - \lambda)(9 - \lambda) + 9 = 0$
$\lambda^2 - 12\lambda + 36 = 0$
$(\lambda - 6)(\lambda - 6) = 0$
$\lambda = 6$ (repeated).
The eigenvalue $\lambda = 6$ is said to have algebraic multiplicity $2$, i.e. the number of times it is a root of the characteristic equation.
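A NumPy spot-check (assuming NumPy is available): `np.poly` returns the characteristic-polynomial coefficients, which should match $\lambda^2 - 12\lambda + 36 = (\lambda - 6)^2$.

```python
import numpy as np

A = np.array([[3, -9], [1, 9]])
coeffs = np.poly(A)                        # coefficients of det(lambda*I - A)
assert np.allclose(coeffs, [1, -12, 36])   # i.e. (lambda - 6)^2, so lambda = 6 twice
```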
End of Example 4
2.1.3. Zero eigenvalues
We have previously noted that an eigenvector cannot be the zero vector, $\boldsymbol{0}$, but it is possible to have an eigenvalue $\lambda = 0$.
Example 5
Find the eigenvalues and eigenvectors of the matrix,
$A = \pmatrix{\;\;\;3 & -6 \\ -2 & \;\;\;4}$
Solution
To find the eigenvalues we need to solve
$\text{det}(A - \lambda I) = \bigg | \matrix{3-\lambda & -6 \\ -2 & 4-\lambda} \bigg | = 0$
$(3 - \lambda)(4 - \lambda) - 12 = 0$
$\lambda^2 - 7 \lambda = 0$
$\lambda(\lambda - 7) = 0$
$\lambda_1 = 0, \; \lambda_2 = 7$.
This example shows that it is possible for $0$ to be an eigenvalue of a matrix.
Note that if $0$ is an eigenvalue of a matrix then the matrix is not invertible. Hence, the matrix $A$ in this example cannot be inverted.
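Both claims are easy to confirm numerically (assuming NumPy is available): the determinant of $A$ vanishes, and $0$ appears among its eigenvalues.

```python
import numpy as np

A = np.array([[3, -6], [-2, 4]])
assert abs(np.linalg.det(A)) < 1e-9    # det(A) = 0, so A is not invertible
assert np.allclose(sorted(np.linalg.eigvals(A).real), [0, 7])
```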
End of Example 5
2.1.4. Complex eigenvalues of real matrices
It is possible for a real-valued matrix to have complex eigenvalues (and eigenvectors) as illustrated by the following example.
Example 6
Find the eigenvalues of the matrices:
(i). $A = \pmatrix{0 & -1 \\ 1 & \;\;\;0}$ (ii). $B = \pmatrix{4 & -3 \\ 6 & -2}$.
Solution
(i). $\text{det}(A - \lambda I) = \bigg | \matrix{-\lambda & -1 \\ 1 & -\lambda} \bigg | = 0$
$\lambda^2 + 1 = 0$.
$\lambda_1 = j, \; \lambda_2 = -j$.
(ii). $\text{det}(B - \lambda I) = \bigg | \matrix{4-\lambda & -3 \\ 6 & -2-\lambda} \bigg | = \lambda^2 - 2\lambda + 10 = 0$
Solve using the quadratic formula, or by completing the square, to obtain,
$\lambda_1 = 1 + 3j, \; \lambda_2 = 1 - 3j$.
Note: For a matrix with real entries its complex eigenvalues always occur in complex conjugate pairs.
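A NumPy check confirms the conjugate-pair behaviour (assuming NumPy is available; Python writes the imaginary unit as `1j` where the text uses $j$):

```python
import numpy as np

A = np.array([[0, -1], [1, 0]])
B = np.array([[4, -3], [6, -2]])

va = np.linalg.eigvals(A)        # the pair +/- j
vb = np.linalg.eigvals(B)        # the pair 1 +/- 3j
assert np.allclose(va.real, 0) and np.allclose(sorted(va.imag), [-1, 1])
assert np.allclose(vb.real, [1, 1]) and np.allclose(sorted(vb.imag), [-3, 3])
```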
End of Example 6
2.1.5. Verification of eigenvalues for a $2 \times 2$ matrix
For a $2 \times 2$ matrix, $A$, we can verify the eigenvalues of $A$ using either of the following approaches:
(1). For each eigenvalue, $\lambda_i$, show that $\text{det}(A - \lambda_i I) = 0$.
(2). Check that both the following conditions hold:
(i). $\lambda_1 + \lambda_2 = \text{tr}(A)$.
The sum of the eigenvalues of $A$ must equal the trace of $A$.
The trace of matrix $A$, i.e. $\text{tr}(A)$, is the sum of the elements on the main diagonal of $A$.
(ii). $\lambda_1 \lambda_2 = \det(A)$.
The product of the eigenvalues of $A$ must equal the determinant of $A$.
Exercise: Check the eigenvalues calculated for the matrices in Examples 2 - 6 using both the approaches described above.
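The trace and determinant checks in approach (2) are one-liners to automate. A sketch, assuming NumPy is available (the helper name is ours, not from the text):

```python
import numpy as np

def check_eigs(M, l1, l2, tol=1e-9):
    """True iff l1 + l2 = tr(M) and l1 * l2 = det(M) for a 2x2 matrix M."""
    return (abs(l1 + l2 - np.trace(M)) < tol and
            abs(l1 * l2 - np.linalg.det(M)) < tol)

assert check_eigs(np.array([[5, -2], [7, -4]]), -2, 3)     # Example 2(i)
assert check_eigs(np.array([[13, -4], [-4, 7]]), 5, 15)    # Example 2(ii)
assert not check_eigs(np.array([[5, -2], [7, -4]]), 1, 2)  # wrong values fail
```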
2.2. Calculation of eigenvectors
Once we have calculated the eigenvalues we can find the eigenvectors by solving the matrix equation
$A\boldsymbol{x} = \lambda \boldsymbol{x}$ (III)
or equivalently, as we saw above,
$(A - \lambda I)\boldsymbol{x} = \boldsymbol{0}$
for each eigenvalue in turn.
Calculating eigenvalues and eigenvectors:
📹 Find Eigenvalues and Eigenvectors of a $2 \times 2$ Matrix
📹 Linear Algebra: Ch 3 - Eigenvalues and Eigenvectors (4 of 35)
📹 Linear Algebra: Ch 3 - Eigenvalues and Eigenvectors (5 of 35)
📹 Linear Algebra: Ch 3 - Eigenvalues and Eigenvectors (6 of 35)
2.2.1. Eigenvectors for distinct eigenvalues
We shall first of all look at calculating eigenvectors for matrices with real distinct eigenvalues.
Example 7
Find the eigenvalues and eigenvectors of the matrix, $A = \pmatrix{\;\;\;2 & \;\;\;7 \\ -1 & -6}$.
Solution
First we find the eigenvalues by solving:
$\text{det}(A - \lambda I) = \bigg | \matrix{2-\lambda & \;7 \\ -1 & -6-\lambda} \bigg | = 0$
$(2 - \lambda)(-6 - \lambda) + 7 = 0$
$\lambda^2 + 4 \lambda - 5 = 0$
$(\lambda + 5)(\lambda - 1) = 0$
$\lambda_1 = -5, \; \lambda_2 = 1$.
We now calculate the eigenvectors corresponding to the eigenvalues by solving Eq. (III).
Case 1: To find an eigenvector $\boldsymbol{x}_1$ corresponding to eigenvalue $\lambda_1 = -5$ we solve,
$A\;\boldsymbol{x}_1 = \lambda_1\;\boldsymbol{x}_1$.
$\pmatrix{\;\;\;2 & \;\;\;7 \\ -1 & -6}\pmatrix{x_1 \\ x_2} = -5 \pmatrix{x_1 \\ x_2}$
$\bigg \{ \array{2x_1 + 7x_2 = -5x_1 \\ -x_1 - 6x_2 = -5x_2}$
$\bigg \{ \array{7x_1 + 7x_2 = 0 \; ........ \;(1) \\ -x_1 - x_2 = 0 \; ......... \;(2) }$
These are simultaneous equations and we note here that one equation will always be a multiple of the other - if not then you have made a mistake! Here Eq. (1) is $-7$ times Eq. (2).
Both equations give $x_1 = -x_2$. If we let $x_2 = \alpha$, say, for some non-zero real number $\alpha$, then $x_1 = - \alpha$ and we find the first eigenvector to be of the form
$\boldsymbol{x}_1 = \pmatrix{-\alpha \\ \;\;\; \alpha} = \pmatrix{-1 \\ \;\;\;1} \alpha$.
Note that there are infinitely many non-zero eigenvectors depending on the value chosen for $\alpha$. Setting $\alpha = 1$ gives an eigenvector corresponding to the eigenvalue $\lambda = -5$ as $\boldsymbol{x}_1 = \pmatrix{-1 \\ \;\;\;1}$.
We can check our answer by showing that $A\;\boldsymbol{x}_1 = -5\;\boldsymbol{x}_1$.
$A \;\boldsymbol{x}_1 = \pmatrix{\;\;\;2 & \;\;\; 7 \\ -1 & -6}\pmatrix{-1 \\ \;\;\; 1} = \pmatrix{\;\;\;5 \\ -5}$ and $\lambda_1\;\boldsymbol{x}_1 = -5 \pmatrix{-1 \\ \;\;\;1} = \pmatrix{\;\;\;5 \\ -5}$.
Hence, $A\;\boldsymbol{x}_1 = \lambda_1 \; \boldsymbol{x}_1$ as required.
Case 2: To find an eigenvector $\boldsymbol{x}_2$ corresponding to eigenvalue $\lambda_2 = 1$ we solve,
$A\;\boldsymbol{x}_2 = \lambda_2\;\boldsymbol{x}_2$.
$\pmatrix{\;\;\;2 & \;\;\;7 \\ -1 & -6}\pmatrix{x_1 \\ x_2} = \pmatrix{x_1 \\ x_2}$
$\bigg \{ \array{2x_1 + 7x_2 = x_1 \\ -x_1 - 6x_2 = x_2}$
$\bigg \{ \array{x_1 + 7x_2 = 0 \\ -x_1 - 7x_2 = 0}$
Both these equations give $x_1 = -7x_2$. Let $x_2 = \alpha$, say, for some non-zero real number $\alpha$, then $x_1 = -7\alpha$ and so
$\boldsymbol{x}_2 = \pmatrix{-7\alpha \\ \;\;\;\alpha} = \pmatrix{-7 \\ \;\;\;1}\alpha$.
Setting $\alpha = 1$ gives $\boldsymbol{x}_2 = \pmatrix{-7 \\ \;\;\;1}$. It is straightforward to check that $A\;\boldsymbol{x}_2 = 1\;\boldsymbol{x}_2$.
In summary, we therefore have the eigenvalue/eigenvector pairs,
$\lambda_1 = -5, \; \boldsymbol{x}_1 = \pmatrix{-1 \\ \;\;\;1}$; $\lambda_2 = 1, \; \boldsymbol{x}_2 = \pmatrix{-7 \\ \;\;\;1}$.
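Each pair can be verified by checking $A\boldsymbol{x} = \lambda\boldsymbol{x}$ directly, assuming NumPy is available:

```python
import numpy as np

A = np.array([[2, 7], [-1, -6]])
for lam, x in [(-5, np.array([-1, 1])), (1, np.array([-7, 1]))]:
    assert np.allclose(A @ x, lam * x)   # A x = lambda x for each eigenpair
```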
End of Example 7
Example 8
Find the eigenvalues and eigenvectors of the matrix, $B = \pmatrix{\;13 & -4 \\ -4 & \;\;\; 7}$.
Solution
In Example 2 part (ii) we found the eigenvalues of $B$ to be $\lambda_1 = 5$ and $\lambda_2 = 15$.
We now calculate the eigenvectors corresponding to these eigenvalues by solving the eigenvector equation, $B\; \boldsymbol{x} = \lambda \; \boldsymbol{x}$.
Case 1: To find an eigenvector $\boldsymbol{x}_1$ corresponding to eigenvalue $\lambda_1 = 5$ we solve,
$B \;\boldsymbol{x}_1 = \lambda_1 \; \boldsymbol{x}_1$
$\pmatrix{\;13 & -4 \\ -4 & \;\;\;7}\pmatrix{x_1 \\ x_2} = 5\pmatrix{x_1 \\ x_2}$
$\bigg \{ \array{13x_1 - 4x_2 = 5x_1 \\ -4x_1 + 7x_2 = 5x_2}$
$\bigg \{ \array{8x_1 - 4x_2 = 0 \\ -4x_1 + 2x_2 = 0}$
Both these equations give $x_2 = 2x_1$.
Note that for a $2 \times 2$ system we do not actually need to introduce the parameter $\alpha$ as we did in the previous example. We can simply choose a convenient numerical value for either of the components $x_1$ or $x_2$ of the eigenvector. So here we can let $x_1 = 1$, say, giving $x_2 = 2$.
Thus an eigenvector corresponding to the eigenvalue $\lambda_1 = 5$ is $\boldsymbol{x}_1 = \pmatrix{1 \\ 2}$.
Case 2: To find an eigenvector $\boldsymbol{x}_2$ corresponding to eigenvalue $\lambda_2 = 15$ we solve,
$B \;\boldsymbol{x}_2 = \lambda_2 \; \boldsymbol{x}_2$
$\pmatrix{\;13 & -4 \\ -4 & \;\;\;7}\pmatrix{x_1 \\ x_2} = 15\pmatrix{x_1 \\ x_2}$
$\bigg \{ \array{13x_1 - 4x_2 = 15x_1 \\ -4x_1 + 7x_2 = 15x_2}$
$\bigg \{ \array{-2x_1 - 4x_2 = 0 \\ -4x_1 - 8x_2 = 0}$
Both these equations give $x_1 = -2x_2$. Let $x_2 = 1$, say, giving $x_1 = -2$.
Then an eigenvector corresponding to the eigenvalue $\lambda_2 = 15$ is $\boldsymbol{x}_2 = \pmatrix{-2 \\ 1}$.
In summary, we therefore have the eigenvalue/eigenvector pairs,
$\lambda_1 = 5, \; \boldsymbol{x}_1 = \pmatrix{1 \\ 2}$; $\lambda_2 = 15, \; \boldsymbol{x}_2 = \pmatrix{-2 \\ \;\;\;1}$.
End of Example 8
Example 9
Find the eigenvalues and eigenvectors of the matrix, $A = \pmatrix{\;\;\;3 & -6 \\ -2 & \;\;\;4}$.
Solution
To find the eigenvalues we need to solve
$\text{det}(A - \lambda I) = \bigg | \matrix{3 - \lambda & -6 \\ -2 & 4 - \lambda} \bigg | = 0$
$(3 - \lambda)(4 - \lambda) - 12 = 0$
$\lambda^2 - 7\lambda = 0$
$\lambda(\lambda - 7) = 0$
$\lambda_1 = 0, \; \lambda_2 = 7$
We now find the eigenvectors:
Case 1: To find an eigenvector $\boldsymbol{x}_1$ corresponding to eigenvalue $\lambda_1 = 0$ we solve, $A\,\boldsymbol{x}_1 = 0\,\boldsymbol{x}_1$.
$\pmatrix{\;\;\;3 & -6 \\ -2 & \;\;\; 4}\pmatrix{x_1 \\ x_2} = \pmatrix{0 \\ 0}$
$\bigg \{ \array{3x_1 - 6x_2 = 0 \\ -2x_1 + 4x_2 = 0}$.
Both these equations give $x_1 = 2x_2$. Let $x_2 = 1$, say, giving $x_1 = 2$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_1 = 0$ is $\boldsymbol{x}_1 = \pmatrix{2 \\ 1}$.
Case 2: To find an eigenvector $\boldsymbol{x}_2$ corresponding to eigenvalue $\lambda_2 = 7$, solve $A\,\boldsymbol{x}_2 = 7\,\boldsymbol{x}_2 $.
$\pmatrix{\;\;\;3 & -6 \\ -2 & \;\;\; 4}\pmatrix{x_1 \\ x_2} = 7\pmatrix{x_1 \\ x_2}$
$\bigg \{ \array{3x_1 - 6x_2 = 7x_1 \\ -2x_1 + 4x_2 = 7x_2}$.
$\bigg \{ \array{-4x_1 - 6x_2 = 0 \\ -2x_1 - 3x_2 = 0}$.
Both these equations give $x_1 = - {\large\frac{3}{2}}x_2$. Let $x_2 = 2$, say, giving $x_1 = - 3$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_2 = 7$ is $\boldsymbol{x}_2 = \pmatrix{-3 \\ \;\;\;2}$.
To summarise we have:
$\lambda_1 = 0, \; \boldsymbol{x}_1 = \pmatrix{2 \\ 1}$; $\lambda_2 = 7, \; \boldsymbol{x}_2 = \pmatrix{-3 \\ \;\;\;2}$.
End of Example 9
2.2.2. Eigenvectors for repeated eigenvalues
Before we consider the case of repeated eigenvalues we need to introduce the concept of linear independence of vectors. If the eigenvalues of a matrix $A$ are distinct then the corresponding eigenvectors are guaranteed to be linearly independent. Roughly speaking, a set of vectors is linearly independent if none of them can be written as a linear combination of the others. Linear independence will be an important concept later when we discuss diagonalisation of matrices.
The examples presented in this section so far have all involved matrices with distinct eigenvalues and so it has been possible to find a full set of linearly independent eigenvectors in each case, i.e. eigenvectors that are not multiples of each other. However, as we demonstrate below, when we have repeated eigenvalues it will not always be possible to find a full set of linearly independent eigenvectors. First of all we look at a case where we can find linearly independent eigenvectors.
Example 10
Find the eigenvalues and eigenvectors of the matrix, $A = \pmatrix{3 & 0 \\ 0 & 3}$.
Solution
The matrix $A$ is a diagonal matrix so that its eigenvalues are the entries on the main diagonal. Hence, $A$ has one repeated eigenvalue, i.e. $\lambda = 3$. We now try to find two linearly independent eigenvectors for this eigenvalue. Solve $A\pmb{x} = 3\pmb{x}$ , i.e.
$\pmatrix{3 & 0 \\ 0 & 3}\pmatrix{x_1 \\x_2} = 3\pmatrix{x_1 \\x_2}$
$\Bigg \{ \matrix{3x_1 + 0x_2 = 3x_1 \\ 0x_1 + 3x_2 = 3x_2}$
$\Bigg \{ \matrix{0x_1 + 0x_2 = 0 \\ 0x_1 + 0x_2 = 0}$
Neither of these equations place any restrictions on $x_1$ or $x_2$ and so they can take any values we choose, say $x_1 = \alpha$ and $x_2 = \beta$ . An eigenvector corresponding to the repeated eigenvalue $\lambda = 3$ will therefore have the form, $\pmb{x} = \pmatrix{\alpha \\ \beta}$. We can now obtain two linearly independent eigenvectors through suitable choices for $\alpha$ and $\beta$. The most obvious choices are:
- $\alpha = 1$, $\beta = 0$ giving the eigenvector, $\pmb{x}_1 = \pmatrix{1 \\ 0}$ and
- $\alpha = 0$, $\beta = 1$ giving the eigenvector, $\pmb{x}_2 = \pmatrix{0 \\ 1}$.
We have therefore been able to find two linearly independent eigenvectors for the repeated eigenvalue $\lambda = 3$.
End of Example 10
Example 11
Find the eigenvalues and eigenvectors of the matrix, $A = \pmatrix{3 & -9 \\ 1 & \;\;\,9}$.
Solution
In Example 4 we found that the matrix has one repeated eigenvalue, i.e. $\lambda = 6$. We now try to find two linearly independent eigenvectors for this eigenvalue. Solve $A\pmb{x} = 6\pmb{x}$, i.e.
$\pmatrix{3 & -9 \\ 1 & \;\;\,9}\pmatrix{x_1 \\x_2} = 6\pmatrix{x_1 \\x_2}$
$\Bigg \{ \matrix{3x_1 - 9x_2 = 6x_1 \\ \;x_1 + 9x_2 = 6x_2}$
$\Bigg \{ \matrix{-3x_1 - 9x_2 = 0 \\ \;\;\;\,x_1 + 3x_2 = 0}$
Both these equations give $x_1 = -3x_2$. Let $x_2 = \alpha$, $(\alpha \neq 0 )$ giving $x_1 = -3\alpha$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda = 6$ will be of the form $\pmb{x} = \pmatrix{-3 \\ \;\;\;1}\alpha$.
No matter what choice is made for $\alpha$ the resulting eigenvectors will be scalar multiples of each other. We have therefore been unable to find two linearly independent eigenvectors for the eigenvalue $\lambda = 6$.
End of Example 11
2.2.3. Eigenvectors for complex eigenvalues
In Section 2.1.4 we saw that if a matrix $A$ with real entries has a complex eigenvalue $\lambda$ then we know that its complex conjugate $\bar{\lambda}$ is also an eigenvalue of $A$. Furthermore, it can be shown that if $\pmb{x}$ is an eigenvector corresponding to $\lambda$ then its complex conjugate $\bar{\pmb{x}}$, formed by taking the complex conjugates of the entries of $\pmb{x}$, is an eigenvector corresponding to $\bar{\lambda}$.
Example 12
Find the eigenvectors of the matrix, $A = \pmatrix{0 & -1 \\ 1 & \;\;\;0}$.
Solution
In Example 6(i) we found that $A$ had complex eigenvalues, $\lambda_1 = j$ and $\lambda_2 = - j$.
We now calculate the eigenvectors of the matrix $A$.
Case 1: To find an eigenvector $\boldsymbol{x}_1$ corresponding to eigenvalue $\lambda_1 = j$ we solve
$A\,\boldsymbol{x}_1 = \lambda_1\,\boldsymbol{x}_1$.
$\pmatrix{0 & -1 \\ 1 & \;\;\; 0}\pmatrix{x_1 \\ x_2} = j\pmatrix{x_1 \\ x_2}$
$\bigg \{ \array{-x_2 = j\,x_1 \\\;\; x_1 = j\,x_2}$.
If, for example, we multiply the first equation by $j$, both equations give $x_1 = j\,x_2$.
Let $x_2 = 1$, say, then $x_1 = j$.
An eigenvector corresponding to the eigenvalue $\lambda_1 = j$ will then be $\boldsymbol{x}_1 = \pmatrix{j \\ 1}$.
Case 2: To find an eigenvector $\pmb{x_2}$ corresponding to eigenvalue $\lambda_2 = - j$ simply take the complex conjugates of the entries of $\boldsymbol{x}_1$ giving, $\boldsymbol{x}_2 = \pmatrix{-j \\ \;\;\;1}$.
To summarise we have:
$\lambda_1 = j, \; \boldsymbol{x}_1 = \pmatrix{j \\ 1}$; $\lambda_2 = -j, \; \boldsymbol{x}_2 = \pmatrix{-j \\ \;\;\;1}$.
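Python's built-in complex numbers let us verify both pairs without any libraries (Python writes the imaginary unit $j$ as `1j`):

```python
A = [[0, -1], [1, 0]]

def matvec(M, v):
    # 2x2 complex matrix-vector product
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

x1 = [1j, 1 + 0j]
assert matvec(A, x1) == [1j * c for c in x1]       # A x1 = j x1

x2 = [c.conjugate() for c in x1]                   # conjugate eigenvector
assert matvec(A, x2) == [-1j * c for c in x2]      # A x2 = -j x2
```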
End of Example 12
-
3. Eigenvalues and eigenvectors of a $3 \times 3$ matrix
We now extend the methods presented in the previous section to the calculation of eigenvalues and eigenvectors of $3 \times 3$ matrices. We shall only consider the case of real distinct eigenvalues but note that, as for $2 \times 2$ matrices, we can have eigenvalues that are repeated or complex.
Example 13
Determine the eigenvalues and eigenvectors of the matrix, $A = \pmatrix{\;\;\;1 & 0 & 2 \\ -7 & 2 & 4 \\ \;\;\;8 & 0 & 1}$
Solution
To calculate the eigenvalues we need to solve
\begin{align} \text{det}(A - \lambda I) = \left| \begin{matrix} 1-\lambda & 0 & 2 \\ -7 & 2-\lambda & 4 \\ 8 & 0 & 1-\lambda \end{matrix} \right| = 0 \end{align}.
Here we will expand down Column 2 as it is the row/column with the most zeros.
$(2 - \lambda) \bigg | \matrix{1-\lambda & 2 \\ 8 & 1-\lambda} \bigg | = 0$
$(2 - \lambda)[(1 - \lambda)(1-\lambda)-16] = 0$
$(2-\lambda)[\lambda^2 - 2\lambda - 15] = 0$
$(2 - \lambda)(\lambda - 5)(\lambda + 3) = 0$
Hence, $\lambda_1 = 2, \; \lambda_2 = 5$ and $\lambda_3 = -3$ are the eigenvalues of the matrix $A$.
We now calculate the eigenvectors corresponding to the eigenvalues.
Case 1: For an eigenvector $\boldsymbol{x}_1$, corresponding to eigenvalue $\lambda_1 = 2$, we solve $A\, \boldsymbol{x}_1 = 2\,\boldsymbol{x}_1$, i.e.
$\pmatrix{\;\;\;1 & 0 & 2 \\ -7 & 2 & 4 \\ \;\;\;8 & 0 & 1}\pmatrix{x_1 \\ x_2 \\ x_3} = 2 \pmatrix{x_1 \\ x_2 \\ x_3}$
$\Bigg \{ \matrix{\;x_1 \;\;\;\;\;\; &+& 2x_3 = 2x_1 \\ -7x_1 + 2x_2 &+& 4x_3 = 2x_2 \\ 8x_1 \;\;\;\;\;\; &+& \;\;x_3 = 2x_3}$
$\Bigg \{ \matrix{\;-x_1\;\;\; & + & 2x_3 = 0 \\ -7x_1 & + & 4x_3 = 0 \\ \;\;\,8x_1 & - & \;\;x_3 = 0}$.
From the three equations the only possibility is that $x_1 = x_3 = 0$ . We can choose $x_2$ to have any value, $\alpha \neq 0$. Hence, an eigenvector corresponding to the eigenvalue $\lambda_1 = 2$ is of the form $ \boldsymbol{x}_1 = (0, \; \alpha, \; 0)^T $. For example, letting $\alpha = 1$ gives $\boldsymbol{x}_1 = (0, \; 1, \; 0)^T$.
Case 2: For an eigenvector $\boldsymbol{x}_2$, corresponding to eigenvalue $\lambda_2 = 5$, we solve $A\,\boldsymbol{x}_2 = 5\,\boldsymbol{x}_2$, i.e.
$\pmatrix{\;\;\;1 & 0 & 2 \\ -7 & 2 & 4 \\ \;\;\;8 & 0 & 1}\pmatrix{x_1 \\ x_2 \\ x_3} = 5 \pmatrix{x_1 \\ x_2 \\ x_3}$
$\Bigg \{ \matrix{\;x_1 \;\;\;\;\;\; &+& 2x_3 = 5x_1 \\ -7x_1 + 2x_2 &+& 4x_3 = 5x_2 \\ 8x_1 \;\;\;\;\;\; &+& \;\;x_3 = 5x_3}$
$\Bigg \{ \matrix{-4x_1 \;\;\;\;\;\;\;\;\;\;& + & 2x_3 = 0 \;.........(1) \\ -7x_1 - 3x_2 & + & 4x_3 = 0\; .........(2) \\ \;\;\;8x_1 \;\;\;\;\;\;\;\;\;\; & - & 4x_3 = 0 \; ..........(3)}$.
Equations (1) and (3) both say that $x_3 = 2x_1$. Set $x_1 = \alpha \;\,(\alpha \neq 0)$ to obtain $x_3 = 2\alpha$. Substituting these values in Eq. (2) gives, $-7\alpha - 3x_2 + 8\alpha = 0 \Rightarrow 3x_2 = \alpha \Rightarrow x_2 = \alpha / 3$. Hence, an eigenvector corresponding to the eigenvalue $\lambda_2 = 5$ is of the form $\boldsymbol{x}_2 = (\alpha, \; \alpha/3, \; 2\alpha)^T$. Choosing $\alpha = 3$ gives $\boldsymbol{x}_2 = (3, \; 1, \; 6)^T$.
Case 3: For an eigenvector $\boldsymbol{x}_3$, corresponding to eigenvalue $\lambda_3 = - 3$, solve $A\,\boldsymbol{x}_3 = - 3\, \boldsymbol{x}_3$, i.e.
$\pmatrix{\;\;\;1 & 0 & 2 \\ -7 & 2 & 4 \\ \;\;\;8 & 0 & 1}\pmatrix{x_1 \\ x_2 \\ x_3} = -3 \pmatrix{x_1 \\ x_2 \\ x_3}$
$\Bigg \{ \matrix{\;x_1 \;\;\;\;\;\; &+& 2x_3 = -3x_1 \\ -7x_1 + 2x_2 &+& 4x_3 = -3x_2 \\ 8x_1 \;\;\;\;\;\; &+& \;\;x_3 = -3x_3}$
$\Bigg \{ \matrix{4x_1\;\;\;\;\;\;\; & + & 2x_3 = 0 \; .........(1) \\ -7x_1 + 5x_2 & + & 4x_3 = 0\; .........(2) \\ 8x_1 \;\;\;\;\;\;\;& + & 4x_3 = 0\; .........(3)}$.
Equations (1) and (3) both say that $x_3 = -2x_1$. Set $x_1 = \alpha \;\, (\alpha \neq 0)$ to obtain $x_3 = -2\alpha$. Substituting these values in Eq. (2) gives, $-7\alpha + 5x_2 - 8\alpha = 0 \Rightarrow 5x_2 = 15\alpha \Rightarrow x_2 = 3\alpha$. Hence, an eigenvector corresponding to the eigenvalue $\lambda_3 = -3$ is of the form $\boldsymbol{x}_3 = (\alpha, \; 3\alpha, \; -2\alpha)^T$. Choosing $\alpha = 1$ gives $\boldsymbol{x}_3 = (1, \; 3, \; -2)^T$.
In summary, we therefore have,
$\lambda_1 = 2, \; \boldsymbol{x}_1 = \pmatrix{0 \\ 1 \\ 0}$; $\lambda_2 = 5, \; \boldsymbol{x}_2 = \pmatrix{3 \\ 1 \\ 6}$; $\lambda_3 = -3, \; \boldsymbol{x}_3 = \pmatrix{\;\;\;1 \\ \;\;\;3 \\ -2}$.
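As in the $2 \times 2$ examples, each pair can be verified numerically (assuming NumPy is available):

```python
import numpy as np

A = np.array([[1, 0, 2], [-7, 2, 4], [8, 0, 1]])
pairs = [(2, [0, 1, 0]), (5, [3, 1, 6]), (-3, [1, 3, -2])]
for lam, x in pairs:
    assert np.allclose(A @ np.array(x), lam * np.array(x))  # A x = lambda x
```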
End of Example 13
-
4. Some properties of eigenvalues and eigenvectors
Let $A$ be a real $n \times n$ matrix.
- $A$ will have exactly $n$ eigenvalues which may be repeated and will be real or occur in complex conjugate pairs.
- An eigenvalue can be zero but an eigenvector cannot be the zero vector, $\mathbf{0}$.
- The sum of the eigenvalues of $A$ equals the sum of the main diagonal entries of $A$, i.e. the trace of $A$.
- The product of the eigenvalues of $A$ equals the determinant of $A$.
- If $0$ is an eigenvalue of $A$ then $A$ is not invertible.
- If $\lambda$ is an eigenvalue of an invertible matrix $A$, with $\boldsymbol{x}$ as a corresponding eigenvector, then ${\Large\frac{1}{\lambda}}$ is an eigenvalue of $A^{-1}$, again with $\boldsymbol{x}$ as a corresponding eigenvector.
- If $\lambda$ is an eigenvalue of $A$, with $\boldsymbol{x}$ as a corresponding eigenvector, then $\lambda^k$ is an eigenvalue of $A^k$, again with $\boldsymbol{x}$ as a corresponding eigenvector, for any positive integer $k$.
- The matrix $A$ and its transpose, $A^T$, have the same eigenvalues but there is no simple relationship between their eigenvectors.
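Several of these properties are easy to spot-check numerically, here with the matrix from Example 2(i) (assuming NumPy is available):

```python
import numpy as np

A = np.array([[5, -2], [7, -4]])
vals = np.linalg.eigvals(A)                          # -2 and 3

assert np.isclose(vals.sum(), np.trace(A))           # sum of eigenvalues = tr(A)
assert np.isclose(vals.prod(), np.linalg.det(A))     # product = det(A)

# lambda^3 is an eigenvalue of A^3 ...
cubed = np.linalg.eigvals(np.linalg.matrix_power(A, 3))
assert np.allclose(sorted(cubed.real), sorted(vals.real**3))
# ... and A and its transpose share eigenvalues.
assert np.allclose(sorted(np.linalg.eigvals(A.T).real), sorted(vals.real))
```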
Procedure for calculating eigenvalues and eigenvectors
To calculate the eigenvalues and eigenvectors of an $n \times n$ matrix $A$ we proceed as follows:
1. Calculate the determinant of the matrix $A - \lambda I$; it will be a polynomial in $\lambda$ of degree $n$.
2. Find the roots of the polynomial by solving $\text{det}(A - \lambda I) = 0$. The $n$ roots of the polynomial are the eigenvalues of the matrix $A$.
3. For each eigenvalue, $\lambda$, solve $A \boldsymbol{x} = \lambda\boldsymbol{x}$ to find an eigenvector $\boldsymbol{x}$.
-
5. Diagonalisation of matrices
Eigenvalues and eigenvectors play an important role in solving systems of ordinary differential equations (ODEs). A system of coupled ODEs can be transformed into a set of independent, uncoupled equations by diagonalising the system matrix using a similarity transformation. We can then apply techniques such as the integrating factor method or Laplace transforms to solve each of the resulting ODEs. The determination of eigenvalues and eigenvectors, through matrix diagonalisation, also features prominently in the stability analysis of control systems.
5.1. Introduction
Consider an $n \times n$ matrix $A$ having $n$, not necessarily distinct, eigenvalues $\lambda_1, \lambda_2, \dots, \lambda_n$ with $n$ corresponding eigenvectors $\pmb{x_1, x_2, \dots,x_n}$ that are linearly independent (see item (i) below). We say that $A$ is diagonalisable if there exists an $n \times n$ invertible matrix $P$ and an $n \times n$ diagonal matrix $D$ such that
$P^{-1}A\,P=D$.
The columns of $P$ are the eigenvectors of $A$ and the diagonal entries of $D$ are the corresponding eigenvalues. Note that we can write the columns of $P$ in any order provided the components of $D$ are written in the same order.
The diagonal matrix $D$ will have the same eigenvalues, $\lambda_1, \lambda_2, \dots, \lambda_n$ , as the matrix $A$ and so $D$ and $A$ are called similar matrices.
Notes
(i). A set of eigenvectors is linearly independent if no one eigenvector in the set can be written as a linear combination of the other eigenvectors in the set.
(ii). An $n \times n$ matrix $A$ is guaranteed to be diagonalisable if:
- all its eigenvalues are real and distinct so that the corresponding eigenvectors (columns of $P$) are linearly independent or,
- $A$ is a symmetric matrix (even if it has repeated eigenvalues).
(iii). A matrix $A$ can have repeated eigenvalues but still be diagonalisable.
5.2. Matrices with distinct eigenvalues
In this section we illustrate diagonalisation of matrices with distinct eigenvalues by means of examples.
Example 14
(i). Calculate the eigenvalues and eigenvectors of the matrix
$A = \pmatrix{-4 & -6 \\ \,\,\,\,3 & \,\,\,\,5}$.
(ii). Determine an invertible matrix $P$ and a diagonal matrix $D$ such that $P^{-1}A\,P=D$.
Solution
(i). First we find the eigenvalues by solving,
$\text{det}(A - \lambda I) = \bigg | \matrix{-4-\lambda & -6 \\3 & 5-\lambda} \bigg | = 0$
$\Rightarrow (-4 - \lambda)(5 - \lambda) + 18 = 0$
$\Rightarrow \lambda^2 - \lambda - 2 = 0$
$\Rightarrow (\lambda - 2)(\lambda + 1) = 0$
$\Rightarrow \lambda_1 = 2,\; \lambda_2 = -1.$
As the eigenvalues of the matrix $A$ are real and distinct $A$ is diagonalisable.
We now calculate the eigenvectors corresponding to each of the eigenvalues.
- For $\lambda_1 = 2$ we have $A\pmb{x}_1 = \lambda_1 \pmb{x}_1$ and so we need to solve,
$\pmatrix{-4 & -6 \\ \;\,\,3 & \;\,\,5}\pmatrix{x_1 \\ x_2} = 2\pmatrix{x_1 \\ x_2}$
$\Rightarrow \bigg \{ \matrix{-4x_1 - 6x_2 = 2x_1 \\ \;\;3x_1 + 5x_2 = 2x_2}$
$\Rightarrow \bigg \{ \matrix{-6x_1 - 6x_2 = 0 \\ \;\;\,3x_1 + 3x_2 = 0.}$
Both these equations give $x_1 = -x_2$ and if we let $x_2 = 1$ then $x_1 = -1$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_1 = 2$ is $\pmb{x}_1 = \pmatrix{-1 \\ \;\;\,1}$ .
- For $\lambda_2 = -1$ we have $A\pmb{x}_2 = \lambda_2 \pmb{x}_2$ and so we need to solve,
$\pmatrix{-4 & -6 \\ \;\;\,3 & \;\;\,5}\pmatrix{x_1 \\ x_2} = -1\pmatrix{x_1 \\ x_2}$
$\Rightarrow \bigg \{ \matrix{-4x_1 - 6x_2 = -x_1 \\ \;\;\,3x_1 + 5x_2 = -x_2}$
$\Rightarrow \bigg \{ \matrix{-3x_1 - 6x_2 = 0 \\ \;\;\,3x_1 + 6x_2 = 0}$
Both these equations give $x_1 = -2x_2$ and if we let $x_2 = 1$ then $x_1 = -2$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_2 = - 1$ is $\pmb{x}_2 = \pmatrix{-2 \\ \;\;\,1}$.
The eigenvalues and eigenvectors of $A$ are therefore,
$\lambda_1 = 2$, $\pmb{x}_1 = \pmatrix{-1 \\ \;\;\,1}$; $\lambda_2 = -1$, $\pmb{x}_2 = \pmatrix{-2 \\ \;\;\, 1}$.
(ii). Using the above results we can define the invertible matrix,
$P = \pmatrix{-1 & -2 \\ \;\;\,1 & \;\;\,1}$.
Now form the diagonal matrix $D$ by writing the eigenvalues on the main diagonal of $D$ in the same order the corresponding eigenvectors appear in $P$, i.e.
$D = \pmatrix{2 & \;\;\,0 \\ 0 & -1}$
We now check that $P^{-1} A P = D$ as required.
Since we have a $2 \times 2$ matrix we easily find that $P^{-1} = \pmatrix{\;\;\,1 & \;\;\,2 \\ -1 & -1}$ and so $P^{-1} A P = \pmatrix{\;\;\,1 & \;\;\,2 \\-1 & -1}\pmatrix{-4 & -6 \\ \;\;\,3 & \;\;\,5}\pmatrix{-1 & -2 \\ \;\;\,1 & \;\;\, 1} = \pmatrix{2 & 4 \\ 1 & 1}\pmatrix{-1 & -2 \\ \;\;\,1 & \;\;\,1} = \pmatrix{2 & \;\;\,0 \\ 0 & -1} = D$ as required.
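The check above is done by hand. As a sketch only (assuming numpy is available — software is not part of these notes), the same verification can be scripted:

```python
import numpy as np

A = np.array([[-4.0, -6.0], [3.0, 5.0]])
P = np.array([[-1.0, -2.0], [1.0, 1.0]])   # eigenvectors as columns
D = np.diag([2.0, -1.0])                   # eigenvalues in the same order

# P^{-1} A P should reproduce the diagonal matrix of eigenvalues
check = np.linalg.inv(P) @ A @ P
print(np.allclose(check, D))   # True
```

Any non-zero scalar multiples of the eigenvectors would work equally well as the columns of $P$.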
End of Example 14
Example 15
(i). Determine the eigenvalues and eigenvectors of the matrix,
$A = \pmatrix{\;-5 & 2 & 1 \\ \;-8 & 3 & 2 \\ -16 & 2 & 6}$.
(ii). Write down an invertible matrix $P$ and a diagonal matrix $D$ such that $P^{-1} A P = D$ .
Solution
The eigenvalues are calculated using,
$\text{det}(A - \lambda I) = \left | \matrix{-5-\lambda & 2 & 1 \\ -8 & 3-\lambda & 2 \\ -16 & 2 & 6-\lambda} \right | = 0$
$\Rightarrow (-5-\lambda) \left|\matrix{3 - \lambda & 2 \\ 2 & 6-\lambda}\right| - 2 \left | \matrix{-8 & 2 \\ -16 & 6-\lambda} \right| + 1 \left | \matrix{-8 & 3-\lambda \\ -16 & 2} \right| = 0$
$\Rightarrow (-5-\lambda)[(3-\lambda)(6-\lambda) - 4] - 2 [-8(6-\lambda) + 32] + 1 [-16 + 16(3 - \lambda)] = 0$
$\Rightarrow (-5-\lambda)[\lambda^2 - 9\lambda + 14] - 2 [8\lambda - 16] + [32 - 16\lambda] = 0$
$\Rightarrow (-5-\lambda)(\lambda-2)(\lambda-7) - 2[8(\lambda-2)] + 16(2 - \lambda) = 0$
$\Rightarrow (-5-\lambda)(\lambda-2)(\lambda-7) - 16(\lambda-2) - 16(\lambda-2) = 0$
$\Rightarrow (\lambda-2)(-5-\lambda)(\lambda-7) - 32(\lambda-2) = 0$
$\Rightarrow (\lambda-2)[(-5-\lambda)(\lambda-7)-32] = 0$
$\Rightarrow (\lambda-2)[-\lambda^2 + 2\lambda+ 3] = 0$
$\Rightarrow -(\lambda-2)[\lambda^2 - 2\lambda - 3] = 0$
$\Rightarrow -(\lambda-2)(\lambda-3)(\lambda+1) = 0$
$\Rightarrow \lambda_1 = 2, \;\lambda_2 = 3, \; \lambda_3 = -1$.
We now calculate the eigenvectors corresponding to each of the eigenvalues.
Case 1: For an eigenvector $\pmb{x}_1$, corresponding to eigenvalue $\lambda_1 = 2$ , we solve $A\pmb{x}_1 = 2\pmb{x}_1$ , i.e.
$\pmatrix{-5 & 2 & 1 \\ -8 & 3 & 2 \\ -16 & 2 & 6}\pmatrix{x_1 \\x_2 \\x_3} = 2 \pmatrix{x_1\\x_2\\x_3}$
$\Bigg\{ \array{-5x_1 + 2x_2 + x_3 = 2x_1 \\ -8x_1 + 3x_2 + 2x_3 = 2x_2 \\ -16x_1 + 2x_2 + 6x_3 = 2x_3}$
$\Bigg \{ \array{-7x_1 + 2x_2 + x_3 = 0 \dots\dots(1) \\-8x_1 + x_2 + 2x_3 = 0 \dots\dots(2) \\ -16x_1 + 2x_2 + 4x_3 = 0 \dots\dots(3)}$.
Now, (1) $-$ (3) gives $9x_1 - 3x_3 = 0 \Rightarrow x_3 = 3x_1$.
Then (2) gives $-8x_1 + x_2 + 6x_1 = 0 \Rightarrow -2x_1 + x_2 = 0 \Rightarrow x_2 = 2x_1$.
Let $x_1 = 1$ then $x_2 = 2$ and $x_3 = 3$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_1 = 2$ is $\pmb{x}_1 = \pmatrix{1 \\ 2 \\ 3}$.
Case 2: For an eigenvector $\pmb{x}_2$ , corresponding to eigenvalue $\lambda_2 = 3$ , we solve $A \pmb{x}_2 = 3 \pmb{x}_2$, i.e.
$\pmatrix{-5 & 2 & 1 \\ -8 & 3 & 2 \\ -16 & 2 & 6} \pmatrix{x_1 \\ x_2 \\x_3} = 3\pmatrix{x_1 \\ x_2 \\x_3}$
$\Bigg \{ \array{-5x_1 + 2x_2 + x_3 = 3x_1 \\ -8x_1 + 3x_2 + 2x_3 = 3x_2 \\ -16x_1 + 2x_2 + 6x_3 = 3x_3}$
$\Bigg \{ \array{-8x_1 + 2x_2 + x_3 = 0 \dots\dots(1)\;\; \\-8x_1\;\;\;\;\;\;\;\;\;\;\;+ 2x_3 = 0 \dots\dots(2)\,\\-16x_1 + 2x_2 + 3x_3 = 0 \dots\dots(3)\;\;}$.
From Equation (2) we have that $x_3 = 4x_1$.
Also, (3) $- 2 \times$ (1) gives, $-2x_2 + x_3 = 0 \Rightarrow x_3 = 2x_2$.
Let $x_1 = 1$ then $x_3 = 4$ and $x_2 = 2$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_2 = 3$ is $\pmb{x}_2 = \pmatrix{1\\2\\4}$.
Case 3: For an eigenvector $\pmb{x}_3$, corresponding to eigenvalue $\lambda_3 = -1$, solve $A \pmb{x}_3 = -\pmb{x}_3 $, i.e.
$\pmatrix{-5 & 2 & 1 \\ -8 & 3 & 2 \\ -16 & 2 & 6} \pmatrix{x_1 \\ x_2 \\ x_3} = -1 \pmatrix{x_1 \\ x_2 \\ x_3}$
$\Bigg \{ \array{-5x_1 + 2x_2 + x_3 = -x_1 \\ -8x_1 + 3x_2 + 2x_3 = -x_2 \\ -16x_1 + 2x_2 + 6x_3 = -x_3}$
$\Bigg \{ \array{-4x_1 + 2x_2 + x_3 = 0 \dots\dots(1) \\ -8x_1 + 4x_2 + 2x_3 = 0 \dots\dots(2) \\ -16x_1 + 2x_2 + 7x_3 = 0 \dots\dots(3)}$.
Now, (1) $-$ (3) gives $12x_1 - 6x_3 = 0 \Rightarrow x_3 = 2x_1$.
Substitute in (2) to give, $-8x_1 + 4x_2 + 4x_1 = 0 \Rightarrow -4x_1 + 4x_2 = 0 \Rightarrow x_1 = x_2$.
Let $x_1 = 1$ then $x_2 = 1$ and $x_3 = 2$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_3 = -1$ is $\pmb{x}_3 = \pmatrix{1\\1\\2}$.
In summary, the eigenvalue/eigenvectors pairs are
$\lambda_1 = 2, \; \pmb{x}_1 = \pmatrix{1 \\ 2 \\ 3}$; $\lambda_2 = 3, \; \pmb{x}_2 = \pmatrix{1 \\ 2 \\ 4}$; $\lambda_3 = -1, \; \pmb{x}_3 = \pmatrix{1 \\ 1 \\ 2}$
(ii). As the eigenvalues of $A$ are distinct $A$ is diagonalisable and we can define
$P = \pmatrix{1 & 1 & 1 \\ 2 & 2 & 1 \\ 3 & 4 & 2}$ and $D = \pmatrix{2 & 0 & \;\;\,0 \\0 & 3 & \;\;\,0 \\ 0 & 0 & -1}$ so that $P^{-1} A P = D$.
As a check we could calculate $P^{-1}$ and form the matrix product $P^{-1} A P$, but we would prefer to avoid computing $P^{-1}$ for a $3 \times 3$ matrix. Alternatively, we note that $P^{-1}A P = D$ is equivalent to writing $AP = PD$, and we show this latter relationship holds.
Here
$AP = PD = \pmatrix{2 & \;3 & -1 \\ 4 & \;6 & -1 \\6 & 12 & -2}$
thereby confirming that our calculations are correct.
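The same $AP = PD$ check can be scripted — a minimal sketch, assuming numpy is available:

```python
import numpy as np

A = np.array([[-5, 2, 1], [-8, 3, 2], [-16, 2, 6]], dtype=float)
P = np.array([[1, 1, 1], [2, 2, 1], [3, 4, 2]], dtype=float)
D = np.diag([2.0, 3.0, -1.0])

# AP = PD verifies the diagonalisation without forming P^{-1}
print(np.allclose(A @ P, P @ D))   # True
```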
End of Example 15
5.3. Matrices with repeated eigenvalues
A matrix with repeated eigenvalues may or may not be diagonalisable. If we are able to find a full set of linearly independent eigenvectors then we can diagonalise the matrix; if we are unable to, the matrix cannot be diagonalised. We shall illustrate with examples.
Example 16
In Example 11 (Section 2.2.2) we saw that the matrix $A = \pmatrix{3 & -9 \\ 1 & \;\;\;9}$ has a repeated eigenvalue, $\lambda = 6$ and that all possible eigenvectors are scalar multiples of $\pmb{x} = \pmatrix{-3 \\\;\;\;1}$. It is therefore impossible to find two linearly independent eigenvectors. A consequence of this is that the matrix $P$ will have columns that are scalar multiples of each other meaning that the determinant of $P$ is zero so that the inverse does not exist. Hence, $A$ cannot be diagonalised.
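The failure can also be seen from the rank of $A - 6I$: its null space is the eigenspace for $\lambda = 6$, and rank 1 means that eigenspace is only one-dimensional. A sketch, assuming numpy:

```python
import numpy as np

A = np.array([[3.0, -9.0], [1.0, 9.0]])
# eigenspace for the repeated eigenvalue 6 = null space of (A - 6I);
# rank 1 means the eigenspace is one-dimensional, so A is defective
print(np.linalg.matrix_rank(A - 6 * np.eye(2)))   # 1
```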
End of Example 16
Example 17
For the matrix $A = \pmatrix{1 & -3 & 3 \\ 0 & -5 & 6 \\ 0 & -3 & 4}$ we can show that the characteristic equation is
$(\lambda + 2)(\lambda - 1)^2 = 0$.
Hence, the eigenvalues are $\lambda_1 = -2$, $\lambda_2 = \lambda_3 = 1$.
We now calculate the eigenvectors noting that the eigenvalue $\lambda = 1$ is repeated.
Case 1: For eigenvector $\pmb{x}_1$, corresponding to eigenvalue $\lambda_1 = -2$, solve $A\pmb{x}_1 = -2\pmb{x}_1 $, i.e.
$\pmatrix{1 & -3 & 3 \\ 0 & -5 & 6 \\ 0 & -3 & 4}\pmatrix{x_1 \\ x_2 \\ x_3} = -2 \pmatrix{x_1 \\ x_2 \\ x_3}$
$\Rightarrow \Bigg \{ \array{x_1 - 3x_2 + 3x_3 = -2x_1 \\ \;\;\;-5x_2 \;+ 6x_3 = -2x_2 \\ \;\;\;-3x_2 \;+ 4x_3 = -2x_3}$
$\Rightarrow \Bigg \{ \array{3x_1 - 3x_2 + 3x_3 = 0 \dots\dots(1) \\ \;\;\;\;\;\;-3x_2 \;+ 6x_3 = 0 \dots \dots(2) \\ \;\;\;\;\;\;-3x_2 \;+ 6x_3 = 0\dots\dots(3)}$
Equations (2) and (3) both give $x_2 = 2x_3$.
Substitute in Eq. (1) giving,
$3x_1 - 6x_3 + 3x_3 = 0 \Rightarrow x_1 = x_3$.
Let $x_3 = 1 \Rightarrow x_2 = 2$, $x_1 = 1$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_1 = -2$ is, $\pmb{x}_1 = \pmatrix{1 \\ 2 \\ 1}$.
Case 2: For the repeated eigenvalue $\lambda_2 = \lambda_3 = 1$ we try to find two linearly independent eigenvectors. Solve $A\pmb{x} = \pmb{x}$, i.e.,
$\pmatrix{1 & -3 & 3 \\ 0 & -5 & 6 \\ 0 & -3 & 4}\pmatrix{x_1 \\ x_2 \\ x_3} = 1 \pmatrix{x_1 \\ x_2 \\ x_3}$
$\Bigg \{ \array{x_1 - 3x_2 + 3x_3 = x_1 \\ \;\;\;\;-5x_2 \;+ 6x_3 = x_2 \\ \;\;\;\;-3x_2 \;+ 4x_3 = x_3}$
$\Bigg \{ \array{-3x_2 + 3x_3 = 0 \dots\dots(1) \\ -6x_2 \;+ 6x_3 = 0 \dots \dots(2) \\ -3x_2 + 3x_3 = 0\dots\dots(3)}$
All three equations give $x_2 = x_3$ and so we can set $x_2 = x_3 = \alpha$, $(\alpha \neq 0 )$.
As the equations are all independent of $x_1$ it can take any value, say $x_1 = \beta$.
- An eigenvector corresponding to the repeated eigenvalue $\lambda = 1$ will therefore have the form, $\pmb{x} = \pmatrix{\beta \\ \alpha \\ \alpha}$. We can now obtain two linearly independent eigenvectors through suitable choices for $\alpha$ and $\beta$. The most obvious choices are:
- $\alpha = 1$, $\beta = 0$ giving the eigenvector, $\pmb{x}_2 = \pmatrix{0 \\ 1 \\ 1}$ and
- $\alpha = 0$, $\beta = 1$ giving the eigenvector, $\pmb{x}_3 = \pmatrix{1 \\ 0 \\ 0}$.
In summary, we have
$\lambda_1 = -2, \pmb{x}_1 = \pmatrix{1 \\ 2 \\ 1}$; $\lambda_2 = 1, \pmb{x}_2 = \pmatrix{0 \\ 1 \\ 1}$; $\lambda_3 = 1, \pmb{x}_3 = \pmatrix{1 \\ 0 \\ 0}$.
We have therefore found three linearly independent eigenvectors and so $A$ is diagonalisable with
$P = \pmatrix{1 & 0 & 1 \\ 2 & 1 & 0 \\ 1 & 1 & 0}$ and $D = \pmatrix{-2 & 0 & 0 \\ \;\;0 & 1 & 0 \\ \;\;0 & 0 & 1}$ so that $P^{-1}A P = D$.
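Even with the repeated eigenvalue, $P$ is invertible here. As a sketch (assuming numpy), we can confirm the determinant of $P$ is non-zero and that $P^{-1}AP = D$:

```python
import numpy as np

A = np.array([[1, -3, 3], [0, -5, 6], [0, -3, 4]], dtype=float)
P = np.array([[1, 0, 1], [2, 1, 0], [1, 1, 0]], dtype=float)
D = np.diag([-2.0, 1.0, 1.0])

print(round(float(np.linalg.det(P)), 10))         # 1.0, so P is invertible
print(np.allclose(np.linalg.inv(P) @ A @ P, D))   # True
```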
End of Example 17
Summary of Diagonalisation
- To diagonalise an $n \times n$ matrix $A$:
- calculate the ($n$) eigenvalues of $A$.
- calculate an eigenvector corresponding to each eigenvalue (note that a repeated eigenvalue will need more than one eigenvector).
- define $P$ to be the $n \times n$ matrix that has the eigenvectors as its columns. If $P$ is invertible then the matrix $A$ is diagonalisable. Otherwise it is not diagonalisable.
- let $D$ be the diagonal matrix which has the eigenvalues of $A$ as its diagonal entries written in the same order as the corresponding eigenvectors appear in $P$.
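The steps above can be sketched in code. This is a minimal sketch, assuming numpy is available; `np.linalg.eig` returns the eigenvalues and matching unit eigenvectors, and a determinant test stands in for checking linear independence:

```python
import numpy as np

def diagonalise(A, tol=1e-10):
    """Return (P, D) with P^{-1} A P = D, or None if A is defective.

    Sketch only: eig's output order pairs each eigenvalue with its
    eigenvector column, so D = diag(eigvals) automatically matches P.
    """
    eigvals, P = np.linalg.eig(A)
    if abs(np.linalg.det(P)) < tol:
        return None                    # no full set of independent eigenvectors
    return P, np.diag(eigvals)

A = np.array([[-4.0, -6.0], [3.0, 5.0]])   # matrix from Example 14
P, D = diagonalise(A)
print(np.allclose(np.linalg.inv(P) @ A @ P, D))   # True
```

Note that `eig` normalises eigenvectors to unit length, so its $P$ differs from the hand-built one by column scalings; the relation $P^{-1}AP = D$ holds either way.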
-
6. Solving linear systems of ODEs by diagonalisation
In this section we look at how matrix methods can be applied to solve linear, homogeneous, constant coefficient systems of ordinary differential equations (ODEs). We shall consider the case where we are able to find a full set of linearly independent eigenvectors so that the coefficient matrix is guaranteed to be diagonalisable. The process of diagonalisation enables us to convert the original coupled system to a diagonal system that can easily be solved using methods we already know.
6.1. First order linear systems
The method is introduced by considering a system of two ODEs but can be extended to any number of equations.
Suppose $x_1(t)$ and $x_2(t)$ are unknown functions of $t$ and the rate of change of each function with respect to $t$ is a linear combination of both functions. That is
${\Large{\frac{dx_1}{dt}}}= a_{11}x_1 + a_{12}x_2$
${\Large{\frac{dx_2}{dt}}}= a_{21}x_1 + a_{22}x_2$
where $a_{11}$, $a_{12}$, $a_{21}$ and $a_{22}$ are known constants. The functions $x_1(t)$ and $x_2(t)$ cannot be found directly from these two differential equations since each equation contains the two dependent variables $x_1$ and $x_2$. These differential equations are said to be coupled. The process of diagonalisation will uncouple the differential equations to a system of equations which involves a single dependent variable in terms of $t$.
Our system of differential equations can be written in matrix form as $\dot{\pmb{x}} = A\pmb{x}$
where
$\pmb{x} = \pmatrix{x_1 \\ x_2}$, $\dot{\pmb{x}}= \pmatrix{\dot{x}_1 \\ \dot{x}_2}$and$A = \pmatrix{a_{11} & a_{12} \\ a_{21} & a_{22}}$.
Assume that the matrix $A$ can be diagonalised so that we can write $P^{-1}AP = D$, or equivalently $A = PDP^{-1}$. The matrix $P$ has the eigenvectors of $A$ as its columns and the diagonal matrix $D$ consists of the eigenvalues of $A$ along the main diagonal in the same order the corresponding eigenvectors appear in $P$.
Now introduce the change of variables, $\pmb{u} = P^{-1} \pmb{x}$ so that $\pmb{x} = P\pmb{u}$ where $\pmb{u} = \pmatrix{u_1(t) \\ u_2(t)}$.
The entries in the matrix $P$ are just constants and so differentiating both sides of $\pmb{x} = P\pmb{u}$ with respect to $t$ gives
$\dot{\pmb{x}} = {\Large{\frac{d}{dt}}}(P\pmb{u}) = P {\Large{\frac{d}{dt}}}(\pmb{u}) = P \dot{\pmb{u}}$.
We can now write our original system of ODEs, $\dot{\pmb{x}} = A\pmb{x}$, in terms of $\pmb{u}$ to produce
$P \dot{\pmb{u}} = PDP^{-1}P\pmb{u}$.
or
$P\dot{\pmb{u}} = PD\pmb{u}$. (since $P^{-1}P = I$)
Pre-multiply both sides of this equation by the inverse of $P$ giving
$P^{-1}P\dot{\pmb{u}} = P^{-1}PD\pmb{u}$
which simplifies to give the system, $\dot{\pmb{u}} = D\pmb{u}$.
This system can be written out fully as $\pmatrix{\dot{u}_1 \\ \dot{u}_2} = \pmatrix{\lambda_1 & 0 \\ 0 & \lambda_2}\pmatrix{u_1 \\ u_2}$ and expanded to give
${\Large{\frac{du_1}{dt}}} = \lambda_1u_1$
${\Large{\frac{du_2}{dt}}} = \lambda_2u_2$
The process of diagonalisation has decoupled the system of differential equations, i.e. each differential equation now only contains one dependent variable and hence can be solved. Once the solutions for $\pmb{u}$ , i.e. $u_1$ and $u_2$, are obtained the solution of the original system is calculated from, $\pmb{x} = P\pmb{u}$.
Note: At no point in the process do we need to calculate the inverse of matrix $\pmb{P}$.
Following the above steps gives the general solution of the system which will contain arbitrary constants. If initial conditions are specified then these constants are evaluated in the same way as for a single ordinary differential equation.
This method is readily extended to systems with three, or more, coupled ODEs. It can also be applied to systems of second order ODEs.
Example 18
Determine the general solution of the coupled system of ordinary differential equations,
$\dot{x}_1 = -5x_1 + 2x_2 + x_3$
$\dot{x}_2 = -8x_1 + 3x_2 + 2x_3$
$\dot{x}_3 = -16x_1 + 2x_2 + 6x_3$.
Solution
Step 1: Write the system as a matrix equation.
Let $\dot{\pmb{x}} = \pmatrix{\dot{x}_1 \\ \dot{x}_2 \\ \dot{x}_3}, \; A = \pmatrix{-5 & 2 & 1 \\ -8 & 3 & 2 \\ -16 & 2 & 6}$ and $\pmb{x} = \pmatrix{x_1 \\ x_2 \\ x_3}$ so that the system may be written as the matrix equation, $\dot{\pmb{x}} = A \pmb{x}$, i.e. $\pmatrix{\dot{x}_1 \\ \dot{x}_2 \\ \dot{x}_3} = \pmatrix{-5 & 2 & 1 \\ -8 & 3 & 2 \\ -16 & 2 & 6}\pmatrix{x_1 \\ x_2 \\ x_3}$.
Step 2: Calculate the eigenvalues and eigenvectors of the coefficient matrix $A$.
The eigenvalues and eigenvectors of the matrix $A$ were previously found in Example 15 to be, $\lambda_1 = 2$, $\pmb{x}_1 = \pmatrix{1 \\ 2 \\ 3}$; $\lambda_2 = 3$, $\pmb{x}_2 = \pmatrix{1 \\ 2 \\ 4}$; $\lambda_3 = -1$, $\pmb{x}_3 = \pmatrix{1 \\ 1 \\ 2}$.
Step 3: Diagonalise the matrix $A$.
As the eigenvalues of $A$ are real and distinct we know that $A$ is diagonalisable with $P = \pmatrix{1 & 1 & 1 \\ 2 & 2 & 1 \\ 3 & 4 & 2}$ and $P^{-1}AP = \pmatrix{2 & 0 & \;\;\;0 \\ 0 & 3 & \;\;\;0 \\ 0 & 0 & -1} = D$.
Step 4: Determine the general solution of the system.
We saw above that by letting $\pmb{x} = P \pmb{u}$ the coupled system $( \dot{\pmb{x}} = A\pmb{x} )$ can be expressed in uncoupled form as $\dot{\pmb{u}} = D\pmb{u}$ which in this case gives
$\pmatrix{\dot{u}_1 \\ \dot{u}_2 \\ \dot{u}_3} = \pmatrix{2 & 0 & \;\;0 \\ 0 & 3 & \;\;0 \\ 0 & 0 & -1}\pmatrix{u_1 \\ u_2 \\ u_ 3}$.
Expanding this expression gives the three uncoupled differential equations
${\Large{\frac{du_1}{dt}}} = 2u_1$, ${\Large{\frac{du_2}{dt}}} = 3u_2$ and ${\Large{\frac{du_3}{dt}}} = -u_3$
The solutions of these separable first order differential equations (refer to notes on ODEs) are
$u_1 = C_1e^{2t}$, $u_2 = C_2e^{3t}$ and $u_3 = C_3e^{-t}$
where $C_1$, $C_2$ and $C_3$ are arbitrary constants.
The solution, in terms of the original variables, $x_1(t)$, $x_2(t)$ and $x_3(t)$ can now be found.
Since $\pmb{x} = P\pmb{u} $ we have, $\pmatrix{x_1 \\ x_2 \\ x_3} = \pmatrix{1 & 1 & 1 \\ 2 & 2 & 1 \\ 3 & 4 & 2}\pmatrix{u_1 \\ u_2 \\ u_3}$.
Multiplying out the above gives
$x_1(t) = u_1 + u_2 + u_3 = C_1e^{2t} + C_2e^{3t} + C_3e^{-t}$
$x_2(t) = 2u_1 + 2u_2 + u_3 = 2C_1e^{2t} + 2C_2e^{3t} + C_3e^{-t}$
$x_3(t) = 3u_1 + 4u_2 + 2u_3 = 3C_1e^{2t} + 4C_2e^{3t} + 2C_3e^{-t}$.
A neater way of writing the general solution (GS) is
$\class{outbox}{\pmatrix{x_1 \\ x_2 \\ x_3} = C_1\pmatrix{1 \\ 2 \\ 3}e^{2t} + C_2 \pmatrix{1 \\ 2 \\ 4}e^{3t} + C_3\pmatrix{1 \\ 1 \\ 2}e^{-t}}$.
Can you see how the general solution is connected to the eigenvalues and eigenvectors?
The eigenvalues and eigenvectors appear in the GS as follows
$\pmb{x}(t) = C_1 \pmb{x}_1e^{\lambda_1t} + C_2\pmb{x}_2e^{\lambda_2t} + C_3\pmb{x}_3e^{\lambda_3t}$.
Note: If initial conditions were specified we could determine the unknown constants $C_1$, $C_2$ and $C_3$ to obtain a particular solution.
Extending the above, the general solution of a linear system $\dot{\pmb{x}} = A\pmb{x}$, where the $n \times n$ coefficient matrix $A$ is diagonalisable, is given by
$\class{outbox}{\pmb{x}(t) = C_1 \pmb{x}_1e^{\lambda_1t} + C_2\pmb{x}_2e^{\lambda_2t} + C_3\pmb{x}_3e^{\lambda_3t} + \dots + C_n \pmb{x}_ne^{\lambda_nt}}$
where $\pmb{x}_i$ is the eigenvector associated with the real eigenvalue $\lambda_i$.
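The general solution satisfies $\dot{\pmb{x}} = A\pmb{x}$ term by term, because $A\pmb{x}_i = \lambda_i \pmb{x}_i$. As a sketch (assuming numpy), we can confirm this numerically for the matrix of Example 18, with arbitrary values chosen for the constants:

```python
import numpy as np

A = np.array([[-5, 2, 1], [-8, 3, 2], [-16, 2, 6]], dtype=float)
vecs = [np.array([1.0, 2.0, 3.0]), np.array([1.0, 2.0, 4.0]), np.array([1.0, 1.0, 2.0])]
lams = [2.0, 3.0, -1.0]
C = [1.0, -0.5, 2.0]                 # arbitrary constants for the check

t = 0.7
x    = sum(c * v * np.exp(l * t)     for c, v, l in zip(C, vecs, lams))
xdot = sum(c * l * v * np.exp(l * t) for c, v, l in zip(C, vecs, lams))

# every term satisfies x' = A x because A v_i = lambda_i v_i
print(np.allclose(xdot, A @ x))   # True
```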
End of Example 18
Example 19
A system of two coupled ordinary differential equations is given by
$\dot{x}_1 = -2x_1 + 5x_2$
$\dot{x}_2 = \;\;x_1 + 2x_2$.
Determine the particular solutions for $x_1(t)$ and $x_2(t)$ that satisfy the initial conditions, $x_1(0) = -1$, $x_2(0) = 5$.
Solution
Step 1: Write the system in matrix form, $\dot{\pmb{x}} = A\pmb{x}$, i.e.
$\pmatrix{\dot{x}_1 \\ \dot{x}_2} = \pmatrix{-2 & 5 \\ \;\;\;1 & 2}\pmatrix{x_1 \\ x_2}$
Step 2: Determine the eigenvalues by solving,
$\det(A - \lambda I) = \Bigg | \matrix{-2-\lambda & 5 \\ 1 & 2-\lambda} \Bigg| = 0$
$\Rightarrow (-2 -\lambda)(2 - \lambda) - 5 = 0$
$\Rightarrow \lambda^2 - 9 = 0$
$\Rightarrow (\lambda - 3)(\lambda + 3) = 0$
$\Rightarrow \lambda_1 = 3$, $\lambda_2 = -3$.
As the eigenvalues of the matrix $A$ are real and distinct, $A$ is diagonalisable.
Step 3: Calculate the eigenvectors corresponding to each of the eigenvalues.
For $\lambda_1 = 3$ we have $A \pmb{x}_1 = \lambda_1 \pmb{x}_1$ and so we need to solve,
$\pmatrix{-2 & 5 \\ \;\;\;1 & 2}\pmatrix{x_1 \\ x_2} = 3.\pmatrix{x_1 \\ x_2}$
$\Rightarrow \Bigg \{ \array{-2x_1 + 5x_2 = 3x_1 \\ \;\;\;\;x_1 + 2x_2 = 3x_2}$
$\Rightarrow \Bigg \{ \array{-5x_1 + 5x_2 = 0 \\ x_1 - x_2 = 0}$
Both these equations give $x_1 = x_2$ and if we let $x_2 = 1$ then $x_1 = 1$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_1 = 3$ is $\pmb{x}_1 = \pmatrix{1 \\ 1}$.
For $\lambda_2 = -3$ we have $A \pmb{x}_2 = \lambda_2 \pmb{x}_2$ and so we need to solve,
$\pmatrix{-2 & 5 \\ \;\;\;1 & 2}\pmatrix{x_1 \\ x_2} = -3.\pmatrix{x_1 \\ x_2}$
$\Rightarrow \Bigg \{ \array{-2x_1 + 5x_2 = -3x_1 \\ \;\;\;\;x_1 + 2x_2 = -3x_2}$
$\Rightarrow \Bigg \{ \array{x_1 + 5x_2 = 0 \\ x_1 + 5x_2 = 0}$
Both these equations give $x_1 = -5x_2$ and if we let $x_2 = 1$ then $x_1 = -5$.
Hence, an eigenvector corresponding to the eigenvalue $\lambda_2 = -3$ is $\pmb{x}_2 = \pmatrix{-5 \\ \;\;\;1}$.
The eigenvalues and eigenvectors of $A$ are therefore,
$\lambda_1 = 3$, $\pmb{x}_1 = \pmatrix{1 \\ 1}$; $\lambda_2 = -3$, $\pmb{x}_2 = \pmatrix{-5 \\ \;\;\;1}$.
Step 4: Diagonalise the matrix $A$.
Using the above results we can define the invertible matrix,
$P = \pmatrix{1 & -5 \\ 1 & \;\;\;1}$.
Now form the diagonal matrix $D$ by writing the eigenvalues on the main diagonal of $D$ in the same order the corresponding eigenvectors appear in $P$, i.e.
$P^{-1}AP = D = \pmatrix{3 & \;\;\;0 \\ 0 & -3}$.
Step 5: Determine the general solution of the system.
Let $\pmb{x} = P\pmb{u}$. Differentiating both sides with respect to $t$, gives, $\dot{\pmb{x}} = P \dot{\pmb{u}}$ (since $P$ is a constant matrix). The coupled system $\dot{\pmb{x}} = A \pmb{x}$ can then be written as
$ P\dot{\pmb{u}} = AP\pmb{u} \;\;\;\;\; \Rightarrow \dot{\pmb{u}} = P^{-1}AP \pmb{u} \;\;\;\;\; \Rightarrow \dot{\pmb{u}} = D \pmb{u}$.
Hence,
$\pmatrix{\dot{u}_1 \\ \dot{u}_2} = \pmatrix{3 & \;\;\;0 \\ 0 & -3}\pmatrix{u_1 \\ u_2}$.
Expanding this expression gives the two uncoupled first order separable differential equations
${\Large{\frac{du_1}{dt}}} = 3u_1$ and ${\Large{\frac{du_2}{dt}}} = -3u_2$
The solutions of these differential equations (refer to notes on ODEs) are
$u_1 = C_1e^{3t}$ and $u_2 = C_2e^{-3t}$
where $C_1$ and $C_2$ are arbitrary constants
The solution, in terms of the original variables, $x_1(t)$ and $x_2(t)$ can now be found.
Since $\pmb{x} = P\pmb{u}$ we have, $\pmatrix{x_1 \\ x_2} = \pmatrix{1 & -5 \\ 1 & \;\;\;1}\pmatrix{u_1 \\ u_2}$.
Expanding this expression, gives
$x_1 = u_1 - 5u_2$
$x_2 = u_1 + u_2$.
Now use $u_1 = C_1e^{3t}$ and $u_2 = C_2e^{-3t}$ to write down the general solution
$\class{outbox}{\array{x_1(t) = C_1e^{3t} - 5C_2e^{-3t} \\ \\ x_2(t) = C_1e^{3t} + C_2e^{-3t}}}$.
Step 6: Determine the particular solution that satisfies $x_1(0) = -1$, $x_2(0) = 5$. Substituting these values into the general solution gives,
$x_1(0) = -1 \Rightarrow C_1 - 5C_2 = -1$
$x_2(0) = 5 \Rightarrow C_1 + C_2 = 5$.
Solve as simultaneous equations to obtain, $C_1 = 4$ and $C_2 = 1$.
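As a quick cross-check (a sketch assuming numpy), the simultaneous equations for $C_1$ and $C_2$ can be solved numerically:

```python
import numpy as np

# x1(0) = -1 and x2(0) = 5 give  C1 - 5*C2 = -1  and  C1 + C2 = 5
M = np.array([[1.0, -5.0], [1.0, 1.0]])
b = np.array([-1.0, 5.0])
C1, C2 = np.linalg.solve(M, b)
print(np.allclose(C1, 4.0), np.allclose(C2, 1.0))   # True True
```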
The particular solution (PS) is therefore,
$\class{outbox}{\array{x_1(t) = 4e^{3t} - 5e^{-3t} \\ \\ x_2(t) = 4e^{3t} + e^{-3t}}}$.
Step 7 (OPTIONAL): Check the answers for $x_1(t)$ and $x_2(t)$.
The PS gave that $x_1(t) = 4e^{3t} - 5e^{-3t}$ and so differentiating gives,
$\dot{x}_1 = 12e^{3t} + 15e^{-3t}$. (1)
The original coupled system gave that
$\dot{x}_1 = -2x_1 + 5x_2$.
Substituting the expressions for $x_1(t)$ and $x_2(t)$ from the PS we have,
$\dot{x}_1 = -2(4e^{3t} - 5e^{-3t}) + 5(4e^{3t} + e^{-3t})$
$\Rightarrow \dot{x}_1 = 12e^{3t} + 15e^{-3t}$. (2)
Expressions ( 1 ) and ( 2 ) are identical thereby verifying our answer for $x_1(t)$.
Now verify the answer for $x_2(t)$.
The PS gave that $x_2(t) = 4e^{3t} + e^{-3t}$ and so differentiating gives,
$\dot{x}_2 = 12e^{3t} - 3e^{-3t}$. (3)
The coupled system gave that
$\dot{x}_2 = x_1 + 2x_2$.
Substituting the expressions for $x_1(t)$ and $x_2(t)$ from the PS we have,
$\dot{x}_2 = (4e^{3t} - 5e^{-3t}) + 2(4e^{3t} + e^{-3t})$
$\Rightarrow \dot{x}_2 = 12e^{3t} - 3e^{-3t}$. (4)
Expressions ( 3 ) and ( 4 ) are identical thereby verifying our answer for $x_2(t)$.
End of Example 19
Example 20
Determine the general solution of the coupled system
$\dot{x}_1 = x_1 - 3x_2 + 3x_3$
$\dot{x}_2 = \;\;\;\;\,-5x_2 + 6x_3$
$\dot{x}_3 = \;\;\;\;\,-3x_2 + 4x_3$.
Solution
In matrix form the system is given by $\dot{\pmb{x}} = A\pmb{x}$, i.e.
$\pmatrix{\dot{x}_1 \\ \dot{x}_2 \\ \dot{x}_3} = \pmatrix{1 & -3 & 3 \\ 0 & -5 & 6 \\ 0 & -3 & 4}\pmatrix{x_1 \\ x_2 \\ x_3}$.
In Example 17 we found that the matrix has the following eigenvalues and eigenvectors where the eigenvalue $\lambda = 1$ is repeated
$\lambda_1 = -2$, $\pmb{x}_1 = \pmatrix{1 \\ 2 \\ 1}$; $\lambda_2 = 1$, $\pmb{x}_2 = \pmatrix{0 \\ 1 \\ 1}$; $\lambda_3 = 1$, $\pmb{x}_3 = \pmatrix{1 \\ 0 \\ 0}$.
As we were able to find three linearly independent eigenvectors the matrix, $A$, is diagonalisable and the general solution of the system is given by
$\pmb{x}(t) = C_1\pmb{x}_1e^{\lambda_1t} + C_2\pmb{x}_2e^{\lambda_2t} + C_3\pmb{x}_3e^{\lambda_3t}$
Substituting the eigenvalues and eigenvectors gives
$\class{outbox}{\pmb{x}(t) = C_1\pmatrix{1 \\ 2 \\ 1}e^{-2t} + C_2 \pmatrix{0 \\ 1 \\ 1}e^{t} + C_3 \pmatrix{1 \\ 0 \\ 0}e^t}$
which expands to
$\class{outbox}{\array{x_1(t) = C_1e^{-2t} + C_3e^t \\ \\ x_2(t) = 2C_1e^{-2t} + C_2e^t \\ \\ x_3(t) = C_1e^{-2t} + C_2e^t.}}$
End of Example 20
6.2. Second order linear systems
The method is introduced by considering a system of two ODEs but can be extended to any number of equations.
Example 21
Determine the general solution of the coupled system
$\ddot{x}_1 = -2x_1 + x_2$
$\ddot{x}_2 = x_1 - 2x_2$.
Solution
Step 1: Write the system in matrix form, $\ddot{\pmb{x}} = A\pmb{x}$, i.e.
$\pmatrix{\ddot{x}_1 \\ \ddot{x}_2} = \pmatrix{-2 & \;\;\;1 \\ \;\;\;1 & -2}\pmatrix{x_1 \\ x_2}$.
Step 2: The eigenvalues and eigenvectors of $A$ are found to be,
$\lambda_1 = -3$, $\pmb{x}_1 = \pmatrix{\;\;\;1 \\ -1}$; $\lambda_2 = -1$, $\pmb{x}_2 = \pmatrix{1 \\ 1}$.
Since $A$ has distinct eigenvalues it can be diagonalised.
Step 3: Diagonalise the matrix $A$.
Using the above results, $P = \pmatrix{\;\;\;1 & 1 \\ -1 & 1}$ and $P^{-1}AP = \pmatrix{-3 & \;\;\;0 \\ \;\;\; 0 & -1} = D$.
Step 4: Determine the general solution of the system.
Let $\pmb{x} = P\pmb{u}$. Differentiating twice gives, $\ddot{\pmb{x}} = P\ddot{\pmb{u}}$ (since $P$ is a constant matrix). The coupled system $\ddot{\pmb{x}} = A\pmb{x}$ can then be written as
$P\ddot{\pmb{u}} = AP\pmb{u} \;\;\;\;\;\Rightarrow \ddot{\pmb{u}} = P^{-1}AP\pmb{u} \;\;\;\;\;\Rightarrow \ddot{\pmb{u}} = D \pmb{u}$.
Hence,
$\pmatrix{\ddot{u}_1 \\ \ddot{u}_2} = \pmatrix{-3 & \;\;\; 0 \\ \;\;\; 0 & -1}\pmatrix{u_1 \\ u_2}$.
Expanding this expression gives the two uncoupled second order differential equations
${\Large{\frac{d^2u_1}{dt^2}}} = -3u_1$ and ${\Large{\frac{d^2u_2}{dt^2}}} = -u_2$.
The solutions of the differential equations
${\Large{\frac{d^2u_1}{dt^2}}} + 3u_1 = 0$ and ${\Large{\frac{d^2u_2}{dt^2}}} + u_2 = 0$
are
$u_1(t) = A_1 \sin(\sqrt{3}t) + B_1\cos(\sqrt{3}t)$ and
$u_2(t) = A_2\sin(t) + B_2\cos(t)$
where $A_1$, $B_1$, $A_2$ and $B_2$ are arbitrary constants.
Note: In each case the roots of the auxiliary equation are complex with real part zero so solutions are in terms of the trig functions - refer to notes on ODEs.
The solution in terms of the original variables, $x_1(t)$ and $x_2(t)$ can now be found.
Since $\pmb{x} = P\pmb{u}$ we have, $\pmatrix{x_1 \\ x_2} = \pmatrix{\;\;\; 1 & 1 \\ -1 & 1}\pmatrix{u_1 \\ u_2}$.
Expanding this matrix equation gives
$x_1 =\;\;\, u_1 + u_2$
$x_2 = -u_1 + u_2$.
From earlier, $u_1 = A_1 \sin(\sqrt{3}t) + B_1\cos(\sqrt{3}t)$ and $u_2 = A_2\sin(t) + B_2\cos(t)$. Hence, on substitution, the general solution is
$\class{outbox}{\array{x_1(t) = A_1\sin(\sqrt{3}t) + B_1\cos(\sqrt{3}t) + A_2\sin(t) + B_2\cos(t)\;\;\;\;\;\; \\ \\ x_2(t) = -A_1\sin(\sqrt{3}t) - B_1\cos(\sqrt{3}t) + A_2\sin(t) + B_2\cos(t).}}$
A neater way of writing out the general solution is
$\class{outbox}{\pmatrix{x_1 \\ x_2} = \big[A_1\sin(\sqrt{3}t) + B_1\cos(\sqrt{3}t)\big]\pmatrix{\;\;\; 1 \\ -1} + \big [A_2\sin(t) + B_2\cos(t)\big]\pmatrix{1 \\ 1}}$.
Can you see how the general solution is connected to the eigenvalues and eigenvectors?
The general solution of a linear system $\ddot{\pmb{x}} = A \pmb{x}$ where the $n \times n$ matrix $A$ has $n$ distinct negative eigenvalues $-\omega_1^2$, $-\omega_2^2$, $-\omega_3^2 , \dots , -\omega_n^2$, with associated real eigenvectors $\pmb{x}_1, \pmb{x}_2, \pmb{x}_3, \dots , \pmb{x}_n$, is given by
$$\class{outbox}{\pmb{x}(t) = \sum_{i=1}^{n}\big[A_i\sin(\omega_it) + B_i\cos(\omega_it)\big]\pmb{x_i}}$$
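As with the first order case, each mode satisfies the system because $A\pmb{x}_i = -\omega_i^2\pmb{x}_i$. A numerical sketch (assuming numpy) for the matrix of Example 21, with arbitrary constants:

```python
import numpy as np

A = np.array([[-2.0, 1.0], [1.0, -2.0]])          # matrix from Example 21
modes  = [(np.sqrt(3.0), np.array([1.0, -1.0])),  # eigenvalue -3 = -omega^2
          (1.0,          np.array([1.0,  1.0]))]  # eigenvalue -1 = -omega^2
coeffs = [(0.3, -1.2), (2.0, 0.5)]                # arbitrary A_i, B_i

t = 0.9
x   = sum((a * np.sin(w * t) + b * np.cos(w * t)) * v
          for (w, v), (a, b) in zip(modes, coeffs))
xdd = sum(-w * w * (a * np.sin(w * t) + b * np.cos(w * t)) * v
          for (w, v), (a, b) in zip(modes, coeffs))

# each mode satisfies x'' = -omega^2 x and A v = -omega^2 v
print(np.allclose(xdd, A @ x))   # True
```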
End of Example 21
Example 22
Determine the particular solution of the coupled system
$\ddot{x}_1 = -2x_1 + x_2$
$\ddot{x}_2 = x_1 - 2x_2$
subject to the initial conditions $x_1(0) = 0$, $x_2(0) = 0$, $\dot{x}_1(0) = 1$, $\dot{x}_2(0) = 3$.
Solution
The general solution of the system was obtained in Example 21 to be
$x_1(t) = A_1\sin(\sqrt{3}t) + B_1\cos(\sqrt{3}t) + A_2\sin(t) + B_2\cos(t)$
$x_2(t) = -A_1\sin(\sqrt{3}t) - B_1\cos(\sqrt{3}t) + A_2\sin(t) + B_2\cos(t)$.
First apply the conditions, $x_1(0) = 0, x_2(0) = 0$:
$x_1(0) = 0 : A_1\sin(0) + B_1\cos(0) + A_2\sin(0) + B_2\cos(0) = 0 \Rightarrow B_1 + B_2 = 0$
$x_2(0) = 0 : -A_1\sin(0) - B_1\cos(0) + A_2\sin(0) + B_2\cos(0) = 0 \Rightarrow -B_1 + B_2 = 0$.
Solving these simultaneous equations gives $B_1 = B_2 = 0$.
Hence, with $B_1 = B_2 = 0$ the general solution becomes,
$x_1(t) = A_1\sin(\sqrt{3}t) + A_2\sin(t)$
$x_2(t) = -A_1\sin(\sqrt{3}t) + A_2\sin(t)$.
To apply the remaining conditions $\dot{x}_1(0) = 1$ and $\dot{x}_2(0) = 3$ we must first differentiate $x_1(t)$ and $x_2(t)$:
$\dot{x}_1(t) = \sqrt{3}A_1\cos(\sqrt{3}t) + A_2\cos(t) \Rightarrow \sqrt{3}A_1 + A_2 = 1$
$\dot{x}_2(t) = - \sqrt{3}A_1\cos(\sqrt{3}t) + A_2\cos(t) \Rightarrow -\sqrt{3}A_1 + A_2 = 3$.
Solving these simultaneous equations gives, $A_1 = -{\Large{\frac{1}{\sqrt{3}}}}$ and $A_2 = 2$.
We now have values $A_1 = -{\Large{\frac{1}{\sqrt{3}}}}$, $A_2 = 2$, $B_1 = 0$ and $B_2 = 0$ which are substituted in the general solution to obtain the particular solution:
$\class{outbox}{\array{x_1(t) = -{\Large{\frac{1}{\sqrt{3}}}}\sin(\sqrt{3}t) + 2\sin(t) \\ \\ x_2(t) = {\Large{\frac{1}{\sqrt{3}}}}\sin(\sqrt{3}t) + 2\sin(t).}}$
A neater way of writing out the particular solution is,
$\class{outbox}{\pmatrix{x_1 \\ x_2} = -{\Large{\frac{1}{\sqrt{3}}}}\sin(\sqrt{3}t)\pmatrix{\;\;\;1 \\ -1} + 2\sin(t)\pmatrix{1 \\ 1}.}$
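As a quick cross-check (a sketch assuming numpy), the simultaneous equations $\sqrt{3}A_1 + A_2 = 1$ and $-\sqrt{3}A_1 + A_2 = 3$ can be solved numerically:

```python
import numpy as np

# sqrt(3)*A1 + A2 = 1  and  -sqrt(3)*A1 + A2 = 3
M = np.array([[np.sqrt(3.0), 1.0], [-np.sqrt(3.0), 1.0]])
b = np.array([1.0, 3.0])
A1, A2 = np.linalg.solve(M, b)
print(np.allclose(A1, -1 / np.sqrt(3.0)), np.allclose(A2, 2.0))   # True True
```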
End of Example 22
Example 23
A system of two coupled linear ordinary differential equations is given by,
$\ddot{x}_1 = -7x_1 + 12x_2$
$\ddot{x}_2 = -4x_1 + 7x_2.$
(i). express the system in matrix form, $\ddot{\pmb{x}} = A \pmb{x}$.
(ii). determine the eigenvalues and eigenvectors of the coefficient matrix, $A$.
(iii). write down an invertible matrix $P$ such that $P^{-1}AP = D$ where $D$ is a diagonal matrix. Write down the matrix $D$.
(iv). determine the general solution of the system.
Solution
(i). In matrix form the system is,
$\pmatrix{\ddot{x}_1 \\ \ddot{x}_2} = \pmatrix{-7 & 12 \\ -4 & \;\;7}\pmatrix{x_1 \\ x_2}$.
(ii). Exercise: Show that the matrix $A$ has the following eigenvalues and eigenvectors,
$\lambda_1 = -1, \;\;\;\;\;\pmb{x}_1 = \pmatrix{2 \\ 1}; \;\;\;\;\;\;\;\;\lambda_2 = 1, \;\;\;\;\;\;\pmb{x}_2 = \pmatrix{3 \\ 2}$.
(iii). $\;\;P = \pmatrix{2 & 3 \\ 1 & 2}, \;\;\;\;\;\;\;\;\; P^{-1}AP = D = \pmatrix{-1 & 0 \\ \;\;\;0 & 1}$.
(iv). Determine the general solution of the system.
Let $\pmb{x} = P\pmb{u}$. Differentiating twice gives, $\ddot{\pmb{x}} = P\ddot{\pmb{u}}$ (since $P$ is a constant matrix).
The coupled system $\ddot{\pmb{x}} = A\pmb{x}$ can then be written as $P \ddot{\pmb{u}} = AP\pmb{u} \;\;\;\;\;\; \Rightarrow \ddot{\pmb{u}} = P^{-1}AP \pmb{u} \;\;\;\;\;\; \Rightarrow \ddot{\pmb{u}} = D \pmb{u}$.
Hence,
$\pmatrix{\ddot{u}_1 \\ \ddot{u}_2} = \pmatrix{-1 & 0 \\ \;\;\; 0 & 1}\pmatrix{u_1 \\ u_2}$
Expanding this expression gives the two uncoupled second order differential equations,
$\ddot{u}_1 + u_1 = 0$ and $\ddot{u}_2 - u_2 = 0$.
Solving these equations gives,
$u_1(t) = A_1\sin(t) + B_1\cos(t)$ and
$u_2(t) = A_2e^t + B_2e^{-t}$
where $A_1$, $B_1$, $A_2$ and $B_2$ are arbitrary constants.
The solution in terms of the original variables, $x_1(t)$ and $x_2(t)$ can now be found.
Since $\pmb{x} = P \pmb{u}$ we have, $\pmatrix{x_1 \\ x_2} = \pmatrix{2 & 3 \\ 1 & 2}\pmatrix{u_1 \\ u_2}$.
Expanding this matrix equation gives
$x_1 = 2u_1 + 3u_2$
$x_2 = \;\;u_1 + 2u_2$.
From earlier, $u_1(t) = A_1\sin(t) + B_1\cos(t)$ and $u_2(t) = A_2e^t + B_2e^{-t}$. Hence, on substitution, the general solution is
$\class{outbox}{\array{x_1(t) = 2A_1\sin(t) + 2B_1\cos(t) + 3A_2e^t + 3B_2e^{-t} \\ \\ x_2(t) = A_1\sin(t) + B_1\cos(t) + 2A_2e^t + 2B_2e^{-t}.}}$
End of Example 23
In this section we first considered linear systems of differential equations where the coefficient matrix has distinct eigenvalues and is therefore diagonalisable. The method was then extended to the special case where the eigenvalues are repeated but we were still able to find a set of linearly independent eigenvectors and diagonalise the coefficient matrix. The diagonalisation process uncouples the differential equations so that a solution to the system can be obtained in a relatively straightforward manner. However, as we saw in Section 5.3, a matrix with repeated eigenvalues may or may not be diagonalisable and we might not be able to find a set of linearly independent eigenvectors. When this situation arises we are unable to apply the methods described above to solve the system of ODEs and we need to use generalised eigenvectors. That topic is not considered here.
-
Summary
- On completion of this unit you should be able to:
- calculate eigenvalues and eigenvectors for $2 \times 2$ and $3 \times 3$ matrices.
- where appropriate diagonalise $2 \times 2$ and $3 \times 3$ matrices.
- solve linear, homogeneous, constant coefficient systems of first and second order ordinary differential equations by diagonalisation.