Math 511: Linear Algebra
7.1 Eigenvalues and Eigenvectors
7.1.1 A Special Product¶
$$ \require{color} \definecolor{brightblue}{rgb}{.267, .298, .812} \definecolor{darkblue}{rgb}{0.0, 0.0, 1.0} \definecolor{palepink}{rgb}{1, .73, .8} \definecolor{softmagenta}{rgb}{.99,.34,.86} \definecolor{blueviolet}{rgb}{.537,.192,.937} \definecolor{jonquil}{rgb}{.949,.792,.098} \definecolor{shockingpink}{rgb}{1, 0, .741} \definecolor{royalblue}{rgb}{0, .341, .914} \definecolor{alien}{rgb}{.529,.914,.067} \definecolor{crimson}{rgb}{1, .094, .271} \def\ihat{\mathbf{\hat{\unicode{x0131}}}} \def\jhat{\mathbf{\hat{\unicode{x0237}}}} \def\khat{\mathrm{\hat{k}}} \def\tombstone{\unicode{x220E}} \def\contradiction{\unicode{x2A33}} $$
Consider the matrix
$$ A = \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix} $$
and the vector
$$ \mathbf{x} = \left[\begin{array}{r} -1 \\ 1 \end{array}\right] $$
If we take the product $A\mathbf{x}$, something unusual happens
$$ A\mathbf{x} = \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix}\left[\begin{array}{r} -1 \\ 1 \end{array}\right] = \left[\begin{array}{r} -2 \\ 2 \end{array}\right] = 2\left[\begin{array}{r} -1 \\ 1 \end{array}\right] = 2\mathbf{x}$$
Multiplying this particular vector $\mathbf{x}$ by $A$ results in scaling the vector $\mathbf{x}$ by $2$. This relationship is not typical and will be the focus of this section.
Exercise 1¶
Try to find another vector $\mathbf{y}$ and value $\lambda$ where $A\mathbf{y} = \lambda\mathbf{y}$. Take a guess.
Check Your Guess
The vector is $$ \mathbf{y} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} $$
and the scaling factor $\lambda = 3$.
We call the value(s) $\lambda$ that have this property the eigenvalues of matrix $A$, and the special vectors for this product the eigenvectors. Watch the following video for a conceptual explanation of eigenvalues and eigenvectors, then we will work through the computational details.
Everyone should begin their study of eigenvalues and eigenvectors with a video from Grant Sanderson, Eigenvectors and Eigenvalues
7.1.2 Eigenvalues and Eigenvectors¶
Definition of Eigenvalue and Eigenvector¶
Let $A\in\mathbb{R}^{n\times n}$. A scalar $\lambda$ is an eigenvalue (or characteristic value) of $A$ if there is a nonzero vector $\mathbf{x}$ such that
$$ A\mathbf{x} = \lambda\mathbf{x} $$
The vector $\mathbf{x}$ is called an eigenvector (or characteristic vector) of $A$ corresponding to the eigenvalue $\lambda$.
To find the eigenvalues and eigenvectors of a matrix (if they exist), we must first determine whether the matrix has any eigenvalues at all. That is, we need one or more values of $\lambda$ for which some nonzero vector $\mathbf{x}$ satisfies
$$ A\mathbf{x} = \lambda\mathbf{x} $$
The value $\lambda$ is a scalar, so it is more convenient to think of the right hand side of this equation as $\lambda I \mathbf{x}$, where $I$ is the identity matrix. This allows us to subtract $\lambda I \mathbf{x}$ from both sides and rewrite the equation in the form
$$ \left(A - \lambda I\right) \mathbf{x} = \mathbf{0} $$
We want this product to be the zero vector. The choice $\mathbf{x} = \mathbf{0}$ always satisfies the equation, but it is not interesting or helpful. Instead, we note that the matrix $A - \lambda I$ is square, so it maps a nonzero vector $\mathbf{x}$ to $\mathbf{0}$ (that is, it is singular) if and only if
$$ \det\left(A - \lambda I\right) = 0 $$
If we write out the determinant, we obtain an $n^\text{th}$ degree polynomial in $\lambda$, referred to as the characteristic polynomial of $A$:
$$ p(\lambda) = \det\left(A - \lambda I\right) $$
The roots of this polynomial are the eigenvalues of $A$. There will be $n$ total roots (including multiplicity), and these roots may be complex valued.
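As a quick numerical illustration (a minimal sketch, assuming NumPy is available), `np.poly` returns the coefficients of the monic characteristic polynomial of a square matrix, and `np.roots` recovers the eigenvalues as its roots. The monic convention, $\det(\lambda I - A)$, differs from $\det(A - \lambda I)$ only by a factor of $(-1)^n$ and has the same roots.

```python
import numpy as np

# The matrix from the opening example
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# Coefficients of the monic characteristic polynomial det(lambda*I - A):
# lambda^2 - 5*lambda + 6
coeffs = np.poly(A)
print(coeffs)            # [ 1. -5.  6.]

# The eigenvalues are the roots of this polynomial
print(np.roots(coeffs))  # [3. 2.]
```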
Theorem 7.1.1¶
Let $A\in\mathbb{R}^{n\times n}$ and let $\lambda$ be a scalar. The following statements are equivalent:
(a) $\lambda$ is an eigenvalue of $A$
(b) $\left(A - \lambda I\right)\mathbf{x} = \mathbf{0}$ has a nontrivial solution
(c) $ N\!\left(A - \lambda I\right)\neq \left\{\mathbf{0}\right\}$
(d) $A - \lambda I$ is singular
(e) $\det\left(A - \lambda I\right) = 0$
Example 1¶
Let's use our initial example to show how the eigenvalues and eigenvectors are determined. We begin with the matrix
$$ A = \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix} $$
from earlier and want to solve $\det\left(A - \lambda I\right) = 0$.
$$ \det\left(A - \lambda I\right) = \begin{vmatrix} 3 - \lambda & 1 \\ 0 & 2 - \lambda \end{vmatrix} = 0 $$
This matrix is triangular, so its determinant is the product of the diagonal elements. This gives the characteristic polynomial
$$ (3-\lambda)(2-\lambda) = 0 $$
whose roots are $2$ and $3$. The eigenvectors are determined by choosing a $\lambda$ and then solving
$$ \left(A-\lambda I\right)\mathbf{x} = \mathbf{0} $$
For $\lambda = 2$,¶
$$ \begin{align*} \begin{bmatrix} 3-2 & 1 \\ 0 & 2-2 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} &= \begin{bmatrix} 0 \\ 0 \end{bmatrix} \\ \\ \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} &= \begin{bmatrix} 0 \\ 0 \end{bmatrix} \end{align*} $$
We see that $x_2$ is a free variable and $x_1 = -x_2$, so any vector in the form
$$ \mathbf{x} = \alpha\left[\begin{array}{r} -1 \\ 1 \end{array}\right] $$
for $\alpha\in\mathbb{R}$ will be scaled by $2$ if multiplied by $A$.
For $\lambda = 3$,¶
$$ \begin{align*} \begin{bmatrix} 3-3 & 1 \\ 0 & 2-3 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} &= \begin{bmatrix} 0 \\ 0 \end{bmatrix} \\ \\ \begin{bmatrix} 0 & 1 \\ 0 & -1 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} &= \begin{bmatrix} 0 \\ 0 \end{bmatrix} \\ \\ \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} &= \begin{bmatrix} 0 \\ 0 \end{bmatrix} \end{align*} $$
Here $x_2$ is the pivot variable and must equal zero, while $x_1$ is free. That means vectors of the form
$$ \mathbf{y} = \beta\begin{bmatrix} 1 \\ 0 \end{bmatrix} $$
with $\beta\in\mathbb{R}$, $\beta\neq 0$, are eigenvectors corresponding to $\lambda = 3$. This can be verified by choosing $\beta = 1$ and seeing that
$$ A\mathbf{y} = \begin{bmatrix} 3 & 1 \\ 0 & 2 \end{bmatrix}\begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 3 \\ 0 \end{bmatrix} = 3 \begin{bmatrix} 1 \\ 0 \end{bmatrix} = 3\mathbf{y}$$
We have determined the eigenvalues and associated eigenvectors for the matrix $A$.
For each $\lambda$, the null space of $A-\lambda I$, $N\!\left(A - \lambda I\right)$, is called the eigenspace corresponding to $\lambda$. The sets $\left\{ \begin{bmatrix} -1\ \\ \ \ 1\ \end{bmatrix} \right\}$ and $\left\{ \begin{bmatrix}\ 1\ \\ \ 0\ \end{bmatrix} \right\}$ are bases for the eigenspaces corresponding to $\lambda = 2$ and $\lambda = 3$, respectively.
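We can confirm Example 1 numerically. The sketch below (assuming NumPy) uses `np.linalg.eig`, which returns the eigenvalues of $A$ together with a matrix whose columns are corresponding unit-length eigenvectors; each column is a scalar multiple of one of the basis vectors found above.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# values[i] is an eigenvalue and vectors[:, i] a matching unit eigenvector
values, vectors = np.linalg.eig(A)
print(values)                    # [3. 2.] (order not guaranteed)

# Verify A v = lambda v for each eigenpair
for lam, v in zip(values, vectors.T):
    print(lam, np.allclose(A @ v, lam * v))   # True for both
```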
7.1.3 Computing Eigenvalues and Eigenvectors¶
Finding the eigenvalues of a matrix requires computing $\left|A - \lambda I\right|$ and finding the values of $\lambda$ that make the determinant zero. As we know from previous sections, determinant computations are typically tedious and expensive for large matrices. Unfortunately, there is no way around this problem for eigenvalue computations. We need $\left|A - \lambda I\right|$, and there are no convenient identities or properties for $\det(A+B)$. In addition, matrices of the form $A - \lambda I$ do not lend themselves to row operations that would make the matrix triangular. Therefore, we will use the inefficient technique of cofactor expansion to find the determinants, the associated characteristic polynomials, and the eigenvalues.
Example 2¶
Find the eigenvalues and eigenvectors of the matrix
$$ A = \left[ \begin{array}{rrr} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{array} \right] $$
We begin by solving
$$ \begin{align*} \det\left(A - \lambda I\right) &= \left| \begin{array}{ccc} 1 - \lambda & -1 & 0 \\ -1 & 2 - \lambda & -1 \\ 0 & -1 & 1 - \lambda \end{array} \right| \\ \\ &= (1 - \lambda)\left[(2-\lambda)(1-\lambda) - 1\right] - (-1)\left[(-1)(1-\lambda) + (0)(-1)\right] + 0 \\ \\ &= (1 - \lambda)\left[1 - 3\lambda + \lambda^2 \right] - (1 - \lambda) \\ \\ &= (1-\lambda)\left[\lambda^2 - 3\lambda \right] \\ \\ &= \lambda(1-\lambda)(\lambda - 3) \end{align*} $$
The roots of the characteristic polynomial are $\lambda = 0,1,3$. To find the associated eigenvectors, we set $\lambda$ equal to an eigenvalue and then determine $N(A - \lambda I)$. For this example, we have three distinct eigenvalues for a $3\times 3$ matrix, so each eigenspace will be one-dimensional. In general, if an eigenvalue has algebraic multiplicity $m$, then the nullity of $A-\lambda I$ is at least $1$ and at most $m$. Each vector in a basis for $N(A - \lambda I)$ is an eigenvector for that eigenvalue.
Set $\lambda_1 = 0$,¶
then $A\mathbf{x} = 0$ and
$$ \left[ \begin{array}{rrr|r} 1 & -1 & 0 & 0 \\ -1 & 2 & -1 & 0 \\ 0 & -1 & 1 & 0 \end{array} \right] \rightarrow \left[ \begin{array}{rrr|r} 1 & 0 & -1 & 0 \\ 0 & 1 & -1 & 0 \\ 0 & 0 & 0 & 0 \end{array} \right] $$
is the reduced row echelon form of the augmented matrix. Hence $x_3 = \alpha\in\mathbb{R}$ is a free variable and it follows that $x_1 = x_2 = x_3$. For convenience, we choose $x_3 = 1$ and thus
$$ \mathbf{x}_{\lambda_1} = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} $$
is the eigenvector for $\lambda_1 = 0$.
Setting $\lambda_2 = 1$,¶
$$ \left[ \begin{array}{rrr|r} 0 & -1 & 0 & 0 \\ -1 & 1 & -1 & 0 \\ 0 & -1 & 0 & 0 \end{array} \right] \rightarrow \left[ \begin{array}{rrr|r} 1 & 0 & 1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 \end{array} \right] $$
So
$$ \mathbf{x}_{\lambda_2} = \left[\begin{array}{r} 1 \\ 0 \\ -1 \end{array} \right] $$
is the eigenvector for $\lambda_2 = 1$, since $x_2 = 0$ and $x_1 = -x_3$ where $x_3$ is a free parameter (here we chose $x_3 = -1$).
Lastly, we set $\lambda_3 = 3$¶
$$ \left[ \begin{array}{rrr|r} -2 & -1 & 0 & 0 \\ -1 & -1 & -1 & 0 \\ 0 & -1 & -2 & 0 \end{array} \right] \rightarrow \left[ \begin{array}{rrr|r} 1 & 0 & -1 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & 0 & 0 \end{array} \right] $$
Once again, $x_3$ is a free parameter and $x_1 = x_3$, $x_2 = -2x_3$ so
$$ \mathbf{x}_{\lambda_3} = \left[\begin{array}{r} 1 \\ -2 \\ 1 \end{array}\right] $$
is our eigenvector for $\lambda_3 = 3$.
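A computer algebra system can reproduce this entire calculation symbolically. The following is a minimal sketch assuming SymPy is installed: it computes and factors $\det(A - \lambda I)$, then lists each eigenvalue with its multiplicity and a basis for its eigenspace.

```python
from sympy import Matrix, symbols, factor, eye

lam = symbols('lambda')
A = Matrix([[ 1, -1,  0],
            [-1,  2, -1],
            [ 0, -1,  1]])

# Characteristic polynomial det(A - lambda*I), factored:
# -lambda*(lambda - 1)*(lambda - 3), up to the ordering of the factors
p = (A - lam * eye(3)).det()
print(factor(p))

# eigenvects() returns (eigenvalue, algebraic multiplicity, eigenspace basis)
for value, mult, basis in A.eigenvects():
    print(value, mult, basis)
```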
7.1.4 Complex Eigenvalues¶
As mentioned earlier, there is nothing preventing eigenvalues from being complex numbers, even when the matrix $A\in\mathbb{R}^{n\times n}$ has only real entries. In these instances, the eigenvalues are still computed by finding the $n$ roots of the characteristic polynomial, which the fundamental theorem of algebra tells us is always possible.
Exercise 2¶
Let
$$ A = \left[\begin{array}{rr} 2 & 5 \\ -2 & 4 \end{array}\right] $$
Find the eigenvalues and the corresponding eigenvectors.
Follow Along
$$ \left|\begin{array}{cc} 2 - \lambda & 5 \\ -2 & 4 - \lambda \end{array}\right| = (2-\lambda)(4-\lambda) + 10 = \lambda^2 - 6\lambda + 18 $$
The characteristic polynomial's roots may be found by using either the quadratic formula or completing the square.
$$ \lambda^2 - 6\lambda + 18 = \lambda^2 - 6\lambda + 9 + 9 = (\lambda - 3)^2 + 9 = 0 $$
This implies the eigenvalues are $\lambda = 3 \pm 3i$.
To find the eigenvectors, we determine $N(A-\lambda I)$ for each of these eigenvalues. Starting with $\lambda_1 = 3 + 3i$,
$$ \begin{align*} \left[ \begin{array}{cc|r} 2 - (3+3i) & 5 & 0 \\ -2 & 4 - (3+3i) & 0 \end{array} \right] &= \left[ \begin{array}{cc|r} -1 - 3i & 5 & 0 \\ -2 & 1 - 3i & 0 \end{array} \right] \begin{matrix} (-1 + 3i)R_1 \\ \ \end{matrix} \\ \\ &\rightarrow \left[ \begin{array}{cc|r} 10 & 5(-1 + 3i) & 0 \\ -2 & 1 - 3i & 0 \end{array} \right] \begin{matrix} \frac{1}{5}R_1 \\ \ \end{matrix} \\ \\ &\rightarrow \left[ \begin{array}{cc|r} 2 & -1 + 3i & 0 \\ -2 & 1 - 3i & 0 \end{array} \right] \begin{matrix} \ \\ R_1 + R_2 \end{matrix} \\ \\ &\rightarrow \left[ \begin{array}{cc|r} 2 & -1 + 3i & 0 \\ 0 & 0 & 0 \end{array} \right]\end{align*}$$
The variable $x_1$ is a pivot variable and $x_2$ is free with $x_1 = \frac{1-3i}{2}x_2$. We choose $x_2 =2$ and obtain for the eigenvector corresponding to $\lambda_1 = 3 + 3i$
$$ \begin{bmatrix} 1 - 3i \\ 2 \end{bmatrix} $$
Since $\lambda_1$ and $\lambda_2$ are complex conjugates of one another, their associated eigenvectors will also be conjugates, so $\lambda_2 = 3- 3i$ will have the eigenvector
$$ \begin{bmatrix} 1 + 3i \\ 2 \end{bmatrix} $$
It is left to you to verify this by checking that $A\mathbf{x} = \lambda_2\mathbf{x}$.
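Numerical software handles the complex case automatically. In the sketch below (assuming NumPy), the matrix from this exercise has real entries, yet `np.linalg.eig` returns the complex-conjugate pair of eigenvalues and eigenvectors we found by hand.

```python
import numpy as np

A = np.array([[ 2.0, 5.0],
              [-2.0, 4.0]])

values, vectors = np.linalg.eig(A)
print(values)                    # approximately [3.+3.j  3.-3.j]

# Check A x = lambda x for each complex eigenpair (columns of `vectors`)
for lam, x in zip(values, vectors.T):
    print(np.allclose(A @ x, lam * x))   # True, True
```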
7.1.5 Properties of Matrices Related to Eigenvalues¶
The eigenvalues of a matrix are intrinsically linked with some of its fundamental properties. Let's investigate some of these properties, beginning with the determinant and trace of the matrix.
Products and Sums of Eigenvalues¶
We want to study the product and sum of a matrix's eigenvalues, and we'll begin with the characteristic polynomial $p(\lambda)$ of an $n\times n$ matrix $A$
$$ p(\lambda) = \det\left( A - \lambda I \right) = \begin{vmatrix} a_{11} - \lambda & a_{12} & \ldots & a_{1n} \\ a_{21} & a_{22} - \lambda & \ldots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{n1} & a_{n2} & \ldots & a_{nn} - \lambda \end{vmatrix} $$
Now, let $\lambda_1,\lambda_2,\ldots,\lambda_n$ be the eigenvalues of $A$, which are therefore the roots of $p(\lambda)$. This, along with the fundamental theorem of algebra, gives us two equivalent forms for the characteristic polynomial.
$$ \begin{align*} p(\lambda) &= (-1)^n\lambda^n + c_{n-1}\lambda^{n-1} + \ldots + c_2\lambda^2 + c_1\lambda + c_0 \\ \\ &= (-1)^n(\lambda - \lambda_n)(\lambda - \lambda_{n-1})\ldots (\lambda - \lambda_2)(\lambda - \lambda_1) \end{align*} $$
We know that $p(\lambda) = \det( A - \lambda I )$, so we may set $\lambda = 0$ to obtain
$$ \det(A) = \det(A - 0I) = p(0) = (-1)^n(0 - \lambda_n)(0 - \lambda_{n-1})\ldots (0 - \lambda_2)(0 - \lambda_1) = (-1)^n(-1)^n\lambda_n\lambda_{n-1}\ldots\lambda_2\lambda_1 $$
The last equality follows from the fact that there are $n$ factors of the form $(0 - \lambda_k)$, each contributing a factor of $-1$. Since $(-1)^n(-1)^n = 1$,
$$ \det(A) = p(0) = \lambda_n\lambda_{n-1}\ldots\lambda_2\lambda_1 = \displaystyle\prod_{k=1}^n \lambda_k $$
Hence,
Theorem 7.1.2¶
The product of the eigenvalues of $A$ is the determinant of $A$
$$ \prod_{i=1}^n \lambda_i = \det(A) $$
Furthermore, the expression
$$ p(\lambda) = (-1)^n(\lambda - \lambda_n)(\lambda - \lambda_{n-1})\ldots (\lambda - \lambda_2)(\lambda - \lambda_1) $$
may be used to determine the sum of the eigenvalues. If we expand the product $(\lambda - \lambda_n)\ldots(\lambda - \lambda_1)$ and collect the $\lambda^{n-1}$ terms, each factor contributes a $-\lambda_i$ exactly once; together with the leading factor of $(-1)^n$, this gives the coefficient
$$ c_{n-1} = (-1)^{n+1}\left(\lambda_1 + \lambda_2 +\ldots + \lambda_n\right) $$
The same coefficient can be found from a cofactor expansion of $\det(A - \lambda I)$. The only term in that expansion containing powers of $\lambda$ greater than $n-2$ is the product of the diagonal entries
$$ (a_{11} - \lambda)(a_{22} - \lambda)\ldots(a_{nn} - \lambda) $$
and the coefficient of the $\lambda^{n-1}$ term of this product follows the same pattern as above, giving
$$ c_{n-1} = (-1)^{n+1}\left(a_{11} + a_{22} +\ldots + a_{nn}\right) $$
Equating these two expressions for $c_{n-1}$, we conclude
Theorem 7.1.3¶
The sum of the eigenvalues of $A$ is equal to the trace of $A$
$$ \sum_{i=1}^n \lambda_i = \sum_{i=1}^n a_{ii} = \operatorname{tr}(A) $$
where the trace of a matrix is defined to be the sum of its diagonal elements.
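Both theorems are easy to spot-check numerically. The sketch below (assuming NumPy) draws a random matrix, then compares the product of its eigenvalues with its determinant and the sum of its eigenvalues with its trace.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

values = np.linalg.eigvals(A)   # may include complex conjugate pairs

# Theorem 7.1.2: the product of the eigenvalues equals det(A)
print(np.allclose(np.prod(values), np.linalg.det(A)))   # True

# Theorem 7.1.3: the sum of the eigenvalues equals tr(A)
print(np.allclose(np.sum(values), np.trace(A)))          # True
```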
7.1.6 Similar Matrices¶
Theorem 7.1.4¶
Similar matrices have the same eigenvalues
Let $A$ and $B$ be $n\times n$ matrices. If $B$ is similar to $A$, then the two matrices have the same characteristic polynomial and thus the same eigenvalues.
Proof:¶
Let $p_A (\lambda)$ and $p_B (\lambda)$ be the characteristic polynomials of the matrices $A$ and $B$, respectively. If $B$ is similar to $A$, then there exists a nonsingular matrix $S$ such that $B = S^{-1}AS$. Therefore,
$$ \begin{align*} p_B (\lambda) &= \det (B - \lambda I) \\ \\ &= \det (S^{-1}AS - \lambda I) \\ \\ &= \det (S^{-1}AS - \lambda S^{-1}S) \\ \\ &= \det(S^{-1} (A - \lambda I)S) \\ \\ &= \det(S^{-1})\det(A - \lambda I)\det(S) \\ \\ &= \frac{1}{\det(S)}\det(S)\det(A - \lambda I) \\ \\ &= \det(A - \lambda I) \\ \\ &= p_A (\lambda) \end{align*} $$
So $A$ and $B$ have the same characteristic polynomial. Since the roots of this polynomial are the eigenvalues of the matrix, their eigenvalues are the same. $\tombstone$
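Theorem 7.1.4 can also be illustrated numerically. In this sketch (assuming NumPy), we form $B = S^{-1}AS$ for a random $A$ and a random nonsingular $S$ and compare the two lists of eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
S = rng.standard_normal((4, 4))   # nonsingular with probability 1

B = np.linalg.inv(S) @ A @ S      # B is similar to A

# Sort both spectra the same way before comparing (eigenvalues may be complex)
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_B = np.sort_complex(np.linalg.eigvals(B))
print(np.allclose(eig_A, eig_B))  # True, up to roundoff
```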
7.1.7 Eigenvalues of Nonsingular Matrices¶
One of the many equivalent properties for a matrix being nonsingular is that it does not have zero as an eigenvalue.
Theorem 7.1.5¶
A matrix is nonsingular if and only if zero is not an eigenvalue
Let $A$ be an $n\times n$ matrix. It is nonsingular if and only if $\lambda = 0$ is not an eigenvalue of $A$.
Proof:¶
$(\Rightarrow)$ Let $A$ be a nonsingular matrix. Since $0I$ is the zero matrix, $A - 0I = A$ is nonsingular. By Theorem 7.1.1, $A - 0I$ must be singular for $\lambda = 0$ to be an eigenvalue of $A$, hence $\lambda = 0$ is not an eigenvalue of $A$.
$(\Leftarrow)$ Suppose $\lambda = 0$ is not an eigenvalue of $A$. Then by Theorem 7.1.1 the matrix $A - 0I = A$ is nonsingular. $\tombstone$
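As a quick numerical illustration of Theorem 7.1.5 (a sketch assuming NumPy), a matrix whose rows are linearly dependent is singular, and zero appears among its eigenvalues.

```python
import numpy as np

# Singular by construction: the second row is twice the first
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.eigvals(A))             # 0 and 5, in some order
print(np.isclose(np.linalg.det(A), 0))  # True: det(A) = 0 exactly when 0 is an eigenvalue
```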
7.1.8 Exercises¶
Exercise 3¶
Find the eigenvalues and bases for the eigenspaces for the following matrix
$$ A = \begin{bmatrix}\ 5\ &\ 1\ &\ 0\ \\ \ 1\ &\ 5\ &\ 0\ \\ \ 0\ &\ 0\ &\ 8\ \end{bmatrix} $$
Check Your Work
The eigenvalues are $8$, $6$ and $4$.
$$ \begin{align*} N(A-8I) &= \text{Span}\left\{ \begin{bmatrix}\ 0\ \\ \ 0\ \\ \ 1\ \end{bmatrix} \right\} \\ \\ N(A-6I) &= \text{Span}\left\{ \begin{bmatrix}\ 1\ \\ \ 1\ \\ \ 0\ \end{bmatrix} \right\} \\ \\ N(A-4I) &= \text{Span}\left\{ \begin{bmatrix} -1\ \\ \ \ 1\ \\ \ \ 0\ \end{bmatrix} \right\} \end{align*} $$
Follow Along
$$ \begin{align*} \det(A-\lambda I) &= \begin{vmatrix}\ 5 - \lambda\ &\ 1\ &\ 0\ \\ 1\ &\ 5 - \lambda\ &\ 0\ \\ \ 0\ &\ 0\ &\ 8-\lambda\ \end{vmatrix} \\ \\ &= 0 - 0 + (8-\lambda)\begin{vmatrix}\ 5 - \lambda\ &\ 1\ \\ \ 1\ &\ 5 - \lambda\ \end{vmatrix} \\ \\ &= (8 - \lambda)\left[ (5 - \lambda)^2 - 1 \right] \\ \\ &= (8 - \lambda)\left[ 25 - 10\lambda + \lambda^2 - 1 \right] \\ \\ &= (8 - \lambda)(\lambda^2 - 10\lambda + 24) \\ \\ &= -(\lambda - 8)(\lambda - 6)(\lambda - 4) \end{align*} $$
Set $\lambda_1 = 8$
$$ A - 8I = \begin{bmatrix} -3\ &\ \ 1\ &\ \ 0\ \\ \ \ 1\ & -3\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ &\ \ 0\ \end{bmatrix}\begin{array}{l} R_2 \\ R_1+3R_2 \\ \\ \end{array}\rightarrow \begin{bmatrix}\ \ 1\ & -3\ &\ \ 0\ \\ \ \ 0\ & -8\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ &\ \ 0\ \end{bmatrix}\begin{array}{l} \\ -\frac{1}{8}R_2 \\ \\ \end{array}\rightarrow \begin{bmatrix}\ \ 1\ & -3\ &\ \ 0\ \\ \ \ 0\ &\ \ 1\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ &\ \ 0\ \end{bmatrix}\begin{array}{l} R_1+3R_2 \\ \\ \\ \end{array}\rightarrow \begin{bmatrix}\ \ 1\ &\ \ 0\ &\ \ 0\ \\ \ \ 0\ &\ \ 1\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ &\ \ 0\ \end{bmatrix} $$
The eigenspace of eigenvalue $\lambda_1=8$ is $N(A-8I) = \text{Span}\left\{ \khat \right\}$.
Set $\lambda_2 = 6$
$$ \begin{align*} A - 6I &= \begin{bmatrix} -1\ &\ \ 1\ &\ \ 0\ \\ \ \ 1\ & -1\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ &\ \ 2\ \end{bmatrix}\begin{array}{l} R_2 \\ R_1+R_2 \\ \frac{1}{2}R_3 \end{array}\rightarrow \begin{bmatrix}\ \ 1\ & -1\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ &\ \ 1\ \end{bmatrix} \\ \\ x_3 &= 0 \\ x_2 &= s\in\mathbb{R} \\ x_1 - s &= 0 \\ x_1 &= s \end{align*} $$
The eigenspace of $\lambda_2 = 6$ is $N(A-6I) = \text{Span}\left\{\begin{bmatrix}\ 1\ \\ \ 1\ \\ \ 0\ \end{bmatrix} \right\}$.
Set $\lambda_3 = 4$
$$ \begin{align*} A - 4I &= \begin{bmatrix}\ 1\ &\ 1\ &\ 0\ \\ \ 1\ &\ 1\ &\ 0\ \\ \ 0\ &\ 0\ &\ 4\ \end{bmatrix}\begin{array}{l} \\ R_2-R_1 \\ \frac{1}{4}R_3 \end{array}\rightarrow\begin{bmatrix}\ 1\ &\ 1\ &\ 0\ \\ \ 0\ &\ 0\ &\ 0\ \\ \ 0\ &\ 0\ &\ 1\ \end{bmatrix} \\ x_3 &= 0 \\ x_2 &= s\in\mathbb{R} \\ x_1 + s &= 0 \\ x_1 &= -s \end{align*} $$
The eigenspace of $\lambda_3=4$ is $N(A-4I) = \text{Span}\left\{\begin{bmatrix} -1\ \\ \ \ 1\ \\ \ \ 0 \end{bmatrix} \right\}$.
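Because the matrix in this exercise is symmetric, we can also check the answer with `np.linalg.eigh`, which is specialized for symmetric (Hermitian) matrices and returns the eigenvalues in ascending order. A minimal sketch assuming NumPy:

```python
import numpy as np

A = np.array([[5.0, 1.0, 0.0],
              [1.0, 5.0, 0.0],
              [0.0, 0.0, 8.0]])

# eigh: eigenvalues in ascending order, eigenvectors as orthonormal columns
values, vectors = np.linalg.eigh(A)
print(values)    # [4. 6. 8.]
print(vectors)   # columns proportional to (-1, 1, 0), (1, 1, 0), (0, 0, 1)
```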
---
Your use of this self-initiated mediated course material is subject to our Creative Commons License 4.0