Math 511: Linear Algebra
2.3 Matrix Inverse
2.3.1 The Identity Matrix¶
$$ \require{color} \definecolor{brightblue}{rgb}{.267, .298, .812} \definecolor{darkblue}{rgb}{0.0, 0.0, 1.0} \definecolor{palepink}{rgb}{1, .73, .8} \definecolor{softmagenta}{rgb}{.99,.34,.86} \definecolor{blueviolet}{rgb}{.537,.192,.937} \definecolor{jonquil}{rgb}{.949,.792,.098} \definecolor{shockingpink}{rgb}{1, 0, .741} \definecolor{royalblue}{rgb}{0, .341, .914} \definecolor{alien}{rgb}{.529,.914,.067} \definecolor{crimson}{rgb}{1, .094, .271} \def\ihat{\mathbf{\hat{\unicode{x0131}}}} \def\jhat{\mathbf{\hat{\unicode{x0237}}}} \def\khat{\mathrm{\hat{k}}} \def\tombstone{\unicode{x220E}} \def\contradiction{\unicode{x2A33}} $$
The identity matrix $I_n$ represents the identity function $\mathscr{I}$ in $\mathbb{R}^{n\times n}$. For every vector $\mathbf{x}$ in the domain of function $\mathscr{I}$, $\mathscr{I}(\mathbf{x}) = \mathbf{x}$. Similarly,
$$ I_n\mathbf{x} = \mathbf{x} $$
If we compose the identity function $\mathscr{I}$ with some other $f$ we have $f\circ\mathscr{I} = \mathscr{I}\circ f = f$. For the $n\times n$ identity matrix
$$ AI_n = I_nA = A. $$
Moreover we may write the identity matrix as
$$ I_n = [\delta_{ij}] $$
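We can confirm the identity properties numerically; this is a quick sketch using NumPy (a tool assumed here, not part of the course text), where `np.eye(n)` builds $I_n = [\delta_{ij}]$:

```python
import numpy as np

# The 3x3 identity matrix I_3 = [delta_ij]
I3 = np.eye(3)

# An arbitrary 3x3 matrix A
A = np.array([[ 1.,  2.,  2.],
              [ 3.,  7.,  9.],
              [-1., -4., -7.]])

# A I_3 = I_3 A = A, and I_3 x = x for any vector x
print(np.allclose(A @ I3, A))   # True
print(np.allclose(I3 @ A, A))   # True
x = np.array([1., -2., 5.])
print(np.allclose(I3 @ x, x))   # True
```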
In scalar algebra every nonzero scalar $a\neq 0$ has a multiplicative inverse $a^{-1} = \frac{1}{a}$ so that
$$ a\cdot a^{-1} = a^{-1}\cdot a = a\cdot\frac{1}{a} = \frac{1}{a}\cdot a = 1 $$
Question¶
Does every nonzero $n\times n$ matrix have a multiplicative inverse?
The answer is no. Matrices and the linear transformations (functions) they represent are more complicated than scalars. Unlike scalars, there are nonzero $n\times n$ matrices with no multiplicative inverse.
Example 2.3.1¶
$$ A = \begin{bmatrix}\ \ 1\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ \end{bmatrix} $$
This matrix has no multiplicative inverse. It is already in reduced row echelon form and it has a free column. Let us try to find a multiplicative inverse, say
$$ B = \begin{bmatrix}\ \ a\ &\ \ b\ \\ \ \ c\ &\ \ d\ \end{bmatrix} $$
If $AB = I_2$, then
$$ \begin{bmatrix}\ \ 1\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ \end{bmatrix}\begin{bmatrix}\ \ a\ &\ \ b\ \\ \ \ c\ &\ \ d\ \end{bmatrix} = \begin{bmatrix}\ \ 1\ &\ \ 0\ \\ \ \ 0\ &\ \ 1\ \end{bmatrix} $$
This gives us a linear system of four equations and four unknowns
$$ \begin{array}{rcl} a + 0c & = & 1 \\ 0a + 0c & = & 0 \\ b + 0d & = & 0 \\ 0b + 0d & = & 1 \end{array} $$
The augmented matrix is given by
$$ \begin{align*} \begin{bmatrix} 1 & 0 & 0 & 0 & | & 1 \\ 0 & 0 & 0 & 0 &| & 0 \\ 0 & 1 & 0 & 0 & | & 0 \\ 0 & 0 & 0 & 0 & | & 1 \end{bmatrix} &\rightarrow\begin{bmatrix} 1 & 0 & 0 & 0 & | & 1 \\ 0 & 1 & 0 & 0 & | & 0 \\ 0 & 0 & 0 & 0 & | & 1 \\ 0 & 0 & 0 & 0 &| & 0 \end{bmatrix} \end{align*} $$
This linear system is inconsistent due to the third row, so there is no matrix $B$ such that $AB = I_2$.
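We can observe the same failure numerically. Assuming NumPy (not a tool introduced in the text), `np.linalg.inv` raises a `LinAlgError` when handed this singular matrix:

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 0.]])

try:
    np.linalg.inv(A)               # attempt to compute A^{-1}
    invertible = True
except np.linalg.LinAlgError:      # raised because A is singular
    invertible = False

print(invertible)  # False: no matrix B satisfies AB = I_2
```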
Definition¶
If an $n\times n$ matrix $A\in\mathbb{R}^{n\times n}$ has a multiplicative inverse $B$ so that
$$ AB = BA = I_n $$
then we say that matrix $A$ is nonsingular or invertible.
If matrix $A\in\mathbb{R}^{n\times n}$ has no multiplicative inverse we say that matrix $A$ is singular.
If matrix $A\in\mathbb{R}^{n\times n}$ is nonsingular, we denote the inverse of matrix $A$ by $A^{-1}$ using the exponent notation for multiplicative inverse.
We never put a matrix in the denominator of an algebraic expression.
Thus if $A$ is nonsingular, then matrix $A$ is invertible and we may write
$$
AA^{-1} = A^{-1}A = I_n
$$
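The defining equations are easy to check numerically for a concrete nonsingular matrix; this sketch assumes NumPy and its `np.linalg.inv` routine (not the row-reduction method developed later in the text):

```python
import numpy as np

A = np.array([[2., 1.],
              [5., 3.]])   # nonsingular: row reduction yields two pivot columns

A_inv = np.linalg.inv(A)

# Verify A A^{-1} = A^{-1} A = I_2
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```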
Now we must develop some tools for determining when a square matrix is singular, and computing the inverse if it exists!
2.3.2 When is a Matrix Invertible?¶
Example 2.3.1 (again)¶
$$ A = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} $$
Is this matrix $A$ invertible? That is, does matrix $A$ have a multiplicative inverse; or is it nonsingular?
Notice that it is already in reduced row echelon form. How many pivot columns does it have? How many free columns?
Does the linear transformation represented by matrix $A$ squish two dimensional space onto a line?
To answer these questions let us consider where $\ihat$ and $\jhat$ are mapped by matrix $A$.
$$ \begin{align*} A\ihat &= \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}\begin{bmatrix} 1 \\ 0 \end{bmatrix} = 1\begin{bmatrix} 1 \\ 0 \end{bmatrix} + 0\begin{bmatrix} 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} = \ihat \\ \\ A\jhat &= \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}\begin{bmatrix} 0 \\ 1 \end{bmatrix} = 0\begin{bmatrix} 1 \\ 0 \end{bmatrix} + 1\begin{bmatrix} 0 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} = \mathbf{0} \\ \end{align*} $$
Every vector $\mathbf{x}\in\mathbb{R}^2$
$$ \mathbf{x} = \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = x_1\ihat + x_2\jhat $$
gets mapped to
$$ \begin{align*} A\mathbf{x} &= A\left(x_1\ihat + x_2\jhat\right) \\ &= A\left(x_1\ihat\right) + A\left(x_2\jhat\right) \\ &= x_1A\ihat + x_2A\jhat \\ &= x_1\ihat + x_2\mathbf{0} \\ &= x_1\ihat \\ &= \begin{bmatrix} x_1 \\ 0 \end{bmatrix} \end{align*} $$
This tells us that all of the vectors on the $x_2$ axis
$$ A\begin{bmatrix} 0 \\ x_2 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix} $$
get squished onto the zero vector. In fact all of two dimensional space gets squished onto the $x_1$ axis. There is no way to undo or invert this process because so many different vectors get squished onto the same vector on the $x_1$ axis. Matrix $A$ is singular; it has no inverse.
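The squishing can be seen directly by applying $A$ to a few vectors; every output lands on the $x_1$ axis (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1., 0.],
              [0., 0.]])

vectors = [np.array([ 3., 7.]),
           np.array([-2., 4.]),
           np.array([ 0., 9.])]   # the last one lies on the x_2 axis

for x in vectors:
    y = A @ x
    print(x, "->", y)             # second component of y is always 0

# The whole x_2 axis gets squished onto the zero vector
print(A @ np.array([0., 9.]))     # [0. 0.]
```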
Can we tell that this matrix is singular just by looking at the reduced row echelon form of the matrix?
The answer is **yes**! This matrix has a free column. This tells us right away that the matrix is not invertible; it is singular.
Example 2.3.2¶
$$ B = \begin{bmatrix}\ \ 1\ &\ \ 2\ & -1\ \\ \ \ 1\ &\ \ 1\ &\ \ 1\ \\ \ \ 2\ &\ \ 0\ &\ \ 1\ \end{bmatrix} $$
Is matrix $B$ nonsingular? Before looking at the solution attempt to answer this question yourself.
Follow Along
Let us reduce matrix $B$ into upper triangular form so that we can determine the existence of any free columns.
$$ \begin{align*} B &= \begin{bmatrix}\ \ 1\ &\ \ 2\ & -1\ \\ \ \ 1\ &\ \ 1\ &\ \ 1\ \\ \ \ 2\ &\ \ 0\ &\ \ 1 \end{bmatrix}\begin{array}{c} \ \\ R_2-R_1 \\ R_3 - 2R_1 \end{array} \longrightarrow \begin{bmatrix}\ \ 1\ &\ \ 2\ & -1\ \\ 0 & -1\ &\ \ 2\ \\ \ \ 0\ & -4\ &\ \ 3\ \end{bmatrix}\begin{array}{c} \ \\ \ \\ R_3 - 4R_2 \end{array} \\ \\ &\longrightarrow\begin{bmatrix}\ \ 1\ &\ \ 2\ & -1\ \\ \ \ 0\ & -1\ &\ \ 2\ \\ 0 &\ \ 0\ & -5\ \end{bmatrix} \end{align*} $$
All three columns are pivot columns. Hence matrix $B$ is nonsingular. That is, matrix $B$ is invertible and we may write $B^{-1}$ for its multiplicative inverse.
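We can corroborate the pivot count with NumPy's `matrix_rank` (an assumption here; rank is treated formally later in the course). A $3\times 3$ matrix has three pivot columns exactly when its rank is $3$:

```python
import numpy as np

B = np.array([[1., 2., -1.],
              [1., 1.,  1.],
              [2., 0.,  1.]])

# Three pivot columns <=> rank 3 <=> B is nonsingular
print(np.linalg.matrix_rank(B))            # 3

B_inv = np.linalg.inv(B)                   # exists because B is nonsingular
print(np.allclose(B @ B_inv, np.eye(3)))   # True
```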
We have seen another application of reducing a matrix using row operations. This is an important skill that you must practice throughout the course.
2.3.3 Invertible Matrices¶
Now you can watch the rest of the video by Dr. Strang if you still need more help!
During this course we will create a list of properties a matrix may have and the vocabulary that we use to describe matrices with various properties. So far we have the following for nonsingular matrices:
Theorem 2.3.1¶
For square matrix $A\in\mathbb{R}^{n\times n}$ all of the following properties are equivalent:
- all of the columns of $A$ are pivot columns
- The homogeneous linear system $A\mathbf{x} = \mathbf{0}$ has only the trivial solution $\mathbf{x} = \mathbf{0}$
- $A$ is a nonsingular matrix
- $A$ is a non-degenerate matrix
- $A$ is an invertible matrix
We can state these properties in the reverse direction.
Corollary 2.3.2¶
For square matrix $A\in\mathbb{R}^{n\times n}$ all of the following properties are equivalent:
- $A$ has a free column
- The homogeneous linear system $A\mathbf{x} = \mathbf{0}$ has infinitely many solutions
- $A$ is a singular matrix
- $A$ is a degenerate matrix
- $A$ is not an invertible matrix
Example 2.3.3¶
A matrix is singular or degenerate if it does not have a multiplicative inverse. Unlike real numbers, there are nonzero singular matrices. The matrix
$$ A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix} $$
is singular. We will learn more ways to determine whether a square matrix is singular or nonsingular.
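For the matrix above, a nonzero solution of $A\mathbf{x} = \mathbf{0}$ exhibits the singularity directly, consistent with Corollary 2.3.2 (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [0., 0., 0.]])

# khat = (0, 0, 1) is a nonzero solution of Ax = 0, so the
# homogeneous system does not have a unique solution
khat = np.array([0., 0., 1.])
print(A @ khat)                      # [0. 0. 0.]

# Consistent with the corollary: A has a free column (rank < 3)
print(np.linalg.matrix_rank(A))      # 2
```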
2.3.4 The Inverse of a Product¶
Theorem 2.3.3¶
If $A$ and $B$ are nonsingular $n\times n$ matrices, then the product $AB$ is also nonsingular.
We prove this by producing the multiplicative inverse of the matrix $AB$. If $(AB)C = C(AB) = I$, then matrix $AB$ has a multiplicative inverse, namely $C$, and so $AB$ is nonsingular.
Proof:¶
Since $A$ and $B$ are nonsingular they have multiplicative inverses, $A^{-1}$ and $B^{-1}$.
$$ \begin{align*} (AB)(B^{-1}A^{-1}) &= A(BB^{-1})A^{-1} & &\text{Matrix product is associative} \\ &= A(I)A^{-1} & &\text{$B$ and $B^{-1}$ are inverses} \\ &= (AI)A^{-1} & &\text{Matrix product is associative} \\ &= AA^{-1} & &\text{I is the matrix product identity} \\ &= I & &\text{$A$ and $A^{-1}$ are inverses} \end{align*} $$
Furthermore,
$$ \begin{align*} (B^{-1}A^{-1})(AB) &= B^{-1}(A^{-1}A)B & &\text{Matrix product is associative} \\ &= B^{-1}(I)B & &\text{$A^{-1}$ and $A$ are inverses} \\ &= B^{-1}(IB) & &\text{Matrix product is associative} \\ &= B^{-1}B & &\text{I is the matrix product identity} \\ &= I & &\text{$B^{-1}$ and $B$ are inverses} \end{align*} $$
So there is an $n\times n$ matrix $B^{-1}A^{-1}$ that is the multiplicative inverse of product matrix $AB$. Thus the product $AB$ is nonsingular and
$$
(AB)^{-1} = B^{-1}A^{-1}.
$$
$\tombstone$
Corollary 2.3.4¶
If $A_1$, $A_2$, $\dots$, $A_k$ are all nonsingular $n\times n$ matrices, then the product $A_1A_2\cdots A_k$ is a nonsingular matrix and
$$ \left(A_1A_2\cdots A_k\right)^{-1} = A_k^{-1}\cdots A_2^{-1}A_1^{-1}. $$
Notice the relationship between the inverse of a product and the product of the inverses.
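Theorem 2.3.3 is easy to check on concrete matrices; this sketch assumes NumPy and also shows that the order of the factors matters:

```python
import numpy as np

A = np.array([[2., 1.],
              [5., 3.]])
B = np.array([[1., 2.],
              [3., 7.]])   # both nonsingular

lhs = np.linalg.inv(A @ B)                  # (AB)^{-1}
rhs = np.linalg.inv(B) @ np.linalg.inv(A)   # B^{-1} A^{-1}
print(np.allclose(lhs, rhs))                # True

# The order matters: A^{-1} B^{-1} is generally NOT (AB)^{-1}
wrong = np.linalg.inv(A) @ np.linalg.inv(B)
print(np.allclose(lhs, wrong))              # False
```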
2.3.5 The Inverse of a Transpose¶
Theorem 2.3.5¶
If matrix $A$ is nonsingular, then $A^T$ is nonsingular.
Proof¶
Let us prove this by once again producing the inverse.
If matrix $A\in\mathbb{R}^{n\times n}$ is nonsingular, then it is invertible and there is an inverse matrix $A^{-1}$ so that $A^{-1}A = AA^{-1} = I_n$. Consider the transpose of $A^{-1}$, $\left(A^{-1}\right)^T$.
We learned in section 2.2 that $\left(BC\right)^T = C^TB^T$. So we let $C = A$ and $B = A^{-1}$. That gives us
$$
\begin{align*}
A^T\left(A^{-1}\right)^T &= \left(A^{-1}A\right)^T & &\text{Property of the Algebra of the Transpose} \\
&= \left(I_n\right)^T & &\text{$A^{-1}$ and $A$ are inverses} \\
&= I_n & &\text{$I_n$ is symmetric: $I_n^T = I_n$}
\end{align*}
$$
A similar computation, $\left(A^{-1}\right)^T A^T = \left(AA^{-1}\right)^T = I_n^T = I_n$, handles the other order.
$\tombstone$
We have just shown that $A^T$ is invertible and its inverse is $\left(A^{-1}\right)^T$. That is
$$
\left(A^T\right)^{-1} = \left(A^{-1}\right)^T
$$
Hence if $A$ is nonsingular then $A^T$ is also nonsingular.
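The identity $\left(A^T\right)^{-1} = \left(A^{-1}\right)^T$ can be checked numerically on any concrete nonsingular matrix (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[2., 1.],
              [5., 3.]])       # nonsingular

lhs = np.linalg.inv(A.T)       # (A^T)^{-1}
rhs = np.linalg.inv(A).T       # (A^{-1})^T
print(np.allclose(lhs, rhs))   # True
```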
2.3.6 Exercises¶
Exercise 1¶
Compute $A^{-1}$ when
$$
A = \begin{bmatrix}\ \ 1 &\ \ 2 &\ \ 2 \\ \ \ 3 &\ \ 7 &\ \ 9 \\ -1 & -4 & -7 \end{bmatrix}
$$
Solution
To compute an inverse we create a partitioned matrix $\left[\,A\,|\,I_3\,\right]$. If one multiplies this partitioned matrix on the left by $A^{-1}$, one obtains
$$ A^{-1}\left[\,A\,|\,I_3\,\right] = \left[\,A^{-1}A\,|\,A^{-1}I_3\,\right] = \left[\,I_3\,|\,A^{-1}\,\right] $$
We again apply row operations to reduce matrix $A$ in the left partition to reduced row echelon form, the identity. The result in the right partition will be $A^{-1}$.
$$ \begin{align*} \left[\,A\,|\,I_3\,\right] &= \left[\begin{array}{ccc|ccc} \ \ 1 &\ \ 2 &\ \ 2 &\ \ 1 &\ \ 0 &\ \ 0 \\ \ \ 3 &\ \ 7 &\ \ 9 &\ \ 0 &\ \ 1 &\ \ 0 \\ -1 & -4 & -7 &\ \ 0 &\ \ 0 &\ \ 1 \end{array}\right]\begin{array}{c} \ \\ R_2 - 3R_1 \\ R_3 + R_1 \end{array} \\ \\ &\longrightarrow\left[\begin{array}{ccc|ccc}\ \ 1 &\ \ 2 &\ \ 2 &\ \ 1 &\ \ 0 &\ \ 0 \\ \ \ 0 &\ \ 1 &\ \ 3 & -3 &\ \ 1 &\ \ 0 \\ \ \ 0 & -2 & -5 &\ \ 1 &\ \ 0 &\ \ 1 \end{array}\right]\begin{array}{c} \ \\ \ \\ R_3 + 2R_2 \end{array} \\ \\ &\longrightarrow\left[\begin{array}{ccc|ccc}\ \ 1 &\ \ 2 &\ \ 2 &\ \ 1 &\ \ 0 &\ \ 0 \\ \ \ 0 &\ \ 1 &\ \ 3 & -3 &\ \ 1 &\ \ 0 \\ \ \ 0 &\ \ 0 &\ \ 1 & -5 &\ \ 2 &\ \ 1 \end{array}\right]\begin{array}{c} R_1-2R_3 \\ R_2-3R_3 \\ \ \end{array} \\ \\ &\longrightarrow\left[\begin{array}{ccc|ccc}\ \ 1 &\ \ 2 &\ \ 0 &\ 11 & -4 & -2 \\ \ \ 0 &\ \ 1 &\ \ 0 &\ 12 & -5 & -3 \\ \ \ 0 &\ \ 0 &\ \ 1 & -5 &\ \ 2 &\ \ 1 \end{array}\right]\begin{array}{c} R_1-2R_2 \\ \ \\ \ \end{array} \\ \\ &\longrightarrow\left[\begin{array}{ccc|ccc}\ \ 1 &\ \ 0 &\ \ 0 & -13 &\ \ 6 &\ \ 4 \\ \ \ 0 &\ \ 1 &\ \ 0 &\ 12 & -5 & -3 \\ \ \ 0 &\ \ 0 &\ \ 1 & -5 &\ \ 2 &\ \ 1 \end{array}\right] \end{align*} $$
From reducing the left partition to reduced row echelon form we have
$$ A^{-1} = \begin{bmatrix} -13 &\ \ 6 &\ \ 4 \\ 12 & -5 & -3 \\ -5 &\ \ 2 &\ \ 1 \end{bmatrix} $$
You can verify this by multiplying $A^{-1}$ times $A$ to obtain the identity matrix $I_3$.
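The $\left[\,A\,|\,I_3\,\right]$ procedure above can be sketched in code. This is a minimal Gauss-Jordan implementation assuming NumPy, with no row swaps or pivoting safeguards, so it is illustrative only:

```python
import numpy as np

def inverse_by_row_reduction(A):
    """Reduce [A | I] to [I | A^{-1}] by Gauss-Jordan elimination.

    Minimal sketch: assumes A is square and no zero pivot is
    encountered (no row interchanges / partial pivoting).
    """
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])   # partitioned matrix [A | I]
    for j in range(n):
        M[j] = M[j] / M[j, j]                     # scale pivot row so pivot = 1
        for i in range(n):
            if i != j:
                M[i] = M[i] - M[i, j] * M[j]      # clear column j in row i
    return M[:, n:]                               # right partition is A^{-1}

A = np.array([[ 1.,  2.,  2.],
              [ 3.,  7.,  9.],
              [-1., -4., -7.]])

print(inverse_by_row_reduction(A))
# matches the hand computation: [[-13, 6, 4], [12, -5, -3], [-5, 2, 1]]
```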
Exercise 2¶
Find the inverse of $A = \begin{bmatrix} 2 & 1 & 1 \\ 6 & 4 & 5 \\ 4 & 1 & 3 \end{bmatrix}$.
Solution
$$ \begin{align*} \left[\,A\,|\,I_3\,\right] &= \left[\begin{array}{ccc|ccc} 2 & 1 & 1 & 1 & 0 & 0 \\ 6 & 4 & 5 & 0 & 1 & 0 \\ 4 & 1 & 3 & 0 & 0 & 1 \end{array}\right]\ \begin{array}{l} \\ R_2 - 3R_1 \\ R_3 - 2R_1 \end{array} \\ \\ &\longrightarrow \left[\begin{array}{ccc|ccc} 2 & 1 & 1 & 1 & 0 & 0 \\ 0 & 1 & 2 & -3 & 1 & 0 \\ 0 & -1 & 1 & -2 & 0 & 1 \end{array}\right]\ \begin{array}{l} \\ \\ R_3 + R_2 \end{array} \\ \\ &\longrightarrow \left[\begin{array}{ccc|ccc} 2 & 1 & 1 & 1 & 0 & 0 \\ 0 & 1 & 2 & -3 & 1 & 0 \\ 0 & 0 & 3 & -5 & 1 & 1 \end{array}\right]\ \begin{array}{l} R_1-R_2 \\ \\ \\ \end{array} \\ \\ &\longrightarrow \left[\begin{array}{ccc|ccc} 2 & 0 & -1 & 4 & -1 & 0 \\ 0 & 1 & 2 & -3 & 1 & 0 \\ 0 & 0 & 3 & -5 & 1 & 1 \end{array}\right]\ \begin{array}{l} \\ \\ \frac{1}{3}R_3 \end{array} \\ \\ &\longrightarrow \left[\begin{array}{ccc|ccc} 2 & 0 & -1 & 4 & -1 & 0 \\ 0 & 1 & 2 & -3 & 1 & 0 \\ 0 & 0 & 1 & -\frac{5}{3} & \frac{1}{3} & \frac{1}{3} \end{array}\right]\ \begin{array}{l} R_1+R_3 \\ R_2-2R_3 \\ \\ \end{array} \\ \\ &\longrightarrow \left[\begin{array}{ccc|ccc} 2 & 0 & 0 & \frac{7}{3} & -\frac{2}{3} & \frac{1}{3} \\ 0 & 1 & 0 & \frac{1}{3} & \frac{1}{3} & -\frac{2}{3} \\ 0 & 0 & 1 & -\frac{5}{3} & \frac{1}{3} & \frac{1}{3} \end{array}\right]\ \begin{array}{l} \frac{1}{2}R_1 \\ \\ \\ \end{array} \\ \\ &\longrightarrow \left[\begin{array}{ccc|ccc} 1 & 0 & 0 & \frac{7}{6} & -\frac{1}{3} & \frac{1}{6} \\ 0 & 1 & 0 & \frac{1}{3} & \frac{1}{3} & -\frac{2}{3} \\ 0 & 0 & 1 & -\frac{5}{3} & \frac{1}{3} & \frac{1}{3} \end{array}\right] \end{align*} $$
Thus
$$ A^{-1} = \begin{bmatrix}\ \ \frac{7}{6} & -\frac{1}{3} &\ \ \frac{1}{6} \\ \ \ \frac{1}{3} &\ \ \frac{1}{3} & -\frac{2}{3} \\ -\frac{5}{3} &\ \ \frac{1}{3} &\ \ \frac{1}{3} \end{bmatrix} $$
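As with Exercise 1, the result can be verified numerically (a sketch assuming NumPy):

```python
import numpy as np

A = np.array([[2., 1., 1.],
              [6., 4., 5.],
              [4., 1., 3.]])

# The inverse computed by hand above
A_inv = np.array([[ 7/6, -1/3,  1/6],
                  [ 1/3,  1/3, -2/3],
                  [-5/3,  1/3,  1/3]])

# Both products give the identity I_3
print(np.allclose(A @ A_inv, np.eye(3)))   # True
print(np.allclose(A_inv @ A, np.eye(3)))   # True
```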
Your use of this self-initiated mediated course material is subject to our Creative Commons License 4.0