Math 511: Linear Algebra
4.5 Basis And Dimension
4.5.1 Basis¶
$$ \require{color} \definecolor{brightblue}{rgb}{.267, .298, .812} \definecolor{darkblue}{rgb}{0.0, 0.0, 1.0} \definecolor{palepink}{rgb}{1, .73, .8} \definecolor{softmagenta}{rgb}{.99,.34,.86} \definecolor{blueviolet}{rgb}{.537,.192,.937} \definecolor{jonquil}{rgb}{.949,.792,.098} \definecolor{shockingpink}{rgb}{1, 0, .741} \definecolor{royalblue}{rgb}{0, .341, .914} \definecolor{alien}{rgb}{.529,.914,.067} \definecolor{crimson}{rgb}{1, .094, .271} \def\ihat{\mathbf{\hat{\unicode{x0131}}}} \def\jhat{\mathbf{\hat{\unicode{x0237}}}} \def\khat{\mathrm{\hat{k}}} \def\tombstone{\unicode{x220E}} \def\contradiction{\unicode{x2A33}} $$
Spanning Set¶
A spanning set for a vector space has to have enough vectors to make sure that every element of the vector space is some linear combination of the spanning set.
Example 1¶
The set $\left\{ \ihat,\ \jhat \right\}$ does not span all of $\mathbb{R}^3$; $\hat{k}$ is not a linear combination of $\ihat$ and $\jhat$. The set is too small; there are not enough vectors in this set to span all of $\mathbb{R}^3$.
Linearly Independent Set¶
A linearly independent set in a vector space cannot have too many vectors. With too many vectors, one of them will be a linear combination of the others.
Example 2¶
The set $\left\{\,\begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix},\ \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}\,\right\}$ is linearly dependent because the fourth vector is a linear combination of the first three vectors.
Definition:¶
A basis for a vector space is a linearly independent, spanning set.
Dr. Strang illustrates Independence, Basis and Dimension in his online video and notes.
Example 3¶
Show that the set $\left\{\,\begin{bmatrix} 1 \\ 2 \end{bmatrix},\ \begin{bmatrix} 2 \\ 1 \end{bmatrix}\,\right\}$ is a basis for $\mathbb{R}^2$.
We want to know if the linear system
$$
x_1\begin{bmatrix} 1 \\ 2 \end{bmatrix} + x_2\begin{bmatrix} 2 \\ 1 \end{bmatrix} = \mathbf{b}
$$
is consistent for every vector $\mathbf{b}\in\mathbb{R}^2$. This linear system corresponds to the linear equation
$$
\begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix}\mathbf{x} = \mathbf{b}
$$
We know that the two vectors are linearly independent because we can row reduce matrix $A$
$$
A = \begin{bmatrix} 1 & 2 \\ 2 & 1 \end{bmatrix} \rightarrow
\begin{bmatrix} 1 &\ \ 2 \\ 0 & -3 \end{bmatrix} \rightarrow
\begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix} \rightarrow
\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}
$$
Since both columns of matrix $A$ are pivot columns, they are linearly independent columns of matrix $A$. This also means that the matrix $A$ is a nonsingular matrix. We can conclude that the linear system is consistent for every vector $\mathbf{b}\in\mathbb{R}^2$. This establishes that the set of two vectors is a linearly independent, spanning set for $\mathbb{R}^2$. Therefore these two vectors form a basis for $\mathbb{R}^2$.
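This check is easy to automate. Here is a minimal sketch using NumPy (the matrix and right-hand side are from this example; the rank and tolerance checks are our own choices):

```python
import numpy as np

# Columns of A are the candidate basis vectors from Example 3.
A = np.array([[1, 2],
              [2, 1]])

# Two vectors form a basis for R^2 exactly when A has two pivot
# columns, i.e., full rank 2, i.e., A is nonsingular.
print(np.linalg.matrix_rank(A) == 2)   # True

# Because A is nonsingular, Ax = b is consistent for every b in R^2.
b = np.array([5.0, 7.0])                # an arbitrary right-hand side
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))            # True
```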
4.5.2 Spanning Sets and Bases¶
Theorem 4.5.1¶
If $\left\{\mathbf{v}_1,\ \mathbf{v}_2,\ \dots,\ \mathbf{v}_n\right\}$ is a spanning set for vector space $V$, then any set of $m\gt n$ vectors $\left\{\mathbf{u}_1,\ \mathbf{u}_2,\ \dots,\ \mathbf{u}_m\right\}$ in $V$ is linearly dependent.
Proof:¶
Each vector $\mathbf{u}_j\in V$ can be written as a linear combination
$$ \mathbf{u}_j = \displaystyle\sum_{k=1}^n a_{jk}\mathbf{v}_k = a_{j1}\mathbf{v}_1 + a_{j2}\mathbf{v}_2 +\ \cdots\ + a_{jn}\mathbf{v}_n $$
because the set $\left\{\mathbf{v}_1,\ \mathbf{v}_2,\ \dots,\ \mathbf{v}_n\right\}$ spans vector space $V$. Any linear combination of the set $\left\{\mathbf{u}_1,\ \mathbf{u}_2,\ \dots,\ \mathbf{u}_m\right\}$ can be written
$$ \begin{align*} \displaystyle\sum_{i=1}^m c_i\mathbf{u}_i &= c_1\mathbf{u}_1 + c_2\mathbf{u}_2 +\ \cdots\ + c_m\mathbf{u}_m \\ \\ &= c_1\left(\displaystyle\sum_{k=1}^n a_{1k}\mathbf{v}_k\right) + c_2\left(\displaystyle\sum_{k=1}^n a_{2k}\mathbf{v}_k\right) +\ \cdots\ + c_m\left(\displaystyle\sum_{k=1}^n a_{mk}\mathbf{v}_k\right) \\ \\ &= \left(\displaystyle\sum_{k=1}^n c_1a_{1k}\mathbf{v}_k\right) + \left(\displaystyle\sum_{k=1}^n c_2a_{2k}\mathbf{v}_k\right) +\ \cdots\ + \left(\displaystyle\sum_{k=1}^n c_ma_{mk}\mathbf{v}_k\right) \\ \\ &= \displaystyle\sum_{i=1}^m \left(\displaystyle\sum_{k=1}^n c_ia_{ik}\mathbf{v}_k\right) = \displaystyle\sum_{k=1}^n \left(\displaystyle\sum_{i=1}^m c_ia_{ik}\mathbf{v}_k\right) = \displaystyle\sum_{k=1}^n \left(\displaystyle\sum_{i=1}^m c_ia_{ik}\right)\mathbf{v}_k \\ \end{align*} $$
Consider the equation
$$ c_1\mathbf{u}_1 + c_2\mathbf{u}_2 +\ \cdots\ + c_m\mathbf{u}_m = \mathbf{0} $$
Using the decomposition above we obtain the linear system of equations
$$
\begin{align*}
\sum_{i=1}^m c_ia_{i1} &= a_{11}c_1 + a_{21}c_2 + \cdots + a_{m1}c_m = 0 \\
\\
\sum_{i=1}^m c_ia_{i2} &= a_{12}c_1 + a_{22}c_2 + \cdots + a_{m2}c_m = 0 \\
\\
\vdots \\
\\
\sum_{i=1}^m c_ia_{ik} &= a_{1k}c_1 + a_{2k}c_2 + \cdots + a_{mk}c_m = 0 \\
\\
\vdots \\
\\
\sum_{i=1}^m c_ia_{in} &= a_{1n}c_1 + a_{2n}c_2 + \cdots + a_{mn}c_m = 0 \\
\\
\end{align*}
$$
This system expressed in matrix form
$$
A\mathbf{c} = \mathbf{0}
$$
is underdetermined because there are $m$ unknowns but only $n\lt m$ equations. The system is consistent because $c_1=c_2=\dots=c_m=0$ is a solution, and an underdetermined consistent linear system has infinitely many solutions. Let $\mathbf{\hat{c}}\neq\mathbf{0}$ be such a nontrivial solution. Then
$$
\displaystyle\sum_{i=1}^m \hat{c}_i\mathbf{u}_i = \displaystyle\sum_{k=1}^n \left(\displaystyle\sum_{i=1}^m \hat{c}_ia_{ik}\right)\mathbf{v}_k = \displaystyle\sum_{k=1}^n 0\,\mathbf{v}_k = \mathbf{0}.
$$
The existence of a nontrivial solution implies that the larger set of vectors $\left\{\mathbf{u}_1,\ \mathbf{u}_2,\ \dots,\ \mathbf{u}_m\right\}$ is linearly dependent.
$\unicode{x220E}$
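A small numerical illustration of Theorem 4.5.1, sketched with SymPy (the three vectors are our own hypothetical choices): two vectors span $\mathbb{R}^2$, so any three vectors in $\mathbb{R}^2$ must be linearly dependent, and the underdetermined system has a nontrivial solution.

```python
import sympy as sp

# n = 2 vectors span R^2, so m = 3 > n vectors must be dependent.
U = sp.Matrix([[1, 2, 3],
               [1, 0, 1]])     # columns are u1, u2, u3 in R^2

# The homogeneous system U*c = 0 has 3 unknowns but only 2 equations,
# so a nontrivial solution exists.
c = U.nullspace()[0]
print(c.T)                     # Matrix([[-1, -1, 1]])
print((U * c).T)               # Matrix([[0, 0]]): -u1 - u2 + u3 = 0
```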
Corollary 4.5.2¶
If $\left\{\mathbf{v}_1,\ \mathbf{v}_2,\ \dots,\ \mathbf{v}_n\right\}$ and $\left\{\mathbf{u}_1,\ \mathbf{u}_2,\ \dots,\ \mathbf{u}_m\right\}$ are both bases for a finite dimensional vector space $V$, then $m = n$.
Proof:¶
- because $\left\{\mathbf{v}_1,\ \dots,\ \mathbf{v}_n\right\}$ is a spanning set and $\left\{\mathbf{u}_1,\ \dots,\ \mathbf{u}_m\right\}$ is linearly independent, Theorem 4.5.1 gives $m\le n$.
- because $\left\{\mathbf{u}_1,\ \dots,\ \mathbf{u}_m\right\}$ is a spanning set and $\left\{\mathbf{v}_1,\ \dots,\ \mathbf{v}_n\right\}$ is linearly independent, Theorem 4.5.1 gives $n\le m$.
Therefore, $m=n$. $\unicode{x220E}$
As we noted in section 4.5.1,¶
- a set of vectors with too few vectors cannot span the entire vector space.
- a set of vectors with too many vectors cannot be linearly independent.
4.5.3 Canonical Bases¶
The canonical basis or standard basis for the vector space $\mathbb{R}^n$ is the set of column vectors
$$ \mathbf{e}_1 = \begin{bmatrix} 1 \\ 0 \\ \vdots \\ 0 \\ \vdots \\ 0 \end{bmatrix},\ \mathbf{e}_2 = \begin{bmatrix} 0 \\ 1 \\ \vdots \\ 0 \\ \vdots \\ 0 \end{bmatrix},\ \cdots,\ \mathbf{e}_k = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 1 \\ \vdots \\ 0 \end{bmatrix},\ \cdots,\ \mathbf{e}_n = \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 0 \\ \vdots \\ 1 \end{bmatrix} $$
The standard basis for $\mathbb{R}^2$ is
$$
\left\{\,\mathbf{e}_1, \mathbf{e}_2\,\right\} = \left\{\,\ihat,\ \jhat\,\right\} = \left\{\,\begin{bmatrix} 1 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 1 \end{bmatrix}\,\right\}
$$
The standard basis for $\mathbb{R}^3$ is
$$
\left\{\,\mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3\,\right\} = \left\{\,\ihat,\ \jhat,\ \khat\,\right\} = \left\{\ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}\ \right\}
$$
The standard basis for $\mathbb{R}^n$ is
$$
\left\{\,\mathbf{e}_1,\ \mathbf{e}_2,\ \mathbf{e}_3,\ \dots,\ \mathbf{e}_n\,\right\} = \left\{\ \begin{bmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix},\ \begin{bmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{bmatrix},\ \dots,\ \begin{bmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{bmatrix}\ \right\}
$$
There are many, many, many other bases for these vector spaces. Such a basis is given in Example 3.
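In computation the standard basis vectors are simply the columns of the identity matrix. A quick NumPy sketch (the dimension $n=4$ and the test vector are arbitrary choices):

```python
import numpy as np

n = 4
I = np.eye(n)                         # columns of I_n are e_1, ..., e_n
e = [I[:, k] for k in range(n)]       # the standard basis for R^4
print(e[0])                           # [1. 0. 0. 0.]

# Every x in R^n is the combination x = x_1 e_1 + ... + x_n e_n.
x = np.array([3.0, -1.0, 2.0, 5.0])
print(np.allclose(sum(x[k] * e[k] for k in range(n)), x))   # True
```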
4.5.4 Dimension¶
Definition:¶
The dimension, Hamel dimension, or algebraic dimension of a vector space is the number of vectors in any basis if that number is finite, or the cardinality of any basis if it is infinite.
Examples¶
- The dimension of $\mathbb{R}^2$ is two.
- The dimension of $\mathbb{R}^3$ is three.
- The dimension of $\mathbb{R}^n$ is $n$.
- The dimension of $\mathbb{R}^1$ is one.
- The dimension of the trivial vector space $\left\{\,\mathbf{0}\,\right\}$ is zero.
- The dimension of $\mathbb{R}^{m\cdot n}$ is $m\cdot n$ (see the sketch after this list).
- The dimension of $P$ is infinite.
- The dimension of $C[-1,1]$ is infinite.
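For instance, the $\mathbb{R}^{m\times n}$ entry above can be checked numerically: the $m\cdot n$ matrices with a single $1$ entry form a basis. A short NumPy sketch (the sizes $m=2$, $n=3$ are arbitrary):

```python
import numpy as np

# The m*n "matrix units" E_k (one entry 1, the rest 0) form a basis
# for R^(m x n); flattening them gives m*n independent vectors.
m, n = 2, 3
E = [np.eye(1, m * n, k).reshape(m, n) for k in range(m * n)]
stacked = np.array([Ek.flatten() for Ek in E])
print(np.linalg.matrix_rank(stacked))    # 6 = m * n, the dimension
```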
Theorem 4.5.3¶
If $V$ is a finite dimensional vector space of dimension $n \gt 0$, then
(i) any set of $n$ linearly independent vectors is a basis for $V$, and hence a spanning set of $V$.
(ii) any set of $n$ vectors that spans $V$ is a basis, and therefore linearly independent.
Proof:¶
Let $V$ be a finite dimensional vector space with dimension $n \gt 0$. This implies that there must be a basis, or linearly independent spanning set $\mathfrak{B} = \left\{ \mathbf{v}_1, \dots, \mathbf{v}_n \right\}$ for $V$.
(i) Further let $U = \left\{ \mathbf{u}_1, \dots, \mathbf{u}_n \right\}$ be a set of $n$ linearly independent vectors. For any nonzero vector $\mathbf{w}\in V-U$, the set $U\cup\left\{\mathbf{w}\right\}$ has $n+1$ elements in it. We have a spanning set $\mathfrak{B}$ with $n$ elements, and a set $U\cup\left\{\mathbf{w}\right\}$ with $n+1\gt n$ elements, so by Theorem 4.5.1, $U\cup\left\{\mathbf{w}\right\}$ is a linearly dependent set of vectors. Therefore, there is a nontrivial linear combination of these vectors whose result is the zero vector.
$$ c_1\mathbf{u}_1 + \dots + c_n\mathbf{u}_n + c_{n+1}\mathbf{w} = \mathbf{0} $$
Moreover, $c_{n+1}\neq 0$: if $c_{n+1}=0$, then $c_1\mathbf{u}_1 + \dots + c_n\mathbf{u}_n=\mathbf{0}$, and because $U$ is a linearly independent set this would force $c_1=\dots=c_n=c_{n+1}=0$. This is not possible because the linear combination is nontrivial. Hence
$$ \begin{align*} c_1\mathbf{u}_1 + \dots + c_n\mathbf{u}_n &= -c_{n+1}\mathbf{w} \\ \\ \mathbf{w} &= -\frac{c_1}{c_{n+1}}\mathbf{u}_1 - \dots - \frac{c_n}{c_{n+1}}\mathbf{u}_n \end{align*} $$
This proves that $\mathbf{w}$ is a linear combination of the vectors in $U$, so $\mathbf{w}\in\text{Span}(U)$. The vectors of $U$ and the zero vector are also in $\text{Span}(U)$, so every vector in $V$ is an element of the span of $U$. Thus $V=\text{Span}(U)$ and $U$ is a linearly independent spanning set. In other words, $U$ is a basis for $V$.
(ii) Let $U = \left\{ \mathbf{u}_1, \dots, \mathbf{u}_n \right\}$ be a spanning set of $V$ with exactly $n$ elements. Suppose that set $U$ is linearly dependent. Then at least one vector in $U$ is a linear combination of the others and can be removed without changing the span, so $V = \text{Span}(U) = \text{Span}\left\{ \mathbf{u}_{j_1},\dots,\mathbf{u}_{j_k}\right\}$, where $k\lt n$. However we have a linearly independent set $\mathfrak{B}$ with $n$ elements and a spanning set $\left\{ \mathbf{u}_{j_1},\dots,\mathbf{u}_{j_k}\right\}$ with $k\lt n$ elements. Using the contrapositive of Theorem 4.5.1, $\left\{ \mathbf{u}_{j_1},\dots,\mathbf{u}_{j_k}\right\}$ does not span $V$. Since a set cannot be both a spanning set and not a spanning set, the premise that $U$ is linearly dependent is false. Thus $U$ is linearly independent, and $U$ is a basis for $V$. $\unicode{x220E}$
Proving that every infinite dimensional vector space has a basis, and any two bases for an infinite dimensional vector space are sets with the same cardinality, is beyond the scope of this course. We need to understand, utilize, and remember the following theorems about vector spaces.
Theorem 4.5.4¶
Every vector space has a basis, or linearly independent spanning set. Any two bases for a finite dimensional vector space have the same number of elements. Any two bases for an infinite dimensional vector space have the same cardinality.
When we are working with linear models in applications we will also frequently use the following corollary to Theorem 4.5.4.
Corollary 4.5.5¶
If $ V$ is a vector space, then
(i) any linearly independent set of vectors can be extended to a basis for vector space $V$.
(ii) any spanning set can be reduced by removing vectors to form a basis for vector space $V$.
In part (i) we mean that we can judiciously add linearly independent vectors to a linearly independent set until it spans the vector space. This process yields a basis for $V$.
In part (ii) we mean that we can carefully remove dependent vectors from a spanning set until all that remains is a linearly independent, spanning set. Again this process yields a basis for $V$.
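Both parts of Corollary 4.5.5 can be carried out mechanically with row reduction. A SymPy sketch (the vectors here are our own hypothetical examples, not from the text):

```python
import sympy as sp

# (ii) Reduce a spanning set to a basis: keep only the pivot columns.
S = sp.Matrix([[1, 0, 1, 2],
               [0, 1, 1, 3],
               [1, 1, 2, 5]])          # columns span a subspace of R^3
_, pivots = S.rref()
print(pivots)                          # (0, 1): columns 3 and 4 are redundant
basis = [S[:, j] for j in pivots]

# (i) Extend an independent set to a basis: append e_1, ..., e_n and
# again keep the pivot columns.
v = sp.Matrix([1, 1, 0])               # one independent vector in R^3
aug = v.row_join(sp.eye(3))
_, pivots = aug.rref()
print(pivots)                          # (0, 1, 3): {v, e_1, e_3} is a basis
```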
4.5.5 Infinite Dimensional Vector Spaces¶
Definition:¶
A vector space that cannot be spanned by a finite number of vectors is an infinite dimensional vector space.
In this class we will focus mainly on
- finite dimensional vector spaces,
- infinite dimensional vector spaces with a countable basis, and
- infinite dimensional vector spaces whose elements are limits of a countable complete orthonormal set or Hilbert basis.
We will not discuss at length the differences between cardinals, the difference between the first countable ordinal and its cardinal $\omega$, or the first uncountable ordinal and its cardinal $\Omega$.
A countable set $A$ has the same number of elements as the positive integers. This means there is a bijection $\Phi:\mathbb{Z}^+\longleftrightarrow A$. This bijection yields exactly one element of $A$ for each positive integer; $\Phi(k)\in A$ is unique. $\Phi^{-1}(a)$ is also unique for every element of $A$. We usually denote an element of set $A$ with its pre-image or index
$$ \Phi(k) = a_k $$
In this way we often denote set $A$ as a sequence of elements just like the positive integers
$$ \begin{align*} A &= \left\{\,a_1,\ a_2,\ a_3,\ a_4,\ \dots\,\right\} \\ \\ \mathbb{Z}^+ &= \left\{\,1,\ 2,\ 3,\ 4,\ \dots\,\right\} \end{align*} $$
This allows us to use simple or strong mathematical induction, and avoids transfinite recursion.
Example 4¶
Consider the vector space $P$ of all polynomials with real (or complex) coefficients. Every polynomial is a finite sum of polynomial terms. Recall the finite dimensional vector space $P_n$ of all polynomials of degree less than $n$ for any positive integer $n$. The polynomial
$$ p(x) = a_0x^m + a_1x^{m-1} + a_2x^{m-2} +\ \cdots\ + a_{m-1}x + a_m $$
is a vector in vector space $P_{m+1}$. This polynomial is also an element of every vector space $P_n$ for $n\gt m$ if we prepend $p(x)$ with the appropriate number of $0x^k$ terms for $m\lt k\lt n$,
$$ p(x) \cong
0x^{n-1} + 0x^{n-2} +\ \cdots\ + a_0x^m + a_1x^{m-1} + a_2x^{m-2} +\ \cdots\ + a_{m-1}x + a_m $$
For every non-negative integer $n$, the polynomial $\ x^n$ is an element of vector space $P$. Furthermore the powers of $x$ are linearly independent because no nontrivial finite linear combination of distinct powers of $x$ is the zero polynomial. The set
$$
\mathfrak{B} = \left\{\,1,\ x,\ x^2,\ x^3,\ \dots,\ x^n,\ \dots\,\right\}
$$
is a countably infinite set of linearly independent vectors in vector space $P$ that spans $P$. Vector space $P$ is an infinite dimensional vector space with basis $\mathfrak{B}$.
It is important to remember that every element of vector space $P$ is a finite linear combination of basis vectors in $\mathfrak{B}$.
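Computationally, a polynomial's coordinates with respect to $\mathfrak{B}$ are just its coefficients, and only finitely many of them are nonzero. A SymPy sketch (the polynomial is an arbitrary example):

```python
import sympy as sp

x = sp.symbols('x')

# Coordinates of p with respect to B = {1, x, x^2, ...} are its
# coefficients; every polynomial has only finitely many of them.
p = 3*x**4 - 2*x + 7
coords = sp.Poly(p, x).all_coeffs()[::-1]   # coefficients of 1, x, x^2, ...
print(coords)                               # [7, -2, 0, 0, 3]

# Rebuild p as a finite linear combination of the basis monomials.
rebuilt = sum(c * x**k for k, c in enumerate(coords))
print(sp.expand(rebuilt - p))               # 0
```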
Basis $\mathfrak{B}$ can also be called a maximal linearly independent set. It is referred to as maximally independent because if we add any vector $q\in P-\mathfrak{B}$ (polynomial $q(x)$) to our maximal set $\mathfrak{B}$, then the new set $\mathfrak{B}\cup\left\{q\right\}$ would be linearly dependent.
Definition¶
A maximal linearly independent set of vectors $U$ in a vector space $V$ is a linearly independent set with the property that adding any vector $\mathbf{x}\in V-U$ to $U$ results in a linearly dependent set.
Example 5¶
(a) The set $\left\{\ihat,\ \jhat\right\}$ in $\mathbb{R}^3$ is a linearly independent set, but it is not maximal because we can add $\khat$ and still get a linearly independent set $\left\{\ihat,\ \jhat,\ \khat\right\}$.
(b) The set $\left\{\,\mathbf{e}_1,\ \mathbf{e}_2,\ \mathbf{e}_3,\ \mathbf{e}_4\,\right\}$ is a maximal linearly independent set in $\mathbb{R}^4$.
(c) The set $\mathfrak{B} = \left\{\,1,\ x,\ x^2,\ x^3,\ x^4,\ \dots\,\right\}$ is a maximal linearly independent set in the vector space of polynomials, $P$.
Example 6¶
The vector space of continuous functions on the real interval $\ [a,b]$ contains $P$ because each of its basis vectors is a continuous function on any interval.
Notice that to check if $P$ is a subspace of $C[a,b]$ we need only check to see that every basis vector in $P$ is an element of $C[a,b]$.
Thus $C[a,b]$ is also an infinite dimensional vector space. However, does it have a countable basis? This will require calculus to determine, which should be unsurprising since continuity is a calculus idea. Grant Sanderson demonstrates how to create a Taylor Series for a continuous function defined on a closed and bounded interval.
Theorem 4.5.6¶
Every continuous function defined on a closed and bounded interval $\ [a,b]$, $f:\,[a,b]\rightarrow\mathbb{R}$, has an absolutely convergent power series
$$ f(x) = \displaystyle\lim_{N\rightarrow\infty}\ \displaystyle\sum_{k=0}^{N} a_kx^k := \displaystyle\sum_{k=0}^{\infty} a_kx^k $$
Every continuous function $f:\,[a,b]\rightarrow\mathbb{R}$ also has an absolutely convergent Taylor Series
$$ f(x) = \displaystyle\sum_{k=0}^{\infty} a_k(x-x_0)^k $$
These conclusions are derived from the Stone-Weierstrass Theorem.
Notice that every partial sum
$$ s_n(x) = \displaystyle\sum_{k=0}^n a_kx^k = a_0 + a_1x + \dots + a_nx^n $$
is a polynomial in $P$. Even though $C[a,b]$ is MUCH bigger than $P$, we can use calculus to describe every element of $C[a,b]$ as a limit of elements in $P$. This will be an important part of our discussion of the really important vector spaces.
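To see the partial sums at work, here is a small sketch for $f(x)=e^x$ on $[-1,1]$ (the choice of function, interval, and sample grid are ours):

```python
import math

def taylor_exp(xval, n):
    """n-th Taylor partial sum of e^x about x0 = 0: a polynomial in P."""
    return sum(xval**k / math.factorial(k) for k in range(n + 1))

# The maximum sampled error on [-1, 1] shrinks rapidly as n grows,
# illustrating e^x as a limit of polynomials in P.
for n in (2, 5, 10):
    err = max(abs(taylor_exp(t / 10, n) - math.exp(t / 10))
              for t in range(-10, 11))
    print(n, err)
```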
The polynomials in $P$ are also $d$-times continuously differentiable. Hence the vector space $C^d[a,b]$ is an infinite dimensional vector space for every positive integer $d$. In fact,
$$ P \subset C^{\infty}[a,b] \subset \dots \subset C^d[a,b] \subset \dots \subset C^2[a,b] \subset C^1[a,b] \subset C^0[a,b] = C[a,b] $$
There is no countable basis for the vector spaces $C^{\infty}[a,b]$ or $C^d[a,b]$. Instead we search for a countable, linearly independent set whose span is dense in the topological vector spaces $C^{\infty}[a,b]$ and $C^d[a,b]$.
A linearly independent subset of vectors whose span is dense in a topological vector space $V$ is called a complete orthonormal set or Hilbert basis if the vectors in the subset are mutually orthogonal and each has unit length.
To discuss the concept of the length of a vector and the distance between vectors requires adding geometry to our vector spaces. We will add geometry to vector spaces in chapter 5.
4.5.6 Extending a Basis¶
In the previous section one reduces a matrix $A\in\mathbb{R}^{m\times n}$ (or $\mathbb{C}^{m\times n}$) into upper triangular, row echelon or reduced row echelon form to obtain the pivot columns. The pivot columns of the original matrix $A$ are linearly independent and form a basis for the column space $C(A)$. Then one uses backward substitution to solve for a basis for the null space $N(A)$.
One could repeat this process upon $A^T$ to obtain the row space $C(A^T)$ and the left null space $N(A^T)$. However, is there an easier way to obtain the row space $C(A^T)$?
Each pivot column is also a pivot row. Moreover we utilize row operations, or linear combinations of rows, to obtain the reduced row echelon form of a matrix. These row operations can be combined into an invertible matrix $L$ so that
$$ A = LU $$
where $U$ is the resulting upper triangular matrix.
If the row operations are restricted to type III row operations only, applied in order from the second row to the last and from left to right, then matrix $L$ is a product of type III elementary matrices and is lower triangular. Matrix $L$ is invertible because it is a product of invertible elementary matrices.
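A SymPy sketch of such a factorization (the matrix is an arbitrary example chosen so that no row swaps are needed):

```python
import sympy as sp

A = sp.Matrix([[ 2, 3, 1],
               [ 4, 7, 7],
               [-2, 4, 5]])

# L collects the type III row operations; U is the upper triangular
# form. perm lists any row swaps; it is empty for this matrix.
L, U, perm = A.LUdecomposition()
print(perm)          # []
print(L * U == A)    # True
print(U)             # pivot rows; their transposes span C(A^T)
```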
Theorem 4.5.7¶
An invertible transformation maps a set of linearly independent vectors to a linearly independent set.
Proof:¶
Suppose we have a set of linearly independent vectors $\left\{\,\mathbf{v}_1,\ \dots,\ \mathbf{v}_n\,\right\}$ in vector space $V$, and an $n\times n$ invertible matrix $S$. Then if we have coefficients $c_1$, $\dots$, $c_n$ of the homogeneous linear combination
$$ c_1S\mathbf{v}_1 + \dots + c_nS\mathbf{v}_n = \mathbf{0}, $$
then
$$ \begin{align*} \mathbf{0} &= c_1S\mathbf{v}_1 + \dots + c_nS\mathbf{v}_n = S\left( c_1\mathbf{v}_1 + \dots + c_n\mathbf{v}_n \right) \\ \mathbf{0} &= S^{-1}\mathbf{0} = c_1\mathbf{v}_1 + \dots + c_n\mathbf{v}_n \end{align*} $$
Hence $c_1=\dots=c_n=0$ since $\left\{\,\mathbf{v}_1,\ \dots,\ \mathbf{v}_n\,\right\}$ is a linearly independent set of vectors. Thus $\left\{\,S\mathbf{v}_1,\ \dots,\ S\mathbf{v}_n\,\right\}$ is also a linearly independent set. $\unicode{x220E}$
Corollary 4.5.8¶
The pivot rows in the row echelon form of a matrix $A\in\mathbb{R}^{m\times n}$ (or $\mathbb{C}^{m\times n}$) are the transposes of a set of basis vectors for the row space $C(A^T)$.
This means that we can form a basis for the row space $C(A^T)$ from the transposes of the pivot rows of the
- original matrix $A$,
- upper triangular form,
- row echelon form, or
- reduced row echelon form of matrix $A$.
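A short SymPy sketch of Corollary 4.5.8 in action (the matrix is an arbitrary rank-2 example):

```python
import sympy as sp

A = sp.Matrix([[1, 2, 3],
               [2, 4, 6],
               [1, 1, 1]])

R, pivots = A.rref()
# The nonzero rows of the reduced row echelon form are pivot rows;
# their transposes form a basis for the row space C(A^T).
row_basis = [R.row(i).T for i in range(len(pivots))]
print(row_basis)                     # bases vectors (1, 0, -1) and (0, 1, 2)
print(len(row_basis) == A.rank())    # True
```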
4.5.7 Some Exercises¶
Exercise 1¶
Show that the set of vectors $\left\{\,\begin{bmatrix}\ 0\ \\ \ 4\ \\ -2\ \\ \ 1\ \end{bmatrix},\ \begin{bmatrix} \ 0\ \\ \ 3\ \\ \ 3\ \\ -1\ \end{bmatrix},\ \begin{bmatrix} \ 1\ \\ \ 1\ \\ \ 2\ \\ \ 2\ \end{bmatrix},\ \begin{bmatrix} \ 3\ \\ -1\ \\ \ 2\ \\ \ 2\ \end{bmatrix}\,\right\}$ is a basis for $\mathbb{R}^4$.
Solution¶
We need to show that these four vectors are linearly independent in a four dimensional vector space. Create the $4\times 4$ matrix $A = \begin{bmatrix} \mathbf{u}_1 & \mathbf{u}_2 & \mathbf{u}_3 & \mathbf{u}_4 \end{bmatrix}$ and reduce it to upper triangular form to determine the number of pivot columns.
$$ \begin{align*} A &= \begin{bmatrix} \ \ 0\ &\ \ 0\ &\ \ 1\ &\ \ 3\ \\ \ \ 4\ &\ \ 3\ &\ \ 1\ & -1\ \\ -2\ &\ \ 3\ &\ \ 2\ &\ \ 2\ \\ \ \ 1\ & -1\ &\ \ 2\ &\ \ 2\ \end{bmatrix} \longrightarrow \begin{bmatrix} \ \ 1\ & -1\ &\ \ 2\ &\ \ 2\ \\ \ \ 4\ &\ \ 3\ &\ \ 1\ & -1\ \\ -2\ &\ \ 3\ &\ \ 2\ &\ \ 2\ \\ \ \ 0\ &\ \ 0\ &\ \ 1\ &\ \ 3\ \end{bmatrix} \longrightarrow \begin{bmatrix} \ \ 1\ & -1\ &\ \ 2\ &\ \ 2\ \\ \ \ 0\ &\ \ 7\ & -7\ & -9\ \\ \ \ 0\ &\ \ 1\ &\ \ 6\ &\ \ 6\ \\ \ \ 0\ &\ \ 0\ &\ \ 1\ &\ \ 3\ \end{bmatrix} \\ \\ &\longrightarrow \begin{bmatrix} \ \ 1\ & -1\ &\ \ 2\ &\ \ 2\ \\ \ \ 0\ &\ \ 1\ &\ \ 6\ &\ \ 6\ \\ \ \ 0\ &\ \ 0\ &\ \ 1\ &\ \ 3\ \\ \ \ 0\ &\ \ 0\ & -49\ & -51\ \end{bmatrix} \longrightarrow \begin{bmatrix} \ \ 1\ & -1\ &\ \ 2\ &\ \ 2\ \\ \ \ 0\ &\ \ 1\ &\ \ 6\ &\ \ 6\ \\ \ \ 0\ &\ \ 0\ &\ \ 1\ &\ \ 3\ \\ \ \ 0\ &\ \ 0\ &\ \ 0\ & \ 96\ \end{bmatrix} \end{align*} $$
All four columns of matrix $A$ are pivot columns, so all four column vectors are linearly independent. Therefore the set of vectors is a basis for $\mathbb{R}^4$.
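The pivot count can be confirmed with a one-line rank computation in NumPy:

```python
import numpy as np

# Columns are u1, u2, u3, u4 from Exercise 1.
A = np.array([[ 0,  0, 1,  3],
              [ 4,  3, 1, -1],
              [-2,  3, 2,  2],
              [ 1, -1, 2,  2]])

# Rank 4 means four pivot columns: the vectors form a basis for R^4.
print(np.linalg.matrix_rank(A))   # 4
```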
Exercise 2¶
Show that the set of vectors
$$
\mathbf{x}_1 = \begin{bmatrix}\ 2\ \\ \ 1\ \\ \ 3\ \end{bmatrix},\qquad\mathbf{x}_2 = \begin{bmatrix}\ 1\ \\ \ 3\ \\ \ 2\ \end{bmatrix}
$$
is not a basis for $\mathbb{R}^3$.
Solution¶
The two given vectors are linearly independent, but there are only two of them, and by Corollary 4.5.2 every basis for the three dimensional vector space $\mathbb{R}^3$ has exactly three vectors.
Therefore the set of vectors $\left\{ \mathbf{x}_1, \mathbf{x}_2 \right\}$ is not a basis for $\mathbb{R}^3$ because there are not enough vectors.
Exercise 3¶
Show that the set of vectors
$$
U = \left\{\,\begin{bmatrix} \ 2 \\ \ 1 \\ -1 \end{bmatrix},\ \begin{bmatrix} \ 3 \\ \ 2 \\ \ 0 \end{bmatrix},\ \begin{bmatrix} \ 1 \\ \ 0 \\ -2 \end{bmatrix},\ \begin{bmatrix} -1 \\ -2 \\ -1 \end{bmatrix}\,\right\}
$$
is linearly dependent and find a subset of these vectors that constitute a basis for $\mathbb{R}^3$.
Solution¶
The set $U$ is a set of four vectors in a three dimensional vector space so they are linearly dependent! To distinguish the linearly independent vectors from the linearly dependent vectors form the matrix $A = \begin{bmatrix} \mathbf{u}_1 & \mathbf{u}_2 & \mathbf{u}_3 & \mathbf{u}_4 \end{bmatrix}$ and reduce it to reduced row echelon form.
$$ \begin{align*} A &= \begin{bmatrix} \ \ 2\ &\ \ 3\ &\ \ 1\ & -1\ \\ \ \ 1\ &\ \ 2\ &\ \ 0\ & -2\ \\ -1\ &\ \ 0\ & -2\ & -1\ \end{bmatrix}\longrightarrow \begin{bmatrix} \ \ 1\ &\ \ 1\ &\ \ 1\ &\ \ 1\ \\ \ \ 1\ &\ \ 2\ &\ \ 0\ & -2\ \\ -1\ &\ \ 0\ & -2\ & -1\ \end{bmatrix}\longrightarrow \begin{bmatrix} \ \ 1\ &\ \ 1\ &\ \ 1\ &\ \ 1\ \\ \ \ 0\ &\ \ 1\ & -1\ & -3\ \\ \ \ 0\ &\ \ 1\ & -1\ &\ \ 0\ \end{bmatrix} \\ \\ &\longrightarrow \begin{bmatrix} \ \ 1\ &\ \ 1\ &\ \ 1\ &\ \ 1\ \\ \ \ 0\ &\ \ 1\ & -1\ & -3\ \\ \ \ 0\ &\ \ 0\ &\ \ 0\ &\ \ 3\ \end{bmatrix}\longrightarrow \begin{bmatrix} \ \ 1\ &\ \ 1\ &\ \ 1\ &\ \ 0\ \\ \ \ 0\ &\ \ 1\ & -1\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ &\ \ 0\ &\ \ 1\ \end{bmatrix}\longrightarrow \begin{bmatrix} \ \ 1\ &\ \ 0\ &\ \ 2\ &\ \ 0\ \\ \ \ 0\ &\ \ 1\ & -1\ &\ \ 0\ \\ \ \ 0\ &\ \ 0\ &\ \ 0\ &\ \ 1\ \end{bmatrix} \end{align*} $$
Columns one, two and four are pivot columns and column three is a free column, so vectors $\mathbf{u}_1$, $\mathbf{u}_2$, $\mathbf{u}_4$ are linearly independent. Moreover, the reduced row echelon form of matrix $A$ shows that $\mathbf{u}_3 = 2\mathbf{u}_1 - \mathbf{u}_2$.
Since the pivot columns are three linearly independent vectors in a three dimensional vector space the set
$$ \left\{\,\begin{bmatrix} \ 2 \\ \ 1 \\ -1 \end{bmatrix},\ \begin{bmatrix} \ 3 \\ \ 2 \\ \ 0 \end{bmatrix},\ \begin{bmatrix} -1 \\ -2 \\ -1 \end{bmatrix}\,\right\} $$
is a basis for vector space $\mathbb{R}^3$.
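The pivot columns and the dependency $\mathbf{u}_3 = 2\mathbf{u}_1 - \mathbf{u}_2$ can be confirmed with SymPy:

```python
import sympy as sp

# Columns are u1, u2, u3, u4 from Exercise 3.
A = sp.Matrix([[ 2, 3,  1, -1],
               [ 1, 2,  0, -2],
               [-1, 0, -2, -1]])

R, pivots = A.rref()
print(pivots)                              # (0, 1, 3): u1, u2, u4 are pivots
print(A[:, 2] == 2*A[:, 0] - A[:, 1])      # True: u3 = 2*u1 - u2
```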
Your use of this self-initiated mediated course material is subject to our Creative Commons License 4.0