In this section we learn a quick way of finding a basis of a subspace in $\mathbb{R}^n$, which can also be used as a time-efficient test for linear independence. The trick is to use matrices instead of systems of linear equations (which should remind you of MATH105).
The following definitions consider the rows and columns of a matrix as vectors in $\mathbb{R}^n$ (for the appropriate $n$), by interpreting the entries as coordinates with respect to the standard basis.
Let $A$ be an $m \times n$ matrix.
The row space of $A$ is the subspace in $\mathbb{R}^n$ spanned by its rows.
The column space of $A$ is the subspace in $\mathbb{R}^m$ spanned by its columns.
Consider the matrix $A$. Then the row space of $A$ is
The column space of $A$ is
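These definitions are easy to explore by machine. Below is a minimal sketch using SymPy on a hypothetical $2 \times 3$ matrix $B$ (chosen purely for illustration, not the matrix of the example above): rowspace() returns a basis of the row space and columnspace() a basis of the column space.

```python
from sympy import Matrix

# Hypothetical 2x3 matrix, used only to illustrate the definitions.
B = Matrix([[1, 2, 3],
            [0, 1, 4]])

# A basis of the row space: vectors in R^3.
print(B.rowspace())     # [Matrix([[1, 2, 3]]), Matrix([[0, 1, 4]])]

# A basis of the column space: vectors in R^2.
print(B.columnspace())  # [Matrix([[1], [0]]), Matrix([[2], [1]])]
```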
Let $A$ be an $m \times n$ matrix. Then
Row operations don't change the row space of $A$,
Column operations don't change the column space of $A$.
One proves this by taking each type of e.r.o. separately, applying a general e.r.o. of that type to a general matrix. One then checks that each new row (after the e.r.o.) is a linear combination of the old rows (before the e.r.o.), so the new row space is contained in the old one; since every e.r.o. can be undone by an e.r.o. of the same type, the reverse inclusion also holds, and the two row spaces are equal. Argue similarly for e.c.o.'s. ∎
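Theorem 2.48 can also be checked on concrete (hypothetical) input. The sketch below applies a single e.r.o. to a made-up matrix $A$ and compares reduced row echelon forms: matrices of the same shape have equal row spaces exactly when their reduced row echelon forms coincide.

```python
from sympy import Matrix

# Hypothetical 3x3 matrix.
A = Matrix([[1, 2, 1],
            [2, 0, 3],
            [1, 1, 1]])

# Apply one e.r.o.: add 5 times row 0 to row 2 (rows are 0-indexed here).
B = A.copy()
B[2, :] = B[2, :] + 5 * B[0, :]

# Matrices of the same shape have equal row spaces exactly when their
# reduced row echelon forms coincide.
print(A.rref()[0] == B.rref()[0])   # True
```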
Let $A$ be an $m \times n$ matrix, and assume $E$ is an echelon form of $A$ (see Definition 1.9(iii)). Then the non-zero rows of $E$ form a basis for the row space of $A$.
We obtain $E$ from $A$ through a sequence of e.r.o.'s, so by Theorem 2.48, $A$ and $E$ must have equal row spaces. Therefore the non-zero rows of $E$ span the row space of $A$. To prove they form a basis, we need to prove linear independence.
Let $\mathbf{r}_1, \dots, \mathbf{r}_k$ be the non-zero rows of $E$, and assume $\lambda_1 \mathbf{r}_1 + \dots + \lambda_k \mathbf{r}_k = \mathbf{0}$. Since $E$ is in echelon form, the coordinate in which $\mathbf{r}_1$ has its left-most non-zero entry is zero in all the other $\mathbf{r}_i$; comparing this coordinate on both sides gives $\lambda_1 = 0$. Similarly, since $E$ is in echelon form, the coordinate of the left-most non-zero entry of $\mathbf{r}_2$ is zero in all the other $\mathbf{r}_i$ with $i > 2$; hence, using $\lambda_1 = 0$, we get $\lambda_2 = 0$. Continuing in this way (i.e. by induction), we see $\lambda_i = 0$ for all $i$. Therefore the sequence $\mathbf{r}_1, \dots, \mathbf{r}_k$ is linearly independent. ∎
Notice that the matrix $E$ in the above Theorem does not need to be in reduced row echelon form; so there are multiple correct bases, since different echelon forms of $A$ can give different bases of the same row space.
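This remark can be seen computationally. In the sketch below (with a hypothetical rank-2 matrix $A$), an unreduced echelon form and the reduced row echelon form have different non-zero rows, yet both sets of non-zero rows are bases of the same row space.

```python
from sympy import Matrix

# Hypothetical 3x3 matrix of rank 2.
A = Matrix([[1, 2, 3],
            [2, 5, 7],
            [1, 3, 4]])

E = A.echelon_form()   # an echelon form (pivots not necessarily 1)
R = A.rref()[0]        # the reduced row echelon form

# The non-zero rows of E and of R differ, yet both are bases
# of the same row space of A.
print(E)
print(R)
print(E.rref()[0] == R.rref()[0])   # True: equal row spaces
```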
Find a basis for the row space of .
Solution 1: The reduced row echelon form is . By Theorem 2.49 a basis for this subspace is given by its two non-zero rows; in particular, the row space is two-dimensional.
Solution 2: We could instead have used the algorithm from Theorem 2.36, but it takes a bit longer. That procedure yields the first two rows of the matrix as a basis of the row space.
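The exact algorithm of Theorem 2.36 is not restated here, but a sifting-style procedure, which keeps a vector only when it is not a linear combination of the vectors already kept, does produce a basis consisting of members of the original sequence, as in Solution 2. The sketch below is one possible implementation under that assumption, run on hypothetical input vectors.

```python
from sympy import Matrix

def sift_basis(vectors):
    """Keep a vector only if it is not a linear combination of the vectors
    kept so far (detected by a rank computation); the kept vectors form a
    basis of the span and are members of the original sequence."""
    kept = []
    for v in vectors:
        if Matrix(kept + [v]).rank() == len(kept) + 1:
            kept.append(v)
    return kept

# Hypothetical vectors in R^3; the third is the sum of the first two.
print(sift_basis([[1, 0, 2], [0, 1, 1], [1, 1, 3], [0, 0, 1]]))
# -> [[1, 0, 2], [0, 1, 1], [0, 0, 1]], a basis made of original vectors
```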
Let $\mathbf{v}_1, \dots, \mathbf{v}_k$ be a sequence of vectors in $\mathbb{R}^n$. Let $A$ be the matrix whose rows are the vectors in the sequence, and let $E$ be an echelon form of $A$. The sequence is linearly independent if and only if $E$ has no zero rows.
This follows directly from Theorem 2.49: the non-zero rows of $E$ form a basis of the row space of $A$, so $E$ has no zero rows exactly when that row space has dimension $k$, which happens exactly when the $k$ spanning vectors $\mathbf{v}_1, \dots, \mathbf{v}_k$ are linearly independent. ∎
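This corollary is the time-efficient test promised at the start of the section, and it translates directly into a computation: stack the vectors as rows, row reduce, and check for zero rows, or equivalently compare the rank with the number of vectors. A minimal sketch with hypothetical vectors in $\mathbb{R}^3$:

```python
from sympy import Matrix

def is_linearly_independent(vectors):
    """Row vectors are linearly independent exactly when an echelon form
    of the matrix they fill has no zero rows, i.e. when the rank equals
    the number of vectors."""
    A = Matrix(vectors)
    return A.rank() == len(vectors)

# Hypothetical vectors in R^3.
print(is_linearly_independent([[1, 0, 2], [0, 1, 1], [1, 1, 4]]))  # True
print(is_linearly_independent([[1, 0, 2], [0, 1, 1], [1, 1, 3]]))  # False
```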
Is the following sequence of vectors linearly independent in ?
Solution: Form the matrix of row vectors, and row reduce it.
There is a zero row, so by Theorem 2.52 the original sequence is linearly dependent. Another way to see this is to use Theorem 2.49, which shows the subspace spanned by these 4 vectors is only 3-dimensional; therefore they must be linearly dependent.
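The same computation can be run on any concrete input. Here is a sketch with a hypothetical dependent sequence of four vectors in $\mathbb{R}^4$ (not the vectors of the example above), showing both the zero row and the 3-dimensional span.

```python
from sympy import Matrix

# Hypothetical sequence of four vectors in R^4; the fourth is the sum of
# the first two, so the sequence is linearly dependent.
A = Matrix([[1, 0, 2, 1],
            [0, 1, 1, 3],
            [2, 1, 0, 0],
            [1, 1, 3, 4]])

print(A.rref()[0])   # the bottom row of the reduced row echelon form is zero
print(A.rank())      # 3: the span is only 3-dimensional
```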