You should be familiar with writing any vector in $\mathbb{R}^n$ as a linear combination of the standard basis vectors $e_1, \dots, e_n$, defined in Exercise 2.22(ii). For example, we can write any vector $(a_1, \dots, a_n)$ in $\mathbb{R}^n$ as $a_1 e_1 + \cdots + a_n e_n$. Since there is exactly one way to write every vector in $\mathbb{R}^n$ as a linear combination of the sequence $(e_1, \dots, e_n)$, this sequence forms a basis.
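This decomposition can be checked numerically. A minimal sketch in Python/NumPy, using an illustrative vector of our own choosing (the text's specific example vector is not reproduced here):

```python
import numpy as np

# Standard basis vectors of R^3.
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
e3 = np.array([0.0, 0.0, 1.0])

# An illustrative vector; its coordinates with respect to the
# standard basis are simply its entries.
v = np.array([2.0, -1.0, 3.0])
reconstruction = v[0] * e1 + v[1] * e2 + v[2] * e3

assert np.allclose(reconstruction, v)
```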
Let $(v_1, \dots, v_n)$ be a finite sequence of vectors in a vector space $V$ over a field $F$. We say $(v_1, \dots, v_n)$ forms a basis (plural: bases) of $V$ when it spans $V$ and is linearly independent.
Find 3 different bases of .
Find a basis of such that all coordinates of all the vectors are non-zero.
[End of Exercise]
A sequence $(v_1, \dots, v_n)$ of vectors in $V$ forms a basis if and only if every vector $v \in V$ can be written uniquely as a linear combination of the vectors in the sequence.
In Example 2.21(i), we proved that
are three linearly independent vectors in , and so they form a basis (see Theorem 2.38). Therefore we should be able to write any vector, such as , as a linear combination of these vectors in a unique way. Assume
for some . Then we obtain a system of equations:
Solving these produces the unique solution , , and . Therefore
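Finding such coefficients amounts to solving a square linear system; when the matrix whose columns are the basis vectors is invertible, the solution, and hence the expansion, is unique. A sketch with hypothetical basis vectors, since the example's actual vectors are not reproduced here:

```python
import numpy as np

# Hypothetical linearly independent vectors in R^3 (not the
# vectors from the example above, which are omitted here).
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([1.0, 1.0, 0.0])
v3 = np.array([0.0, 1.0, 1.0])

# Columns of B are the basis vectors; B is invertible, so the
# coefficient vector c with B @ c = u exists and is unique.
B = np.column_stack([v1, v2, v3])
u = np.array([2.0, 3.0, 1.0])
c = np.linalg.solve(B, u)

assert np.allclose(c[0] * v1 + c[1] * v2 + c[2] * v3, u)
```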
How many ways (if any) can you express as a linear combination of the three vectors and ?
Prove Theorem 2.26.
[End of Exercise]
The complex numbers $\mathbb{C}$ form a vector space over $\mathbb{R}$. A basis of this vector space is given by $1$ and $i$. This is because $1$ and $i$ span the complex numbers (we can always write complex numbers as $a + bi$ with $a, b \in \mathbb{R}$), and they are linearly independent, because if $a \cdot 1 + b \cdot i = 0$ for $a, b \in \mathbb{R}$, then $a = b = 0$.
When $\mathbb{C}$ is considered as a vector space over the field $\mathbb{C}$, the vectors $1$ and $i$ are linearly dependent (so they don’t form a basis). This is because $a \cdot 1 + b \cdot i = 0$ has non-trivial solutions for $a, b \in \mathbb{C}$. For example, $a = 1$ and $b = i$, since $1 \cdot 1 + i \cdot i = 1 - 1 = 0$.
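The dependence over $\mathbb{C}$ can be checked directly with Python's built-in complex numbers (a small sanity check, not part of the original text):

```python
# Over C, the coefficients a = 1 and b = i give a non-trivial
# linear combination of the "vectors" 1 and i that equals zero:
a, b = 1, 1j
combination = a * 1 + b * 1j   # 1 + i^2 = 1 - 1
assert combination == 0

# Over R we may only use real a, b; then a*1 + b*i = a + bi,
# which is zero only when the real part a and the imaginary
# part b are both zero -- so 1 and i are independent over R.
```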
Prove that and together form a basis of , viewed as a vector space over .
Find all complex numbers such that and together form a basis of , viewed as a vector space over .
Prove that the polynomials form a basis of .
[End of Exercise]
If a vector space $V$ over $F$ has a basis with $n$ elements, then we say $V$ has dimension $n$. We also write $\dim V = n$ (or even $\dim_F V = n$, if we want to emphasize the field). We will say that the zero vector space has dimension zero.
We need to use some caution here, because one might ask: can a vector space have two different bases, with different numbers of elements? If so, then the above definition doesn’t make any sense. Fortunately we have the following theorem (which, logically, should go before the above definition).
If $V$ has two bases $(v_1, \dots, v_n)$ and $(w_1, \dots, w_m)$, then $n = m$.
The argument of the following proof actually demonstrates something stronger: if a set of size $n$ spans a vector space, then any linearly independent set has at most $n$ elements. If a vector space has a basis with a finite number of elements, then we say it is finite-dimensional. Otherwise, it is infinite-dimensional.
Assume we have two such bases. Then we can write each of the elements $w_1, \dots, w_m$ as a linear combination of the basis $(v_1, \dots, v_n)$:
$$w_j = a_{1j} v_1 + a_{2j} v_2 + \cdots + a_{nj} v_n$$
for each $j = 1, \dots, m$, with coefficients $a_{ij} \in F$.
Since we have assumed linear independence of the sequence $(w_1, \dots, w_m)$, there are no non-trivial solutions $c_1, \dots, c_m \in F$ which satisfy the following equation:
$$c_1 w_1 + c_2 w_2 + \cdots + c_m w_m = 0.$$
Substituting the expressions for the $w_j$ above, the left-hand side becomes
$$\sum_{j=1}^{m} c_j \left( \sum_{i=1}^{n} a_{ij} v_i \right).$$
By V9 (and V8) this is equal to
$$\sum_{j=1}^{m} \sum_{i=1}^{n} (c_j a_{ij}) v_i,$$
which by V3 is equal to
$$\sum_{i=1}^{n} \sum_{j=1}^{m} (c_j a_{ij}) v_i,$$
which by V10 is equal to
$$\sum_{i=1}^{n} \left( \sum_{j=1}^{m} a_{ij} c_j \right) v_i.$$
Since the sequence of $v_i$’s is linearly independent, this equation implies $\sum_{j=1}^{m} a_{ij} c_j = 0$ for each $i = 1, \dots, n$. Since the numbers $a_{ij}$ are fixed, this is a system of $n$ (linear homogeneous) equations in the $m$ unknowns $c_1, \dots, c_m$. The only way such a system can have no non-trivial solutions is to have $m \leq n$. This is because if there are more variables than equations, one can always set one of the variables as a parameter, and still find a solution. This was seen in MATH105.
The same argument with the roles of $(v_1, \dots, v_n)$ and $(w_1, \dots, w_m)$ reversed proves $n \leq m$. Therefore $m \leq n$ and $n \leq m$, and hence $n = m$. ∎
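The key fact used above, that a homogeneous linear system with more unknowns than equations always has a non-trivial solution, can be illustrated numerically. A sketch using the singular value decomposition to produce a null-space vector (an illustration of the fact, not the MATH105 method):

```python
import numpy as np

# Two homogeneous equations in three unknowns: A @ x = 0.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# The rows of Vt are orthonormal; since rank(A) <= 2 < 3, the
# last right-singular vector lies in the null space of A.
_, _, Vt = np.linalg.svd(A)
x = Vt[-1]

assert np.allclose(A @ x, 0, atol=1e-10)   # x solves the system
assert np.linalg.norm(x) > 0.5             # and x is non-trivial
```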
Subspaces of $\mathbb{R}^2$ either have dimension 0 (the zero subspace), dimension 1 (a straight line through the origin), or dimension 2 (all of $\mathbb{R}^2$).
Subspaces of $\mathbb{R}^3$ either have dimension 0, 1, 2, or 3. Dimension-2 subspaces are always planes through the origin.
$\mathbb{C}$ is a 2-dimensional real vector space; it has a basis $(1, i)$ over the field $\mathbb{R}$.
Find a basis for each of the following vector spaces.
as a vector space over the field .
as a vector space over the field .
as a vector space over the field .
as a vector space over the field .
as a vector space over the field .
If $V$ is a complex vector space, make a guess about how $\dim_{\mathbb{R}} V$ compares to $\dim_{\mathbb{C}} V$.
[End of Exercise]
Let $V$ be a finite-dimensional vector space over a field $F$, and assume we have a set of vectors that spans $V$. Then there is a sequence of vectors from that set which forms a basis of $V$.
Let’s construct a sequence as follows:
Step 1: Choose any non-zero vector from the spanning set, and add it to the sequence.
Step 2: If the span of the sequence so far is all of $V$, then the algorithm ends; otherwise proceed to Step 3.
Step 3: Choose a vector from the spanning set which is not in the span of the sequence so far (such a vector exists, because the set spans $V$ while the sequence so far does not). Add it to the sequence; the resulting sequence is still linearly independent. Return to Step 2.
Since $V$ is finite-dimensional, this algorithm must terminate: each pass through Step 3 lengthens a linearly independent sequence, and such a sequence can have at most $\dim V$ elements. The resulting sequence is linearly independent and spans $V$. ∎
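For lists of coordinate vectors, the algorithm in this proof can be sketched numerically: a vector is kept only when it is not already in the span of the vectors kept so far, which can be tested with a rank computation. (The use of `numpy.linalg.matrix_rank` is our own illustrative choice, not part of the original proof.)

```python
import numpy as np

def extract_basis(spanning_vectors):
    """Greedy sketch of the proof's algorithm: a vector is added only
    if it lies outside the span of those already kept, i.e. if adding
    it keeps the collected sequence linearly independent."""
    basis = []
    for v in spanning_vectors:
        candidate = basis + [v]
        # Full rank <=> the candidate sequence is linearly independent.
        if np.linalg.matrix_rank(np.array(candidate)) == len(candidate):
            basis = candidate
    return basis

# A spanning set of R^2 containing a redundant vector.
S = [np.array([1.0, 0.0]), np.array([2.0, 0.0]), np.array([0.0, 1.0])]
B = extract_basis(S)

assert len(B) == 2                              # a basis of R^2
assert np.linalg.matrix_rank(np.array(B)) == 2  # independent and spanning
```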
[Technical aside: This argument doesn’t work for infinite-dimensional vector spaces. One way of generalizing the term “basis” to infinite-dimensional vector spaces is as an infinite set of linearly independent elements whose set of (finite) linear combinations is the entire vector space. Then the proof that every infinite-dimensional vector space has a basis requires the Axiom of Choice, which is accepted by most mathematicians. A different way of generalizing the term “basis” is used in MATH317.]
The following is a consequence of the algorithm used in the proof of Theorem 2.36.
If $(v_1, \dots, v_k)$ is a linearly independent sequence in a finite-dimensional vector space $V$, then it can be extended to a basis of $V$. In other words, we can find vectors $v_{k+1}, \dots, v_n$ such that $(v_1, \dots, v_n)$ is a basis of $V$.
Combining the above facts, we obtain the following theorem, which gives a convenient condition for a sequence of vectors to be a basis.
Let $v_1, \dots, v_n$ be vectors in an $n$-dimensional vector space $V$.
If $(v_1, \dots, v_n)$ is a linearly independent sequence, then it is a basis of $V$.
If $(v_1, \dots, v_n)$ spans $V$, then it forms a basis of $V$.
Assume $(v_1, \dots, v_n)$ is a linearly independent sequence. By Corollary 2.37, we can find vectors $v_{n+1}, \dots, v_m$ in $V$ such that $(v_1, \dots, v_m)$ is a basis of $V$.
But the statement of the Theorem assumes the dimension of $V$ is equal to $n$, and so every basis has $n$ elements (Theorem 2.33). In particular, $m = n$, so no vectors were added. This means that the original sequence was a basis to begin with, which is what we wanted to prove.
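Theorem 2.38 is convenient in practice: for $n$ vectors in $\mathbb{R}^n$, checking linear independence (for instance, via a non-zero determinant) is already enough to conclude they form a basis. A sketch with illustrative vectors of our own choosing:

```python
import numpy as np

# Three vectors in the 3-dimensional space R^3 (illustrative choice),
# stored as the rows of M.
M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])

# Non-zero determinant <=> the rows are linearly independent; by
# Theorem 2.38 they therefore form a basis, so they also span R^3.
assert abs(np.linalg.det(M)) > 1e-9

# Spanning in action: any u is a (unique) combination of the rows.
u = np.array([5.0, -2.0, 7.0])
c = np.linalg.solve(M.T, u)   # coefficients with respect to the rows
assert np.allclose(c @ M, u)
```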
Let . Is there a basis of consisting of some subset of these vectors?
Solution: Applying the algorithm of Theorem 2.36, take into the sequence. Next, since is not a linear combination of , add it as well. Next, is ? If it is, then for some . When we expand this expression, we obtain a system of equations. Solving that system gives and . Hence . So discard .
Finally, is ? Assume are such that
We get the following equations: and and By solving that system of equations we see there are no solutions, and therefore is linearly independent, and by Theorem 2.38, it must form a basis of .
Prove that the polynomials , , , , form a basis of .
Extend each of the following linearly independent sequences to a basis of the vector space by adding vectors to the sequence.
in .
.
The polynomials and in .
[End of Exercise]