Given a matrix $A$ and an eigenvalue $\lambda$ of $A$, in the previous section we put a lot of effort into finding a basis for each of the generalized eigenspaces $G_j(\lambda) = \ker\big((A - \lambda I)^j\big)$ of $A$. They are subspaces, and by Theorem 6.21 there is a number $m$ such that:
$$G_1(\lambda) \subseteq G_2(\lambda) \subseteq \cdots \subseteq G_m(\lambda) = G_{m+1}(\lambda) = \cdots$$
An important observation is that if we pick a vector in one of these subspaces, repeatedly multiplying that vector by the matrix $A - \lambda I$ moves it along these subspaces from the right to the left, creating a “chain” of vectors. In other words:
If $\vec{v} \in G_j(\lambda)$, for some $j \geq 1$, then $(A - \lambda I)\vec{v} \in G_{j-1}(\lambda)$.
This holds because $\vec{v} \in G_j(\lambda)$ means that $(A - \lambda I)^j \vec{v} = \vec{0}$, by definition of the generalized eigenspace. This implies that $(A - \lambda I)^{j-1}\big((A - \lambda I)\vec{v}\big) = \vec{0}$, so $(A - \lambda I)\vec{v} \in G_{j-1}(\lambda)$, which is what we wanted to prove.
Notice that this formula still works with $j = 1$, because we have that $G_0(\lambda) = \{\vec{0}\}$, the zero subspace. ∎
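This chain-shifting behaviour is easy to check numerically. Below is a minimal sketch using sympy with a small hypothetical matrix (an assumption for illustration, not a matrix from the text): for $\lambda = 2$, a vector killed by $(A - 2I)^2$ but not by $A - 2I$ is pushed one step down the chain, landing on an eigenvector.

```python
from sympy import Matrix, eye, zeros

# Hypothetical 2x2 example with eigenvalue 2 (not a matrix from the text).
A = Matrix([[2, 1],
            [0, 2]])
N = A - 2 * eye(2)           # the matrix A - lambda*I

v = Matrix([0, 1])           # v lies in G_2 but not G_1:
assert (N**2) * v == zeros(2, 1)   # (A - 2I)^2 v = 0
assert N * v != zeros(2, 1)        # (A - 2I) v != 0

w = N * v                    # multiplying moves v from G_2 into G_1,
assert N * w == zeros(2, 1)  # so w is an eigenvector for lambda = 2
```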
Let .
Prove that and that .
Find a non-zero vector which is in but is not in .
Prove that .
[End of Exercise]
In the following definition, notice that a “Jordan chain of length 1 for $\lambda$” is exactly the same thing as an “eigenvector for $\lambda$”.
Recall that if $X \subseteq Y$ are sets, then $\vec{v} \in Y \setminus X$ means that $\vec{v} \in Y$ but $\vec{v} \notin X$.
Given a square matrix $A$ and an eigenvalue $\lambda$, a sequence of vectors $\vec{v}_1, \vec{v}_2, \ldots, \vec{v}_k$ is called a Jordan chain of length $k$ for $\lambda$ if:
$\vec{v}_k \in G_k(\lambda) \setminus G_{k-1}(\lambda)$, and
$\vec{v}_j = (A - \lambda I)\vec{v}_{j+1}$, for every $1 \leq j \leq k-1$.
A Jordan basis (for $A$) is a basis of the whole space which consists only of Jordan chains (for different eigenvalues, in general).
Any Jordan chain, $\vec{v}_1, \ldots, \vec{v}_k$, must obey $A\vec{v}_1 = \lambda\vec{v}_1$. This is because $\vec{v}_1 \in G_1(\lambda)$ is another way of saying $(A - \lambda I)\vec{v}_1 = \vec{0}$.
Let . You may use that
Find a Jordan chain of length 3.
Solution: First pick an element of $G_3(\lambda) \setminus G_2(\lambda)$; call it $\vec{v}_3$. Then define $\vec{v}_2 = (A - \lambda I)\vec{v}_3$ and $\vec{v}_1 = (A - \lambda I)\vec{v}_2$. Then the sequence $\vec{v}_1, \vec{v}_2, \vec{v}_3$ is a Jordan chain of length 3. In fact this sequence is a Jordan basis, since it is a basis of the whole space and it is made up of Jordan chains (in this case, just one).
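The recipe in this solution (pick a top vector, then repeatedly apply $A - \lambda I$) can be sketched numerically. The $3 \times 3$ matrix below, with single eigenvalue 2, is a hypothetical stand-in for illustration, not the matrix of the example.

```python
from sympy import Matrix, eye, zeros

# Hypothetical matrix with single eigenvalue 2 (not the example's matrix).
A = Matrix([[2, 1, 0],
            [0, 2, 1],
            [0, 0, 2]])
N = A - 2 * eye(3)

v3 = Matrix([0, 0, 1])       # chosen in G_3 but not G_2: (A - 2I)^2 v3 != 0
assert (N**2) * v3 != zeros(3, 1)

v2 = N * v3                  # define v2 = (A - 2I) v3
v1 = N * v2                  # define v1 = (A - 2I) v2
assert A * v1 == 2 * v1      # the chain ends on an eigenvector

# The chain v1, v2, v3 is also a basis of the whole space:
assert Matrix.hstack(v1, v2, v3).det() != 0
```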
Let $A$ be as in Example 6.23. Find a Jordan basis.
Solution: First we find the generalized eigenspaces for each eigenvalue.
: All of the generalized eigenspaces are 1-dimensional and are spanned by the vector . In particular, this eigenvector forms a Jordan chain of length 1 for the eigenvalue , and no longer chains are possible.
: Based on our computation of the dimensions of the generalized eigenspaces, a Jordan chain for has length at most 2. Let’s choose a vector in . One such vector is . Then . So is a Jordan chain of length 2.
Now forms a basis of , so our search ends. In other words, we have found a Jordan basis for ; it is the union of two Jordan chains.
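A search of this shape can be carried out by hand on a small matrix with two eigenvalues. The matrix and chains below are illustrative assumptions, not those of the example: eigenvalue 1 contributes a chain of length 1, and eigenvalue 2 a chain of length 2.

```python
from sympy import Matrix, eye, zeros

# Hypothetical matrix with eigenvalues 1 and 2 (not the example's matrix).
A = Matrix([[1, 0, 0],
            [0, 2, 1],
            [0, 0, 2]])

u1 = Matrix([1, 0, 0])               # length-1 Jordan chain for eigenvalue 1
assert A * u1 == 1 * u1

N = A - 2 * eye(3)
v2 = Matrix([0, 0, 1])               # in G_2(2) but not G_1(2)
assert N * v2 != zeros(3, 1) and (N**2) * v2 == zeros(3, 1)
v1 = N * v2                          # v1, v2 is a length-2 chain for 2
assert A * v1 == 2 * v1

# The union of the two chains is a Jordan basis: it spans the whole space.
assert Matrix.hstack(u1, v1, v2).det() != 0
```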
Let’s see how the matrix $A$ from Example 6.30 looks in the new Jordan basis.
If $T$ is the associated linear transformation, then we want to compute the matrix of $T$ with respect to the Jordan basis. Using the method from Section 4.A we compute:
This calculation shows that
Alternatively, one could do a much longer calculation by using the change of basis matrix $P$ from the Jordan basis to the standard basis:
Then we need to compute the inverse of $P$, and finally verify that $P^{-1} A P$ gives the same matrix as above.
The resulting matrix is not diagonal, but it is as close as we can get to diagonalizing. In the next section we will see that this matrix is in Jordan normal form.
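The change-of-basis check can be sketched as follows, using a hypothetical matrix and a hypothetical Jordan basis assembled into the columns of $P$ (assumptions for illustration, not the data of the example). The result $P^{-1}AP$ is not diagonal, but has a single 1 above the diagonal.

```python
from sympy import Matrix

# Hypothetical example (not the matrix from the text).  The columns of P
# form a Jordan basis for A: a chain of length 1 for eigenvalue 1,
# followed by a chain of length 2 for eigenvalue 2.
A = Matrix([[1, 1, 0],
            [0, 2, 1],
            [0, 0, 2]])
P = Matrix([[1, 1, 0],
            [0, 1, 1],
            [0, 0, 1]])

J = P.inv() * A * P          # the matrix of A in the Jordan basis
assert J == Matrix([[1, 0, 0],
                    [0, 2, 1],     # not diagonal: one 1 above the diagonal
                    [0, 0, 2]])
```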
Find a Jordan basis for (see also Exercise 6.18).
Find a Jordan basis for (see also Exercise 6.27).
[End of Exercise]
We can’t always find a basis of eigenvectors, but in the above examples we were able to find a basis of Jordan chains. The remarkable thing about these Jordan bases, and the reason why this method is a worthwhile generalization of diagonalization, is that they always exist:
For any matrix $A$, there is a Jordan basis for $A$.
In other words, there is always a basis of the whole space consisting of Jordan chains for $A$.
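Computer algebra systems can produce such a basis directly. For instance, sympy's `jordan_form` method returns a pair $(P, J)$ where the columns of $P$ form a Jordan basis and $J = P^{-1}AP$; the matrix below is a hypothetical illustration.

```python
from sympy import Matrix

# Hypothetical example matrix (not one from the text).
A = Matrix([[1, 1, 0],
            [0, 2, 1],
            [0, 0, 2]])

P, J = A.jordan_form()       # columns of P are Jordan chains for A
assert A == P * J * P.inv()  # J is the matrix of A in that basis
```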
At the end of the next section is an algorithm for finding a Jordan basis.
Find a Jordan basis for .
[End of Exercise]