From this section onward, we will usually consider matrices and vector spaces over the field of complex numbers, $\mathbb{C}$. The reason for doing so is the Fundamental Theorem of Algebra, which has two key consequences:
Every monic polynomial of degree $n$ is the product of $n$ linear factors $(z - \lambda_1)(z - \lambda_2)\cdots(z - \lambda_n)$,
Every complex matrix has at least one eigenvalue.
We have seen several examples of matrices which are not diagonalizable; in other words, for which $\mathbb{C}^n$ does not have a basis consisting of eigenvectors. One strategy for dealing with this obstacle is the following: instead of considering only spaces of eigenvectors, we will consider a generalization of eigenvectors.
Let $A$ be a square complex matrix, and let $\lambda$ be an eigenvalue of $A$. Then the generalized eigenspace of index $k$ is $\ker\!\left((A - \lambda I)^k\right)$.
Non-zero elements of $\ker\!\left((A - \lambda I)^k\right)$ are called generalized eigenvectors of $A$.
For $k = 0$, we have $\ker\!\left((A - \lambda I)^0\right) = \ker(I) = \{0\}$.
For $k = 1$, we have $\ker(A - \lambda I)$, which is the usual eigenspace of $\lambda$.
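To make the definition concrete, here is a short computational sketch using SymPy. The $3 \times 3$ matrix below is a hypothetical example chosen for illustration (it is not a matrix from this text): it has a single eigenvalue, and the dimensions of its generalized eigenspaces grow with the index $k$.

```python
import sympy as sp

# Hypothetical example (not a matrix from the text): a 3x3 Jordan
# block with single eigenvalue 2, to illustrate the definition.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 1],
               [0, 0, 2]])
lam = 2
I = sp.eye(A.rows)

# dim ker((A - lam*I)^k) for k = 0, 1, 2, 3
dims = [len(((A - lam * I) ** k).nullspace()) for k in range(4)]
print(dims)  # [0, 1, 2, 3]
```

For $k = 0$ the kernel is $\{0\}$, for $k = 1$ it is the ordinary eigenspace, and here each further power picks up one more dimension.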
Let . Its only eigenvalue is . Prove that:
and
If , and is an eigenvalue, prove that .
[End of Exercise]
Let’s find the generalized eigenspaces with respect to the eigenvalue , for the following matrices in :
(Solution:) For , let’s compute the powers of the matrix .
So, for any , multiplying by more copies will still give a matrix with zeros everywhere except the lower right entry. The generalized eigenspaces are the kernels of these matrices. So
for all
Similarly for the matrix , we compute the powers of the matrix :
.
So, for any , multiplying by more copies will still give the zero matrix, so . Now we find the kernels of the above matrices:
for all .
So, the dimensions of the generalized eigenspaces of , for the eigenvalue , are , while the dimensions of the generalized eigenspaces of , for the eigenvalue , are .
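The contrast in this example can be reproduced computationally. The two $2 \times 2$ matrices below are hypothetical stand-ins (the matrices of the example are not reproduced here): one is diagonalizable, so its generalized eigenspace dimensions stabilize immediately, while the other is not, so the dimensions grow before stabilizing.

```python
import sympy as sp

# Hypothetical 2x2 stand-ins with eigenvalue 3 (not the matrices
# from the example above).
A = sp.Matrix([[3, 0], [0, 5]])   # diagonalizable
B = sp.Matrix([[3, 1], [0, 3]])   # not diagonalizable

def gen_eigenspace_dims(M, lam, kmax):
    """Dimensions of ker((M - lam*I)^k) for k = 1, ..., kmax."""
    I = sp.eye(M.rows)
    return [len(((M - lam * I) ** k).nullspace()) for k in range(1, kmax + 1)]

print(gen_eigenspace_dims(A, 3, 4))  # [1, 1, 1, 1]: stabilizes at once
print(gen_eigenspace_dims(B, 3, 4))  # [1, 2, 2, 2]: grows, then stabilizes
```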
The pattern in the previous example is that the dimensions of the generalized eigenspaces increase up to a certain index, after which they stabilize. This is an example of a general phenomenon, stated in the following theorem.
Let $B$ be any square matrix, and let $m \geq 0$.
$\ker(B^m) = \ker(B^{m+1})$ implies $\ker(B^k) = \ker(B^m)$ for all $k \geq m$.
We already know that $\ker(B^{m+1}) \subseteq \ker(B^{m+2})$; we just need the reverse containment. Assume $v \in \ker(B^{m+2})$, in other words, $B^{m+2}v = 0$. This implies $B^{m+1}(Bv) = 0$, which means $Bv \in \ker(B^{m+1})$. Using the assumption of the theorem, $Bv \in \ker(B^m)$, in other words, $B^m(Bv) = 0$. This is the same as saying $B^{m+1}v = 0$, that is, $v \in \ker(B^{m+1})$. So we have proved $\ker(B^{m+1}) = \ker(B^{m+2})$. Using the same argument for the next exponent, and the next, etc., we have proved the theorem. This argument could also be worded using the language of induction. ∎
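The theorem can be checked numerically on a hypothetical example (the matrix below is chosen for illustration and does not come from the text): once two consecutive kernel dimensions agree, all later ones agree as well.

```python
import sympy as sp

# Hypothetical 4x4 matrix used to check the stabilization statement:
# once ker(B^m) = ker(B^(m+1)), all higher powers have the same kernel.
B = sp.Matrix([[0, 1, 0, 0],
               [0, 0, 0, 0],
               [0, 0, 3, 0],
               [0, 0, 0, 3]])

dims = [len((B ** k).nullspace()) for k in range(7)]
print(dims)  # [0, 1, 2, 2, 2, 2, 2]: stabilization at m = 2

# First index where two consecutive dimensions agree ...
m = next(k for k in range(len(dims) - 1) if dims[k] == dims[k + 1])
# ... after which every dimension equals dims[m].
assert all(d == dims[m] for d in dims[m:])
```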
Let $\lambda$ be an eigenvalue of $A$. If $\ker\!\left((A - \lambda I)^m\right) = \ker\!\left((A - \lambda I)^{m+1}\right)$, then all generalized eigenspaces $\ker\!\left((A - \lambda I)^k\right)$ with $k \geq m$ are equal to each other.
This is just Theorem 6.21 applied to the case when $B = A - \lambda I$. ∎
Let . For each eigenvalue of , find a basis for each generalized eigenspace of .
(Solution:) First, we determine the eigenvalues via the characteristic polynomial.
So the eigenvalues are and .
: We compute the dimensions of the generalized eigenspaces as follows:
Since is an eigenvalue, we have , and so . But two of its rows are clearly linearly independent, and so by Theorem 4.25. This proves that . By the dimension theorem, , in other words .
This shows two rows of are linearly independent, and we can use that to see that . So . By the dimension theorem again, . Now by Theorem 6.21, all of the generalized eigenspaces for are equal to each other:
for all . So, in this case, a basis for each generalized eigenspace is the vector .
: The computation is similar to the previous case:
This matrix has rank 2, and so , by the dimension theorem. Moreover, the eigenspace is
To determine the next generalized eigenspace, we compute:
Observe that the rows are scalar multiples of each other, and therefore the row space is 1-dimensional; in other words . So by the dimension theorem, . We can express the generalized eigenspace as the span of 2 vectors as follows:
The next generalized eigenspace is the kernel of the following matrix:
So for every . In particular,
for all .
Since the sequence is linearly independent (it consists of two vectors which are not multiples of each other) and spans this generalized eigenspace, it forms a basis.
To summarize: the generalized eigenspaces for the eigenvalue are of dimension , and they each have a basis . The generalized eigenspaces for the eigenvalue are of dimension , and the first one has a basis , while each of the others has a basis .
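The basis computations above can be carried out mechanically. The sketch below uses a hypothetical stand-in matrix with the same qualitative behavior as the worked example (one simple eigenvalue and one defective eigenvalue); for each eigenvalue, a basis of the stabilized generalized eigenspace is read off from the null space of a sufficiently high power.

```python
import sympy as sp

# Hypothetical stand-in: eigenvalue 1 (defective, multiplicity 2) and
# eigenvalue 2 (simple); not the matrix from the worked example.
A = sp.Matrix([[1, 1, 0],
               [0, 1, 0],
               [0, 0, 2]])
n = A.rows

# ker((A - lam*I)^n) is the fully stabilized generalized eigenspace:
# the dimensions strictly increase until they stabilize, so the
# stabilization index is at most n.
bases = {lam: ((A - lam * sp.eye(n)) ** n).nullspace() for lam in A.eigenvals()}
for lam in sorted(bases):
    print(lam, [list(v) for v in bases[lam]])
```

Here the defective eigenvalue contributes a two-dimensional generalized eigenspace even though its ordinary eigenspace is one-dimensional.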
For each of the matrices in Exercise 6.15, in , and for each eigenvalue , compute the dimensions and find a basis of each generalized eigenspace for .
[End of Exercise]
In all of the above exercises and examples, the observant reader may have noticed that generalized eigenvectors for different eigenvalues are always linearly independent. This is always true (as stated in the following theorem), and the proof uses an induction argument which we omit.
Let , and assume are generalized eigenvectors for different eigenvalues. Then are linearly independent.
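As a computational sanity check of the theorem (again on a hypothetical example matrix, not one from the text), we can stack generalized eigenvectors for two distinct eigenvalues side by side and verify that the resulting matrix has full column rank.

```python
import sympy as sp

# Hypothetical example: eigenvalue 1 (defective) and eigenvalue 2.
A = sp.Matrix([[1, 1, 0],
               [0, 1, 0],
               [0, 0, 2]])

# Generalized eigenvectors for eigenvalue 1 (index 2) and eigenvalue 2.
gen_for_1 = ((A - sp.eye(3)) ** 2).nullspace()
gen_for_2 = (A - 2 * sp.eye(3)).nullspace()

# Full column rank means the combined list is linearly independent.
M = sp.Matrix.hstack(*(gen_for_1 + gen_for_2))
print(M.rank())  # 3
```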
As we will see, the dimensions of the generalized eigenspaces will be used to deduce the Jordan normal form; no other information is needed. But if you want to find a specific change-of-basis matrix $P$ such that $P^{-1}AP$ is in Jordan normal form, this is equivalent to finding a Jordan basis, which is the purpose of the next section.
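As a preview of what the next section constructs by hand, SymPy's built-in `Matrix.jordan_form` produces such a change-of-basis matrix directly (the matrix below is a hypothetical example): it returns $P$ and $J$ with $A = PJP^{-1}$ and $J$ in Jordan normal form.

```python
import sympy as sp

# Hypothetical example matrix (not from the text).
A = sp.Matrix([[1, 1, 0],
               [0, 1, 0],
               [0, 0, 2]])

# P is a change-of-basis (Jordan basis) matrix; J is the Jordan form.
P, J = A.jordan_form()  # satisfies A = P * J * P**(-1)
print(J)
```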