$\lambda_1$ is not an eigenvalue of $T|_U$ because the eigenvalues of $T|_U$ are eigenvalues of $T$ whose eigenvectors lie in $U$, and an eigenvector of $T|_U$ for $\lambda_1$ would be a nonzero vector in $G(\lambda_1, T) \cap U = \{0\}$. So the eigenvalues of $T|_U$ are among $\lambda_2, \dots, \lambda_m$.
Using the inductive hypothesis on the vector space $U$ and the operator $T|_U$:
\[
U = G(\lambda_2, T|_U) \oplus \cdots \oplus G(\lambda_m, T|_U).
\]
If we can show that $G(\lambda_k, T|_U) = G(\lambda_k, T)$ for all $k = 2, \dots, m$, then we are done, since substituting into $V = G(\lambda_1, T) \oplus U$ gives the claimed decomposition of $V$.
We know $G(\lambda_k, T|_U) \subseteq G(\lambda_k, T)$ because if you're in the left set, you're in $U \subseteq V$, so applying $T|_U$ to those vectors is the same as applying $T$: a vector killed by a power of $T|_U - \lambda_k I$ is killed by the same power of $T - \lambda_k I$, and thus we are done.
For $\supseteq$, suppose $v \in G(\lambda_k, T)$ and write $v = v_1 + u$ where $v_1 \in G(\lambda_1, T)$ and $u \in U$, because $V$ has its own decomposition $V = G(\lambda_1, T) \oplus U$. Also (from the inductive hypothesis):
\[
u = u_2 + \cdots + u_m,
\]
where each $u_j \in G(\lambda_j, T|_U)$ and thus $u_j \in G(\lambda_j, T)$ (the $\subseteq$ case). We have:
\[
v = v_1 + u_2 + \cdots + u_m,
\]
where now each $u_j \in G(\lambda_j, T)$ (notice the change from $T|_U$ to $T$). Nonzero vectors from the $G(\lambda_j, T)$'s are linearly independent, since they are all generalized eigenvectors for different eigenvalues. Rearranging, $v_1 + u_2 + \cdots + (u_k - v) + \cdots + u_m = 0$ expresses $0$ as a sum of vectors taken from the distinct generalized eigenspaces, so every term must be zero. Thus for all $j \neq k$ we have $u_j = 0$, and also $v_1 = 0$. This shows $v = u_k \in G(\lambda_k, T|_U)$, and thus we are done.
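Before moving on to (b), here is a small concrete example of the decomposition in (a) (added as a sanity check; the matrix is my own choice, not from the original notes). Take $T \in \mathcal{L}(\mathbb{C}^3)$ given by
\[
A = \begin{pmatrix} 2 & 1 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 5 \end{pmatrix},
\]
with eigenvalues $2$ and $5$. Then $(A - 2I)^3$ has null space $\operatorname{span}(e_1, e_2)$ and $(A - 5I)^3$ has null space $\operatorname{span}(e_3)$, so
\[
\mathbb{C}^3 = G(2, T) \oplus G(5, T) = \operatorname{span}(e_1, e_2) \oplus \operatorname{span}(e_3),
\]
even though the ordinary eigenspaces $\operatorname{null}(A - 2I) = \operatorname{span}(e_1)$ and $\operatorname{null}(A - 5I) = \operatorname{span}(e_3)$ only span a $2$-dimensional subspace (so $T$ is not diagonalizable).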
(b) Using the definition of $G(\lambda_j, T)$:
\[
G(\lambda_j, T) = \operatorname{null}(T - \lambda_j I)^{\dim V},
\]
where the right side is the null space of a polynomial operator $p(T)$ (with $p(z) = (z - \lambda_j)^{\dim V}$), so it is $T$-invariant.
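For completeness, here is a one-line check (added here, not in the original notes) of why the null space of a polynomial operator is $T$-invariant: $p(T)$ commutes with $T$, so if $p(T)v = 0$ then
\[
p(T)(Tv) = T\,p(T)v = T(0) = 0,
\]
i.e.\ $Tv \in \operatorname{null} p(T)$.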
(c) Clear by definition, because if $v \in G(\lambda_j, T)$ then $(T - \lambda_j I)^{\dim V} v = 0$, so then:
\[
\left( (T - \lambda_j I)|_{G(\lambda_j, T)} \right)^{\dim V} = 0,
\]
showing nilpotency. Notice that this does not mean that $T - \lambda_j I$ itself is nilpotent on all of $V$; the restriction to $G(\lambda_j, T)$ is important. \end{proof}
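To make the remark about the restriction concrete (again an added example, reusing the $3 \times 3$ matrix $A$ from above): on $G(2, T) = \operatorname{span}(e_1, e_2)$ the restriction $(T - 2I)|_{G(2, T)}$ has matrix $\begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$, whose square is $0$, so it is nilpotent. But $T - 2I$ on all of $\mathbb{C}^3$ is not nilpotent, since $(T - 2I)e_3 = 3e_3$ and hence $(T - 2I)^k e_3 = 3^k e_3 \neq 0$ for every $k$.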
multiplicity of an eigenvalue
For $T \in \mathcal{L}(V)$, the multiplicity of an eigenvalue $\lambda$ of $T$ is $\dim G(\lambda, T) = \dim \operatorname{null}(T - \lambda I)^{\dim V}$.
Note
We've talked about algebraic and geometric multiplicity of an eigenvalue in Linear I. The definition above is the algebraic version. The geometric one is defined as:
\[
\dim \operatorname{null}(T - \lambda I),
\]
the dimension of the ordinary eigenspace of $\lambda$.
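A quick example of the difference (added here; the matrix is my own choice): for $A = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix}$ acting on $\mathbb{C}^2$, the geometric multiplicity of the eigenvalue $2$ is $\dim \operatorname{null}(A - 2I) = 1$, while the multiplicity in the sense defined above is $\dim \operatorname{null}(A - 2I)^2 = \dim \mathbb{C}^2 = 2$.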
Fact
If $V$ is a finite-dimensional complex vector space and $T \in \mathcal{L}(V)$, then the sum of the multiplicities of all the eigenvalues of $T$ equals $\dim V$.
The proof of this comes from (a) of the main theorem we looked at, using the fact that the dimensions of the summands in a direct sum add up.
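Checking this against the $3 \times 3$ example above (an added illustration): the multiplicities are $\dim G(2, T) = 2$ and $\dim G(5, T) = 1$, and indeed
\[
2 + 1 = 3 = \dim \mathbb{C}^3.
\]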