Find the characteristic polynomial and the minimal polynomial of the operator $N$, where $\mathcal{M}(N)$ is the given strictly upper-triangular $n \times n$ matrix.
\begin{proof}
Since the matrix is upper triangular, all of the eigenvalues appear on the diagonal. Every diagonal entry is $0$, so $0$ is the only eigenvalue of $N$, and furthermore, by HW 8 - Operator Decomposition, Characteristic and Minimal Polynomials#11, the multiplicity of each eigenvalue is precisely the number of times it appears on the diagonal. Thus $0$ is an eigenvalue of multiplicity $n$, and the characteristic polynomial is
\[
q(z) = z^n.
\]
For the minimal polynomial we have the choices $z, z^2, \dots, z^n$ (it must divide the characteristic polynomial), but we know that $z^k$, where $k$ is the smallest power with
\[
N^k = 0,
\]
is the only viable option, since $N$ is nilpotent with $N^{k-1} \neq 0$ while $N^k = 0$. Computing the powers of the matrix of $N$ directly verifies which $k$ this is.
Thus, the minimal polynomial is $p(z) = z^k$. \end{proof}
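As a concrete illustration of the computation (the matrix here is hypothetical; any strictly upper-triangular matrix behaves the same way), consider
\[
A = \begin{pmatrix} 0 & 1 & 1 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{pmatrix}, \qquad
A^2 = \begin{pmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}, \qquad
A^3 = 0.
\]
Here the characteristic polynomial is $z^3$, and since $A^2 \neq 0$ while $A^3 = 0$, the minimal polynomial is also $z^3$.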
2
Question
Find the characteristic polynomial and the minimal polynomial of the operator $T$, where $\mathcal{M}(T)$ is the given upper-triangular matrix.
\begin{proof}
Using the same reasoning about eigenvalues mentioned above, the characteristic polynomial can be read off the diagonal of the matrix; with $\lambda$ the repeated diagonal entry,
\[
q(z) = (z - \lambda)^n.
\]
However, for the minimal polynomial we have to go through and compute $(T - \lambda I)^j$ for all $j = 1, \dots, n$ and see the lowest power that holds up, i.e. the smallest $k$ with
\[
(T - \lambda I)^k = 0.
\]
Thus this implies that the minimal polynomial is $p(z) = (z - \lambda)^k$. \end{proof}
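For example (again with a hypothetical matrix, since only the procedure matters here), take
\[
B = \begin{pmatrix} 2 & 1 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 2 \end{pmatrix}, \qquad
B - 2I = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix} \neq 0, \qquad
(B - 2I)^2 = 0,
\]
so the characteristic polynomial is $(z - 2)^3$ while the minimal polynomial is only $(z - 2)^2$: the lowest power that vanishes wins.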
3
Question
Suppose $N \in \mathcal{L}(V)$ is nilpotent. Prove that the minimal polynomial of $N$ is $z^{m+1}$, where $m$ is the length of the longest consecutive string of $1$'s that appears on the line directly above the diagonal in the matrix of $N$ with respect to any Jordan basis for $V$.
\begin{proof}
Let $v_1, \dots, v_n$ be any Jordan basis for $N$, so:
\[
\mathcal{M}(N) = \begin{pmatrix} A_1 & & \\ & \ddots & \\ & & A_p \end{pmatrix},
\]
where each $A_j$ is an $(m_j + 1) \times (m_j + 1)$ block with $0$'s on the diagonal and a string of $m_j$ consecutive $1$'s directly above it.
Now, because we are considering the basis $v_1, \dots, v_n$, the length of each string of consecutive $1$'s is one of the $m_j$'s in the equation above. That's because if we consider the rightmost vector in each "cycle" of the basis, then we start at some vector $v$, and because of the form of the $1$'s on the upper diagonal of $A_j$, applying $N$ walks down the cycle:
\[
v \;\mapsto\; Nv \;\mapsto\; N^2 v \;\mapsto\; \cdots \;\mapsto\; N^{m_j} v.
\]
Notice then that:
\[
N^{m_j} v \neq 0.
\]
Now if we apply $N$ once more, the block structure implies:
\[
N^{m_j + 1} v = 0.
\]
Now if $m$ is the longest length of a string of consecutive $1$'s on the upper diagonal, applying this logic to every block shows that $m$ is the largest of the $m_j$'s. Hence $m = \max_j m_j$. In that case, then:
\[
m = m_{j_0}
\]
for some selected $j_0$ where $1 \le j_0 \le p$. It's important to note that by definition here, $m_j \le m$ for every $j$.
We know that the minimal polynomial cannot be $z^\ell$ with $\ell \le m$, because I can choose the vector $v$ from the cycle of block $j_0$ up above, where by definition:
\[
N^m v = N^{m_{j_0}} v \neq 0.
\]
So that eliminates the options $z, z^2, \dots, z^m$. Now all we have to do is check that $N^{m+1} = 0$, by checking that $N^{m+1} v_i = 0$ for all $i$. First, keep in mind that any $u \in V$ can be written $u = a_1 v_1 + \cdots + a_n v_n$ by the definition of a basis, using $v_1, \dots, v_n$ as that basis. Therefore:
\[
N^{m+1} u = a_1 N^{m+1} v_1 + \cdots + a_n N^{m+1} v_n = 0,
\]
since each $v_i$ lies in a cycle of length $m_j + 1 \le m + 1$. So since $u$ was arbitrary, $N^{m+1} = 0$, showing that the minimal polynomial of $N$ is $z^{m+1}$, as desired. \end{proof}
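As a sanity check (with a hypothetical matrix), take a nilpotent $N$ whose Jordan form has blocks of sizes $3$ and $2$:
\[
\mathcal{M}(N) = \begin{pmatrix}
0 & 1 & 0 & 0 & 0 \\
0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 \\
0 & 0 & 0 & 0 & 0
\end{pmatrix}.
\]
The longest string of consecutive $1$'s above the diagonal has length $m = 2$ (from the $3 \times 3$ block), and indeed $N^2 \neq 0$ while $N^3 = 0$, so the minimal polynomial is $z^{m+1} = z^3$.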
4
Question
Suppose $T \in \mathcal{L}(V)$ and $v_1, \dots, v_n$ is a basis of $V$ that is a Jordan basis for $T$. Describe the matrix of $T$ with respect to the basis obtained by reversing the order of the $v_i$'s.
\begin{proof}
For clarity, let's first consider this for a basis $v_1, \dots, v_n$ and an arbitrary matrix $A = \mathcal{M}(T)$. By the way we construct our matrix, we have the relationship that
\[
\mathcal{M}(T) = A
\]
is equivalent to saying:
\[
T v_k = A_{1,k}\, v_1 + A_{2,k}\, v_2 + \cdots + A_{n,k}\, v_n \qquad (1 \le k \le n),
\]
so notice that if I swap $v_1 \leftrightarrow v_n$, $v_2 \leftrightarrow v_{n-1}$, and so on for all the vectors in the basis, creating a new basis $w_k = v_{n+1-k}$, then this is equivalent to starting with the bottom equation for our matrix, then working our way up, with the coefficients in each equation also listed in reverse:
\[
T w_k = A_{n,\,n+1-k}\, w_1 + A_{n-1,\,n+1-k}\, w_2 + \cdots + A_{1,\,n+1-k}\, w_n,
\]
which is equivalent to having the matrix $B$ with entries:
\[
B_{j,k} = A_{\,n+1-j,\;n+1-k}.
\]
Notice though that this is just rotating the matrix $180^\circ$!
Therefore, to describe this for our operator $T$, we start with the Jordan form of the matrix:
\[
\mathcal{M}(T) = \begin{pmatrix} A_1 & & \\ & \ddots & \\ & & A_p \end{pmatrix},
\]
where each $A_j$ follows:
\[
A_j = \begin{pmatrix}
\lambda_j & 1 & & \\
 & \ddots & \ddots & \\
 & & \ddots & 1 \\
 & & & \lambda_j
\end{pmatrix}.
\]
Thus, using the reversed basis $v_n, \dots, v_1$, we get:
\[
\mathcal{M}\big(T, (v_n, \dots, v_1)\big) = \begin{pmatrix} A_p' & & \\ & \ddots & \\ & & A_1' \end{pmatrix},
\]
where each $A_j'$ is just the $180^\circ$ rotation of the block described above, with the $1$'s now directly below the diagonal:
\[
A_j' = \begin{pmatrix}
\lambda_j & & & \\
1 & \ddots & & \\
 & \ddots & \ddots & \\
 & & 1 & \lambda_j
\end{pmatrix}.
\]
\end{proof}
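For a quick concrete check (hypothetical numbers), take a single $3 \times 3$ Jordan block with eigenvalue $5$. With respect to $v_1, v_2, v_3$ and to the reversed basis $v_3, v_2, v_1$:
\[
\mathcal{M}\big(T, (v_1, v_2, v_3)\big) = \begin{pmatrix} 5 & 1 & 0 \\ 0 & 5 & 1 \\ 0 & 0 & 5 \end{pmatrix},
\qquad
\mathcal{M}\big(T, (v_3, v_2, v_1)\big) = \begin{pmatrix} 5 & 0 & 0 \\ 1 & 5 & 0 \\ 0 & 1 & 5 \end{pmatrix},
\]
and the second matrix is exactly the $180^\circ$ rotation of the first.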
5
Question
Suppose $T \in \mathcal{L}(V)$ and $v_1, \dots, v_n$ is a basis of $V$ that is a Jordan basis for $T$. Describe the matrix of $T^2$ with respect to this basis.
\begin{proof}
If $A$ is the matrix of $T$ for the basis $v_1, \dots, v_n$ then, by the definition of a Jordan basis,
\[
T v_k = \lambda_k v_k + \varepsilon_k v_{k-1}
\]
for some $\varepsilon_k \in \{0, 1\}$, where $\lambda_k$ is the diagonal entry of the block containing $v_k$ and $\varepsilon_k = 1$ exactly when $v_{k-1}$ lies in the same block (so that $\lambda_{k-1} = \lambda_k$). But notice that:
\[
T^2 v_k = T(\lambda_k v_k + \varepsilon_k v_{k-1}) = \lambda_k^2 v_k + 2\lambda_k \varepsilon_k v_{k-1} + \varepsilon_k \varepsilon_{k-1} v_{k-2}.
\]
Therefore we can derive the matrix version for $T^2$: it is block diagonal, with each Jordan block $A_j$ (eigenvalue $\lambda_j$) replaced by its square,
\[
A_j^2 = \begin{pmatrix}
\lambda_j^2 & 2\lambda_j & 1 & & \\
 & \lambda_j^2 & 2\lambda_j & \ddots & \\
 & & \ddots & \ddots & 1 \\
 & & & \lambda_j^2 & 2\lambda_j \\
 & & & & \lambda_j^2
\end{pmatrix}.
\]
Therefore, each block of $\mathcal{M}(T^2)$ is of the form of the matrix above: $\lambda_j^2$ on the diagonal, $2\lambda_j$ on the first superdiagonal, and $1$'s on the second superdiagonal. \end{proof}
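To verify with a small hypothetical block, square a $3 \times 3$ Jordan block with eigenvalue $2$:
\[
\begin{pmatrix} 2 & 1 & 0 \\ 0 & 2 & 1 \\ 0 & 0 & 2 \end{pmatrix}^2
= \begin{pmatrix} 4 & 4 & 1 \\ 0 & 4 & 4 \\ 0 & 0 & 4 \end{pmatrix},
\]
which indeed has $\lambda^2 = 4$ on the diagonal, $2\lambda = 4$ above it, and a $1$ on the second superdiagonal.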
9.A: Complexification
2
Question
Verify that if $V$ is a real vector space and $T \in \mathcal{L}(V)$, then $T_{\mathbb{C}} \in \mathcal{L}(V_{\mathbb{C}})$.
\begin{proof}
(additivity): Let $u_1 + iv_1,\, u_2 + iv_2 \in V_{\mathbb{C}}$ (so $u_1, v_1, u_2, v_2 \in V$). Then:
\[
\begin{aligned}
T_{\mathbb{C}}\big((u_1 + iv_1) + (u_2 + iv_2)\big)
&= T_{\mathbb{C}}\big((u_1 + u_2) + i(v_1 + v_2)\big) \\
&= T(u_1 + u_2) + iT(v_1 + v_2) \\
&= (Tu_1 + iTv_1) + (Tu_2 + iTv_2) \\
&= T_{\mathbb{C}}(u_1 + iv_1) + T_{\mathbb{C}}(u_2 + iv_2).
\end{aligned}
\]
(homogeneity): Let $\lambda = a + bi \in \mathbb{C}$ and $u + iv \in V_{\mathbb{C}}$. Then:
\[
\begin{aligned}
T_{\mathbb{C}}\big(\lambda(u + iv)\big)
&= T_{\mathbb{C}}\big((au - bv) + i(av + bu)\big) \\
&= T(au - bv) + iT(av + bu) \\
&= (aTu - bTv) + i(aTv + bTu) \\
&= (a + bi)(Tu + iTv) = \lambda\, T_{\mathbb{C}}(u + iv).
\end{aligned}
\]
Thus $T_{\mathbb{C}} \in \mathcal{L}(V_{\mathbb{C}})$. \end{proof}
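A concrete way to see the complex linearity in action (a hypothetical example): if $V = \mathbb{R}^2$ and $T$ has matrix $A$ with respect to the standard basis, then $T_{\mathbb{C}}$ acts on $\mathbb{C}^2$ by the same matrix. For instance, with
\[
A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}, \qquad
T_{\mathbb{C}}\begin{pmatrix} 1 \\ -i \end{pmatrix} = \begin{pmatrix} i \\ 1 \end{pmatrix} = i \begin{pmatrix} 1 \\ -i \end{pmatrix},
\]
so $T_{\mathbb{C}}$ is a genuinely complex-linear operator, here with eigenvalue $i$ even though $T$ itself has no eigenvalues.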
3
Question
Suppose $V$ is a real vector space and $v_1, \dots, v_m \in V$. Prove that $v_1, \dots, v_m$ is LI in $V_{\mathbb{C}}$ iff $v_1, \dots, v_m$ is LI in $V$.
\begin{proof}
($\Rightarrow$): Suppose $v_1, \dots, v_m$ is LI in $V_{\mathbb{C}}$, meaning that
\[
\lambda_1 v_1 + \cdots + \lambda_m v_m = 0 \implies \lambda_1 = \cdots = \lambda_m = 0,
\]
where $\lambda_1, \dots, \lambda_m \in \mathbb{C}$. Consider when $c_1, \dots, c_m \in \mathbb{R}$ satisfy:
\[
c_1 v_1 + \cdots + c_m v_m = 0.
\]
Since each $c_j$ equals $c_j + 0i \in \mathbb{C}$, the same equation holds in $V_{\mathbb{C}}$, so since we have LI in $V_{\mathbb{C}}$ (see above), $c_j = 0$ for all $j$, showing LI in $V$.
($\Leftarrow$): Suppose $v_1, \dots, v_m$ is LI in $V$, so
\[
c_1 v_1 + \cdots + c_m v_m = 0 \implies c_1 = \cdots = c_m = 0
\]
for constants $c_1, \dots, c_m \in \mathbb{R}$. Consider the sum:
\[
\lambda_1 v_1 + \cdots + \lambda_m v_m = 0
\]
for $\lambda_1, \dots, \lambda_m \in \mathbb{C}$. Clearly:
\[
\lambda_j = a_j + i b_j,
\]
where $a_j, b_j \in \mathbb{R}$. Expanding the sum:
\[
(a_1 v_1 + \cdots + a_m v_m) + i\,(b_1 v_1 + \cdots + b_m v_m) = 0.
\]
We can equate just the real and imaginary components to get:
\[
a_1 v_1 + \cdots + a_m v_m = 0 \qquad \text{and} \qquad b_1 v_1 + \cdots + b_m v_m = 0.
\]
Thus, by LI in $V$, $a_j = b_j = 0$ for all $j$, so $\lambda_j = 0$ for all $j$, showing LI in $V_{\mathbb{C}}$. \end{proof}
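For a tiny concrete instance (hypothetical numbers): with $v_1 = (1, 0)$ and $v_2 = (0, 1)$ in $\mathbb{R}^2$, a complex relation splits into two real ones:
\[
(a_1 + ib_1)\,v_1 + (a_2 + ib_2)\,v_2 = 0
\;\implies\;
(a_1, a_2) = (0, 0) \;\text{ and }\; (b_1, b_2) = (0, 0),
\]
which is exactly the real-and-imaginary-parts bookkeeping used in the proof.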
4
Question
Suppose $V$ is a real vector space and $v_1, \dots, v_m \in V$. Prove that $v_1, \dots, v_m$ spans $V_{\mathbb{C}}$ iff $v_1, \dots, v_m$ spans $V$.
\begin{proof}
($\Leftarrow$): Suppose $v_1, \dots, v_m$ spans $V$. Given any $u + iw \in V_{\mathbb{C}}$, write $u = a_1 v_1 + \cdots + a_m v_m$ and $w = b_1 v_1 + \cdots + b_m v_m$ with real coefficients; then, using the same real-and-imaginary-parts decomposition as in HW 9 - Jordon Form, Complexification#3,
\[
u + iw = (a_1 + ib_1)\,v_1 + \cdots + (a_m + ib_m)\,v_m,
\]
so $v_1, \dots, v_m$ spans $V_{\mathbb{C}}$.
($\Rightarrow$): Suppose $v_1, \dots, v_m$ spans $V_{\mathbb{C}}$. Given any $u \in V$, write $u = u + i0 = (a_1 + ib_1)\,v_1 + \cdots + (a_m + ib_m)\,v_m$; equating real parts gives $u = a_1 v_1 + \cdots + a_m v_m$, so $v_1, \dots, v_m$ spans $V$. \end{proof}
5
Question
Suppose that $V$ is a real vector space and $S, T \in \mathcal{L}(V)$. Show that $(S + T)_{\mathbb{C}} = S_{\mathbb{C}} + T_{\mathbb{C}}$ and $(\lambda T)_{\mathbb{C}} = \lambda\, T_{\mathbb{C}}$ for all $\lambda \in \mathbb{R}$.
\begin{proof}
Let $u + iv \in V_{\mathbb{C}}$ be arbitrary, so $u, v \in V$. Notice:
\[
(S + T)_{\mathbb{C}}(u + iv) = (S + T)u + i(S + T)v = (Su + iSv) + (Tu + iTv) = S_{\mathbb{C}}(u + iv) + T_{\mathbb{C}}(u + iv).
\]
Since $u + iv$ was arbitrary, $(S + T)_{\mathbb{C}} = S_{\mathbb{C}} + T_{\mathbb{C}}$ immediately follows.
Similarly, for $\lambda \in \mathbb{R}$:
\[
(\lambda T)_{\mathbb{C}}(u + iv) = \lambda T u + i\,\lambda T v = \lambda(Tu + iTv) = \lambda\, T_{\mathbb{C}}(u + iv),
\]
thus $(\lambda T)_{\mathbb{C}} = \lambda\, T_{\mathbb{C}}$ since $u + iv$ was arbitrary. \end{proof}
6
Question
Suppose $V$ is a real vector space and $T \in \mathcal{L}(V)$. Prove that $T_{\mathbb{C}}$ is invertible iff $T$ is invertible.
\begin{proof}
We'll prove the equivalent statement: that $T_{\mathbb{C}}$ is not invertible iff $T$ is not invertible.
$T$ is not invertible iff $0$ is an eigenvalue of $T$. Since $T$ and $T_{\mathbb{C}}$ have exactly the same real eigenvalues, that holds iff $0$ is an eigenvalue of $T_{\mathbb{C}}$, iff $T_{\mathbb{C}}$ is not invertible. \end{proof}
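As a quick sanity check (reusing the hypothetical rotation example from above):
\[
\mathcal{M}(T) = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}
\]
is invertible, and the eigenvalues of $T_{\mathbb{C}}$ are $\pm i$, both nonzero, so $T_{\mathbb{C}}$ is invertible as well, matching the claim.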
7
Question
Suppose $V$ is a real vector space and $N \in \mathcal{L}(V)$. Prove that $N$ is nilpotent iff $N_{\mathbb{C}}$ is nilpotent.
\begin{proof}
($\Rightarrow$): $N$ is nilpotent means $N^k = 0$ for some $k$. Consider an arbitrary $u + iv \in V_{\mathbb{C}}$, so then:
\[
(N_{\mathbb{C}})^k (u + iv) = N^k u + i\,N^k v = 0 + i0 = 0,
\]
so since $u + iv$ was arbitrary, $(N_{\mathbb{C}})^k = 0$ and hence $N_{\mathbb{C}}$ is nilpotent.
($\Leftarrow$): $N_{\mathbb{C}}$ is nilpotent means $(N_{\mathbb{C}})^k = 0$ for some $k$. Consider an arbitrary $u \in V$. Consequently $u + i0 \in V_{\mathbb{C}}$, so then:
\[
0 = (N_{\mathbb{C}})^k (u + i0) = N^k u + i0,
\]
so $N^k u = 0$. Since $u$ was arbitrary, $N^k = 0$ and hence $N$ is nilpotent. \end{proof}
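Concretely (hypothetical example): for $N$ on $\mathbb{R}^2$ with
\[
\mathcal{M}(N) = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \qquad
(N_{\mathbb{C}})^2(u + iv) = N^2 u + i\,N^2 v = 0 \quad \text{for every } u + iv,
\]
so $N^2 = 0$ forces $(N_{\mathbb{C}})^2 = 0$, exactly as in the proof.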
8
Question
Suppose $V$ is a real vector space with $\dim V = 3$ and $\lambda_1, \lambda_2$ are distinct eigenvalues of $T \in \mathcal{L}(V)$. Prove that $T_{\mathbb{C}}$ has no nonreal eigenvalues.
\begin{proof}
Since $\dim V_{\mathbb{C}} = 3$, the multiplicities of the eigenvalues of $T_{\mathbb{C}}$ sum to $3$. If $\dim G(\lambda_1, T_{\mathbb{C}}) = 2$, then $\dim G(\lambda_2, T_{\mathbb{C}}) = 1$, and thus, since $T_{\mathbb{C}}$ shares all the real eigenvalues of $T$, $T_{\mathbb{C}}$ would only have $\lambda_1$ and $\lambda_2$ as eigenvalues, which are both real. A similar argument works if instead $\dim G(\lambda_1, T_{\mathbb{C}}) = 1$ and $\dim G(\lambda_2, T_{\mathbb{C}}) = 2$.
Clearly the only situation where we get a different eigenvalue is if $\dim G(\lambda_1, T_{\mathbb{C}}) = \dim G(\lambda_2, T_{\mathbb{C}}) = 1$. Then if we have a new eigenvalue $\mu$, we need $\dim G(\mu, T_{\mathbb{C}}) = 1$, since the sum of the dimensions must equal $3$ in this case. But notice that $\mu$ cannot be nonreal, since for contradiction if it were, then the multiplicity of $\bar{\mu}$ for $T_{\mathbb{C}}$ would have to equal that of $\mu$ for $T_{\mathbb{C}}$, which implies the sum of the dimensions of the generalized eigenspaces would be $1 + 1 + 1 + 1 = 4$, which would imply $\dim V_{\mathbb{C}} \geq 4$, which is a contradiction.
Thus $\mu \in \mathbb{R}$, so the only eigenvalues of $T_{\mathbb{C}}$ are $\lambda_1$, $\lambda_2$, and possibly $\mu$, which are all real, as desired. \end{proof}
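In other words, with $\dim V_{\mathbb{C}} = 3$ the multiplicities can only split as
\[
3 = 2 + 1 = 1 + 1 + 1,
\]
and a nonreal conjugate pair $\mu, \bar{\mu}$ would consume at least $1 + 1 = 2$ additional slots (their multiplicities must match), which together with $\lambda_1$ and $\lambda_2$ gives at least $4 > 3$.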
15
Question
Suppose $V$ is a real vector space and $T \in \mathcal{L}(V)$ has no eigenvalues. Prove that every subspace of $V$ invariant under $T$ has even dimension.
\begin{proof}
Let $U$ be a subspace of $V$ invariant under $T$. We'll show $\dim U$ must be even. To do this, assume for contradiction that it's odd.
Since $U$ is invariant under $T$, the restriction $T|_U$ is an operator on $U$, i.e. $T|_U \in \mathcal{L}(U)$. But every operator on an odd-dimensional real vector space has an eigenvalue (shown in 9.A), so there are $\lambda \in \mathbb{R}$ and a nonzero $u \in U$ with:
\[
T u = T|_U\, u = \lambda u,
\]
implying that $u$ is an eigenvector of $T$. But that's a contradiction, as $T$ has no eigenvalues.
Therefore, we must have $\dim U$ even, and since $U$ was arbitrary, it holds for all subspaces of $V$ invariant under $T$. \end{proof}
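A concrete illustration (hypothetical example): rotation by $90^\circ$ on $\mathbb{R}^2$,
\[
\mathcal{M}(T) = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix},
\]
has no eigenvalues, and indeed its only invariant subspaces are $\{0\}$ and $\mathbb{R}^2$, of even dimensions $0$ and $2$: any line through the origin gets rotated off of itself.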
19
Question
Suppose $V$ is a real vector space with $\dim V = n$ and $T \in \mathcal{L}(V)$ is such that $\operatorname{null} T^{n-2} \neq \operatorname{null} T^{n-1}$. Prove that $T$ has at most two distinct eigenvalues and that $T_{\mathbb{C}}$ has no nonreal eigenvalues.
\begin{proof}
We still have it that:
\[
\{0\} = \operatorname{null} T^0 \subseteq \operatorname{null} T^1 \subseteq \cdots \subseteq \operatorname{null} T^{n-1} \subseteq \operatorname{null} T^n.
\]
As such, because $\operatorname{null} T^{n-2} \neq \operatorname{null} T^{n-1}$, we must have a strict inclusion near the end.
But because equality at one step of the chain forces equality at every later step, and that would contradict $\operatorname{null} T^{n-2} \neq \operatorname{null} T^{n-1}$, the inclusions prior must be strict as well:
\[
\{0\} \subsetneq \operatorname{null} T \subsetneq \operatorname{null} T^2 \subsetneq \cdots \subsetneq \operatorname{null} T^{n-1}.
\]
Applying the dimensionality argument (each strict inclusion raises the dimension by at least $1$) shows that:
\[
\dim \operatorname{null} T^j \geq j \qquad (0 \le j \le n-1).
\]
This implies that:
\[
\dim \operatorname{null} T^{n-1} \geq n - 1.
\]
Thus clearly $\dim G(0, T) \geq n - 1$, since $\operatorname{null} T^{n-1} \subseteq \operatorname{null} T^n = G(0, T)$; in particular $0$ is an eigenvalue. Since $\dim V = n$, $G(0, T)$ can only be of dimension $n - 1$ or $n$. If it's $n$, then $0$ is the only eigenvalue, since then:
\[
V = G(0, T).
\]
If instead we have $\dim G(0, T) = n - 1$, then it's only possible to have one additional eigenvalue $\lambda \neq 0$, with $G(\lambda, T)$ of dimension $1$:
\[
n \geq \dim G(0, T) + \dim G(\lambda, T) = (n - 1) + 1.
\]
Thus it's impossible to add any more distinct eigenvalues; hence only a maximum of two eigenvalues is allowed (namely $0$ and possibly some $\lambda \neq 0$).
This shows that $T$ has at most two distinct eigenvalues. We'll now show that $T_{\mathbb{C}}$ has no nonreal eigenvalues. Notice that we have $\operatorname{null} (T_{\mathbb{C}})^j = (\operatorname{null} T^j)_{\mathbb{C}}$, so $\dim G(0, T_{\mathbb{C}}) \geq n - 1$ as well, while $\dim V_{\mathbb{C}} = n$.
If $\mu$ were a nonreal eigenvalue of $T_{\mathbb{C}}$, then $\bar{\mu}$ would also be an eigenvalue of $T_{\mathbb{C}}$, with the same multiplicity.
But the multiplicities of the eigenvalues of $T_{\mathbb{C}}$ sum to $n$, and $0$ already accounts for at least $n - 1$ of that, leaving at most $1$ for everything else, while the conjugate pair $\mu, \bar{\mu}$ would require at least $2$. Thus $T_{\mathbb{C}}$ has no nonreal eigenvalues. \end{proof}
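To make the hypotheses concrete (a hypothetical example with $n = 3$):
\[
\mathcal{M}(T) = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 7 \end{pmatrix}, \qquad
\operatorname{null} T = \operatorname{span}(e_1) \subsetneq \operatorname{null} T^2 = \operatorname{span}(e_1, e_2),
\]
so $\operatorname{null} T^{\,n-2} \neq \operatorname{null} T^{\,n-1}$ holds, and indeed $T$ has exactly two distinct eigenvalues, $0$ and $7$, both real.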