
I was given the following question :

Let $V$ be a vector space of dimension $n$ over $\mathbb{C}$, and let $S: V \to V$ be a linear operator such that its minimal polynomial is $m(x) = x^n$. Prove that if a linear operator $T: V \to V$ commutes with $S$, then there exists a polynomial $p(x)$ such that $p(S) = T$.

Any ideas on how to approach this?

  • Since you put “Jordan normal form” yourself in the title, have you tried looking at what the Jordan normal form of $S$ could look like? – Jean Abou Samra Sep 19 '24 at 19:10
  • I did, it's obviously $J_n(0)$; that was the first part of this question, but I don't quite understand how I can use that for the proof. I thought maybe it has something to do with the Jordan basis, but got stuck. – Johann Carl Friedrich Gauß Sep 19 '24 at 19:16
  • OK, so this gives a basis $(x_1, …, x_n)$ such that $S(x_n) = x_{n-1}$, $S(x_{n-1}) = x_{n-2}$, …, $S(x_2) = x_1$, $S(x_1) = 0$. Then we have $S(T(x_1)) = T(S(x_1)) = T(0) = 0$, so $T(x_1) ∈ Ker(S)$. What can you conclude about $T(x_1)$? Now try to do something similar with $x_2$, then $x_3$, and you should see a pattern. – Jean Abou Samra Sep 19 '24 at 19:21
  • I still don't understand; where am I supposed to get the polynomial from? I just don't see it for some reason. – Johann Carl Friedrich Gauß Sep 19 '24 at 19:38
  • If $P = a_0 + a_1 X + a_2 X^2 + … + a_{n-1} X^{n-1}$ is a polynomial, what does $P(J_n(0))$ look like? – Jean Abou Samra Sep 19 '24 at 19:42
  • I give up on this question... if anyone wants to write an answer, I would really appreciate it. – Johann Carl Friedrich Gauß Sep 19 '24 at 21:06
  • Let $J$ be the Jordan form for $S$. Can you tell us what the rank is for $\big(J\oplus - J\big)$? Here $\oplus$ denotes the Kronecker sum. This amounts to deleting a few columns and counting ones on the diagonal afterward. The issue right now is that you haven't shown any work for this actual problem beyond the first part mentioned in the comments, or told us the book in use, what you know as background knowledge, and so on. Put differently, this post is missing context and really should be closed for that reason. – user8675309 Sep 19 '24 at 21:46
  • The first part only requires finding the Jordan form of $S$, which is very easy. There is no further context that I didn't provide in the question. – Johann Carl Friedrich Gauß Sep 19 '24 at 21:52
  • And I have shown no work since I really don't understand the approach to this. I do understand that it has something to do with the Jordan basis, applying $T$ to the elements of that basis, and then using the commutativity property somehow, but I haven't managed to understand how to get the polynomial from that. – Johann Carl Friedrich Gauß Sep 19 '24 at 21:54
  • As the OP you are always notified of comments, but if there is more than one other person in these comments, the others are not in general notified, so you need to tag them, like @JohannCarlFriedrichGauß. Regarding context, please read through https://math.meta.stackexchange.com/questions/9959/how-to-ask-a-good-question/9960#9960 – user8675309 Sep 19 '24 at 22:30
  • I've closed this as a duplicate of a question about a more general situation, where the same conclusion holds. Of course if the degree of the minimal polynomial equals the dimension of the space, as is given here, then it equals the characteristic polynomial. – Marc van Leeuwen Sep 20 '24 at 11:54

2 Answers


Let $(x_1, …, x_n)$ be a basis in which the matrix of $S$ is $J_n(0)$, i.e.,

$ \begin{pmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 & 1 \\ 0 & 0 & \cdots & 0 & 0 \end{pmatrix} $

Then $S(x_n) = x_{n-1}$, $S(x_{n-1}) = x_{n-2}$, …, $S(x_2) = x_1$, $S(x_1) = 0$.

Let us find what form the matrix of $T$ takes in this basis. Since $S$ and $T$ commute, we have $S(T(x_1)) = T(S(x_1)) = T(0) = 0$, so $T(x_1) \in \ker S$. By the form of $J_n(0)$, $\ker S$ is spanned by $x_1$, hence there is $a_0$ such that $T(x_1) = a_0 x_1$.

Now, we have $S(T(x_2)) = T(S(x_2)) = T(x_1) = a_0 x_1 = S(a_0 x_2)$, so $T(x_2) - a_0 x_2 \in \ker S$; hence there is $a_1$ such that $T(x_2) = a_0 x_2 + a_1 x_1$.

Continuing inductively: if $T(x_k) = a_0 x_k + a_1 x_{k-1} + \dots + a_{k-1} x_1$, then $S(T(x_{k+1})) = T(x_k) = S(a_0 x_{k+1} + a_1 x_k + \dots + a_{k-1} x_2)$, so $T(x_{k+1})$ differs from $a_0 x_{k+1} + a_1 x_k + \dots + a_{k-1} x_2$ by an element of $\ker S = \operatorname{span}(x_1)$, which introduces one new coefficient $a_k$. We find that the matrix of $T$ takes the form

$ \begin{pmatrix} a_0 & a_1 & a_2 & \cdots & a_{n-1} \\ 0 & a_0 & a_1 & \cdots & a_{n-2} \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & a_0 & a_1 \\ 0 & 0 & \cdots & 0 & a_0 \end{pmatrix} $

which is precisely the matrix of the polynomial $p(X) = a_0 + a_1 X + \dots + a_{n-1} X^{n-1}$ applied to $J_n(0)$. Hence $T = p(S)$.
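Not part of the proof, but here is a quick sanity check of the argument for $n = 4$, assuming SymPy is available: impose $TJ = JT$ on a generic matrix $T$ and confirm that every solution is an upper-triangular Toeplitz matrix, i.e. a polynomial in $J_4(0)$.

```python
# Sanity check for n = 4 (illustrative; not part of the proof).
import sympy as sp

n = 4
J = sp.Matrix(n, n, lambda i, j: 1 if j == i + 1 else 0)   # J_n(0)
T = sp.Matrix(n, n, lambda i, j: sp.Symbol(f"t{i}{j}"))    # generic matrix

# Impose T*J == J*T entrywise and substitute the constraints back into T.
constraints = sp.solve(list(T * J - J * T), dict=True)[0]
T_comm = T.subs(constraints)
print(T_comm)   # constant diagonals: a_0 on the diagonal, a_1 just above it, ...

# Reading the coefficients off the top row, T_comm equals
# p(J) = a_0*I + a_1*J + ... + a_{n-1}*J**(n-1).
p_of_J = sum((T_comm[0, k] * J**k for k in range(n)), sp.zeros(n, n))
assert (T_comm - p_of_J).expand() == sp.zeros(n, n)
```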


As far as creating the Jordan form goes: when all the eigenvalues (and the entries of the given matrix, should there be one) are integers, there is a simple backwards procedure.

Your nilpotent operator being $S$, we know $S^n = 0$. We also know that $S^{n-1} \neq 0$ because the minimal polynomial is $x^n$. So there is some vector $v$ with $S^{n-1} v \neq 0$.

Give it the name $c_n = v$, that is, $S^{n-1} c_n \neq 0$. Next, let $c_{n-1} = S c_n$, then $c_{n-2} = S c_{n-1} = S^2 c_n$, and so on. Finally, $c_1 = S^{n-1} c_n \neq 0$. This $c_1$ is the only genuine eigenvector (up to scaling), as $S c_1 = 0$.
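A small NumPy sketch of this backward construction, using a made-up nilpotent matrix purely for illustration (not data from the question):

```python
# Backward construction of a Jordan chain for a nilpotent S with S^n = 0
# and S^(n-1) != 0 (made-up example matrix).
import numpy as np

n = 3
J = np.diag(np.ones(n - 1), k=1)          # J_n(0)
P = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [1., 0., 1.]])              # any invertible matrix
S = P @ J @ np.linalg.inv(P)              # nilpotent of index n by construction

# Pick v with S^(n-1) v != 0; a random vector works almost surely.
v = np.random.default_rng(0).standard_normal(n)
assert np.linalg.norm(np.linalg.matrix_power(S, n - 1) @ v) > 1e-9

# c_n = v, c_{n-1} = S c_n, ..., c_1 = S^{n-1} c_n.
chain = [v]
for _ in range(n - 1):
    chain.append(S @ chain[-1])
c = chain[::-1]                           # c[0] = c_1, ..., c[-1] = c_n

# In the basis (c_1, ..., c_n) the matrix of S is J_n(0).
B = np.column_stack(c)
print(np.round(np.linalg.inv(B) @ S @ B, 10))
```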

In the basis $(c_1, \dots, c_n)$, the matrix representing $S$ is

$$ M = \begin{pmatrix} 0 & 1 & 0 & \cdots & 0 \\ 0 & 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \ddots & \vdots \\ 0 & 0 & \cdots & 0 & 1 \\ 0 & 0 & \cdots & 0 & 0 \end{pmatrix} $$

For $n=2$, write down a matrix $A$ with entries called $a,b,c,d$ and multiply it with the matrix $M$ in both orders. If they commute, what do we know about $A$?

Same for $n=3$ and $A$ with entries $a,b,c,d,e,f,g,h,i.$
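A worked sketch of the $n=2$ computation, assuming SymPy is available; the $n=3$ case goes the same way and forces $A$ to be of the form $\alpha I + \beta M + \gamma M^2$, again a polynomial in $M$.

```python
# The n = 2 exercise done symbolically (illustrative sketch, not the answer key).
import sympy as sp

a, b, c, d = sp.symbols("a b c d")
M = sp.Matrix([[0, 1],
               [0, 0]])
A = sp.Matrix([[a, b],
               [c, d]])

# A*M - M*A = [[-c, a - d], [0, c]], so commuting forces c = 0 and a = d,
# i.e. A = a*I + b*M, a polynomial in M.
print(sp.solve(list(A * M - M * A), [a, b, c, d], dict=True))
```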

Will Jagy