
I am struggling with the following question:

Let $V$ be a finite-dimensional vector space, let $T:V\rightarrow V$ be a normal linear operator ($T^\ast T = TT^\ast$), and let $W$ be an invariant subspace of $T$ ($T(W)\subseteq W$). Prove that $W$ is also an invariant subspace of $T^\ast$.

The problem I have is characterizing something like $T^\ast w \in W$ when everything given is in the language of inner products. I thought of decomposing $T^\ast w = u+v$ with $u \in W$, $v\in W^\perp$, and showing $v=0$ via $\left\langle T^\ast w, v \right\rangle=0$, but normality of $T$ does not help when there is "only one $T$" inside the inner product. Would multiplying both sides by $T$ help? Then we could use normality, but I don't know where that leads.

I know this is true: using the unitary diagonalization, I can express $T^\ast$ as a polynomial in $T$, and from there it is easy ($W$ is $p(T)$-invariant for any polynomial $p$). But I would like to see a more fundamental solution.
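The polynomial fact cited above is easy to check numerically. Below is a quick sketch (assuming numpy is available; the random matrix and the helper name `p_of_T` are illustrative, not from the original discussion): for a normal $T = QDQ^*$ with distinct eigenvalues $\lambda_j$, the Lagrange interpolation polynomial $p$ with $p(\lambda_j) = \overline{\lambda_j}$ satisfies $p(T) = T^*$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

# Build a normal matrix T = Q D Q* with Q unitary and distinct complex eigenvalues.
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(X)                      # random unitary
lam = rng.standard_normal(n) + 1j * rng.standard_normal(n)
T = Q @ np.diag(lam) @ Q.conj().T
assert np.allclose(T @ T.conj().T, T.conj().T @ T)  # T is normal

def p_of_T(T, lam):
    """Evaluate the Lagrange polynomial with p(lam_j) = conj(lam_j) at T."""
    n = len(lam)
    I = np.eye(n, dtype=complex)
    result = np.zeros_like(T)
    for j in range(n):
        # L_j(T) = prod_{m != j} (T - lam_m I) / (lam_j - lam_m)
        Lj = I.copy()
        for m in range(n):
            if m != j:
                Lj = Lj @ (T - lam[m] * I) / (lam[j] - lam[m])
        result = result + np.conj(lam[j]) * Lj
    return result

# p(T) agrees with the adjoint T*, so T* is a polynomial in T
# (with coefficients depending on T, as noted in the comments below).
assert np.allclose(p_of_T(T, lam), T.conj().T)
```

Note that the coefficients of $p$ depend on the spectrum of $T$, which matches the caveat raised in the comments: the polynomial is different for every $T$, not uniform over all normal operators.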

Theorem
  • On what space are you assuming $\ T\ $ to be defined? The result is true if $\ W\ $ is a closed subspace of Hilbert space, but not necessarily true if $\ W\ $ isn't closed. It's true, of course, if the space is finite-dimensional, because then all subspaces are closed. – lonza leggiera Jul 05 '21 at 10:12
  • $V$ is finite-dimensional, sorry for not pointing it out. I am not familiar with the notion of closed subspaces or anything related to infinite-dimensional functional analysis @lonzaleggiera – Theorem Jul 05 '21 at 10:13
  • By "polynomial" I mean the polynomial can be different for every $T$, not something uniform for all normal operators. That I think doesn't imply what you claim. @DavidC.Ullrich – Theorem Jul 06 '21 at 15:13
  • I forgot most of my basic complex analysis, so I can't pinpoint the gap between what I am saying and your perception, but what we learned is that if $T=\sum_k \lambda_k Q_k$ then, since each $Q_k$ is self-adjoint, $T^\ast = \sum_k \lambda_k^* Q_k$, and each $Q_k$ is obtained from $T$ via Lagrange interpolation polynomials. So in total $T^\ast$ is a polynomial in $T$, but the coefficients depend on $T$ and are probably not holomorphic in your context. This does suffice for my problem, though, to prove $W$ is $T^\ast$-invariant. @DavidC.Ullrich – Theorem Jul 06 '21 at 15:30
  • It was discussed in the previous comments and already corrected in the question itself yesterday, but thank you for your suggestion :) @DavidC.Ullrich – Theorem Jul 06 '21 at 15:50

1 Answer


$\newcommand{\Tr}{\mathrm{Tr}}$ $\newcommand{\diag}{\mathrm{diag}}$ Suppose $\dim(W) = k$, $1 \leq k < n$, where $n = \dim(V)$. Then there exists an orthonormal basis $\{\alpha_1, \ldots, \alpha_k, \alpha_{k+1}, \ldots, \alpha_n\}$ of $V$, where $\{\alpha_1, \ldots, \alpha_k\}$ is an orthonormal basis of $W$, such that
\begin{align*}
& T(\alpha_1, \ldots, \alpha_n) = (\alpha_1, \ldots, \alpha_n)\begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix}; \\
& T^*(\alpha_1, \ldots, \alpha_n) = (\alpha_1, \ldots, \alpha_n)\begin{pmatrix} A_{11}^* & 0 \\ A_{12}^* & A_{22}^* \end{pmatrix}.
\end{align*}
Since $T$ is normal, we have
\begin{align*}
\begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix} \begin{pmatrix} A_{11}^* & 0 \\ A_{12}^* & A_{22}^* \end{pmatrix} = \begin{pmatrix} A_{11}^* & 0 \\ A_{12}^* & A_{22}^* \end{pmatrix} \begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix}.
\end{align*}
Comparing the upper-left blocks of this equation, we obtain
$$A_{11}A_{11}^* + A_{12}A_{12}^* = A_{11}^*A_{11}.$$
Taking traces and using $\Tr(A_{11}A_{11}^*) = \Tr(A_{11}^*A_{11})$ gives $\Tr(A_{12}A_{12}^*) = 0$. Since $\Tr(A_{12}A_{12}^*)$ is the sum of the squared moduli of the entries of $A_{12}$, this forces $A_{12} = 0$. Therefore
\begin{align*}
& T(\alpha_1, \ldots, \alpha_n) = (\alpha_1, \ldots, \alpha_n)\,\diag(A_{11}, A_{22}); \\
& T^*(\alpha_1, \ldots, \alpha_n) = (\alpha_1, \ldots, \alpha_n)\,\diag(A_{11}^*, A_{22}^*),
\end{align*}
so $W$ is also $T^*$-invariant. This completes the proof.

Zhanxiong
  • Thank you! I think we could also choose from the start $\alpha_1,\dots,\alpha_k$ diagonalizing $T|_W$ and $\alpha_{k+1},\dots,\alpha_n$ diagonalizing $T|_{W^\perp}$ (both are normal), and then we would know $A_{12}=0$ from the first place? Or is there anything circular in that I am missing (probably)? (Edit: Actually this assumes $W^\perp$ is $T$-invariant...) – Theorem Jul 05 '21 at 17:49
  • @Theorem If you just need to prove the invariance, block-diagonalization (as shown in the above answer) is sufficient; full diagonalization is not needed. This is also in line with your intent to refrain from using advanced machinery/results as much as possible. – Zhanxiong Jul 05 '21 at 21:23
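The block computation in the answer above can also be confirmed numerically. A small sketch (assuming numpy; the random normal matrix is illustrative, and $W$ is taken as a span of eigenvectors, which is how invariant subspaces of a normal operator with distinct eigenvalues arise):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 5, 2

# Normal operator T = Q D Q* with Q unitary and D diagonal (complex entries).
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(X)                      # random unitary
D = np.diag(rng.standard_normal(n) + 1j * rng.standard_normal(n))
T = Q @ D @ Q.conj().T
assert np.allclose(T @ T.conj().T, T.conj().T @ T)  # T is normal

# W = span of the first k columns of Q is T-invariant by construction.
# In this adapted orthonormal basis, the matrix of T is A = Q* T Q.
A = Q.conj().T @ T @ Q
A12 = A[:k, k:]                 # the block the trace argument kills
assert np.allclose(A12, 0)

# Hence W is T*-invariant: applying T* to a basis of W leaves no
# component along the last n - k basis vectors (i.e., along W-perp).
outside = Q[:, k:].conj().T @ (T.conj().T @ Q[:, :k])
assert np.allclose(outside, 0)
```

Here the block structure is clean by construction; the content of the theorem is that for a normal $T$ the off-diagonal block $A_{12}$ vanishes for *every* $T$-invariant $W$, which is exactly what the trace argument proves.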