
Update: I've now posted this question, which, if answered with a proof, would prove the claim in this post.


Let $M,D\in\mathcal{M}_n(\mathbb{R}),$ where $D$ is diagonal and positive definite, and let $\mathcal{A}_-$ be the set of all matrices whose eigenvalues all have negative real part.

As you probably know, $M$ can be written as the sum of a symmetric and an anti-symmetric part, i.e., $M=M_S+M_A=\frac{1}{2}(M+M^T)+\frac{1}{2}(M-M^T).$

I'd like to show that if $M_S$ is negative definite (ND), then $H=DM\in\mathcal{A}_-.$

Simulations support it, but I don't know how to prove it.
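
For concreteness, here is roughly the kind of simulation I mean (a minimal NumPy sketch; the particular way of generating $M$ with a ND symmetric part is just one convenient choice):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_instance(n):
    """Random diagonal positive definite D and random M whose symmetric part is ND."""
    D = np.diag(rng.uniform(0.1, 5.0, size=n))
    A = rng.standard_normal((n, n))
    M_S = -A @ A.T - np.eye(n)        # negative definite symmetric part
    B = rng.standard_normal((n, n))
    M_A = (B - B.T) / 2               # antisymmetric part
    return D, M_S + M_A

for _ in range(1000):
    D, M = random_instance(5)
    H = D @ M
    # every eigenvalue of H = DM should lie in the open left half-plane
    assert np.linalg.eigvals(H).real.max() < 0
print("no counterexample found in 1000 random trials")
```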


For what it is worth, I can prove that

  1. $DM_S$ has all real eigenvalues and is in $\mathcal{A}_-$ iff $M_S$ is ND,
  2. $DM_A$ is on the boundary of $\mathcal{A}_-$ (all its eigenvalues have zero real part), and
  3. $M\in\mathcal{A}_-$ if $M_S$ is ND.

(Just an observation: 1. and 2. give that $\mathrm{tr}(H)=\mathrm{tr}(DM_S),$ which is negative if $M_S$ is ND.)
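
Facts 1., 2. and 3. (and the trace observation) are easy to sanity-check numerically on the same kind of random instances; a small sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
D = np.diag(rng.uniform(0.1, 5.0, size=n))
A = rng.standard_normal((n, n))
M_S = -A @ A.T - np.eye(n)          # negative definite symmetric part
B = rng.standard_normal((n, n))
M_A = (B - B.T) / 2                 # antisymmetric part
M = M_S + M_A

ev_S = np.linalg.eigvals(D @ M_S)
print(np.allclose(ev_S.imag, 0), ev_S.real.max() < 0)    # 1. real, negative eigenvalues
print(np.allclose(np.linalg.eigvals(D @ M_A).real, 0))   # 2. purely imaginary eigenvalues
print(np.linalg.eigvals(M).real.max() < 0)               # 3. M itself is in A_-
print(np.isclose(np.trace(D @ M), np.trace(D @ M_S)))    # tr(H) = tr(D M_S)
```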

Ideas that don't work:

  1. If $DM_S$ and $DM_A$ commuted, their eigenvalues would sum to give the eigenvalues of $H$, but unfortunately they don't commute in general.
  2. This question somewhat builds upon my previous question, which shows that if we can prove that the symmetric part $H_S$ of $H$ is ND, then we have the desired $H\in\mathcal{A}_-.$ However, I've found counter-examples where $H_S$ fails to be ND even though $M_S$ is ND (see the sketch after this list), so this cannot be the way.
  3. $M\in\mathcal{A}_-$ is not sufficient for $H\in\mathcal{A}_-$ (I've again found counter-examples).
  4. If $x^THx<0$ for all nonzero $x\in\mathbb{R}^n,$ then $H\in\mathcal{A}_-,$ but unfortunately the implication doesn't go the other way, and I've yet again found counter-examples (i.e., I've found $H\in\mathcal{A}_-$ with a ND $M_S$ where $x^THx>0$ for some $x$).
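
The counter-examples mentioned in 2. are easy to stumble on; a crude random search along these lines (my construction of $M$ is again just one convenient choice, and the factor 5 only makes the antisymmetric part dominate) turns them up quickly:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
for _ in range(10000):
    D = np.diag(rng.uniform(0.1, 10.0, size=n))
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, n))
    M = -A @ A.T - np.eye(n) + 5.0 * (B - B.T) / 2   # M_S is ND, large antisymmetric part
    H = D @ M
    H_S = (H + H.T) / 2
    if np.linalg.eigvalsh(H_S).max() > 0:            # symmetric part of H is NOT negative definite...
        print("max Re eig of H:", np.linalg.eigvals(H).real.max())   # ...yet H is still in A_-
        break
```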

I feel like this should be relatively straightforward with the above puzzle pieces, but I'm not seeing how to put them together at the moment, so any help would be much appreciated.

1 Answer


$DM$ is similar to $D^{1/2}MD^{1/2}$. If $v$ is a unit eigenvector for an eigenvalue $\lambda$ of $D^{1/2}MD^{1/2}$, then $\Re(\lambda)=\frac12(\lambda+\lambda^\ast)=\frac12v^\ast D^{1/2}(M+M^\ast)D^{1/2}v=v^\ast D^{1/2}M_SD^{1/2}v<0$.
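
In more detail: $DM=D^{1/2}\,(D^{1/2}MD^{1/2})\,D^{-1/2},$ which gives the similarity, and since $v^\ast v=1,$
$$\lambda=\lambda\,v^\ast v=v^\ast\bigl(D^{1/2}MD^{1/2}\bigr)v,\qquad \lambda^\ast=\bigl(v^\ast D^{1/2}MD^{1/2}v\bigr)^\ast=v^\ast D^{1/2}M^\ast D^{1/2}v,$$
so $\lambda+\lambda^\ast=v^\ast D^{1/2}(M+M^\ast)D^{1/2}v,$ with $M+M^\ast=M+M^T=2M_S$ because $M$ is real.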

user1551
  • With $^*$ do you mean the conjugate transpose? – Bobson Dugnutt Oct 04 '17 at 14:52
  • Thanks. You're using that $D^{1/2}M_SD^{1/2}$ is negative definite in the last step, right? How do you know that? – Bobson Dugnutt Oct 04 '17 at 14:59
  • Ah, we can use Sylvester's Inertia Law to say that $D^{1/2}M_SD^{1/2}$ is negative definite if $M_S$ is. Thanks for the answer, the proof was much simpler than what I had feared! – Bobson Dugnutt Oct 04 '17 at 15:25
  • @Lovsovs You can indeed use Sylvester's law of inertia to prove that $D^{1/2}M_SD^{1/2}$ is negative definite, but more fundamentally, since $M_S$ is negative definite, $u^\ast M_Su<0$ for every nonzero vector $u$. So, if $P$ is an invertible matrix and $v$ is a nonzero vector, by putting $u=Pv$, we get $v^\ast P^\ast M_SPv<0$. This is true in particular when $P=P^\ast=D^{1/2}$. – user1551 Oct 04 '17 at 15:35
  • Ah, yes, that's a good point. Two final questions: Can we always define a left eigenvector to be the transpose of the right eigenvector, or is this only in the case of symmetric matrices? Also, in the present case, can we say for sure that all eigenvectors are real, i.e., $v\in\mathbb{R}^n$? – Bobson Dugnutt Oct 04 '17 at 15:45
  • @Lovsovs For a general real matrix, no. In fact, $M$ may not possess any real eigenvalue/eigenvector at all, such as when $M=\pmatrix{-1&-1\\1&-1}$. – user1551 Oct 04 '17 at 16:03