
I have a symmetric full-rank matrix $M$ and a set of vectors $v_i$ that span the whole space. I assume those vectors to be orthogonal (they form an orthogonal set). If two of those vectors $v_i$ and $v_j$ satisfy \begin{align} v_i^T M v_j = 0, \end{align} then these vectors are called "$M$-orthogonal", or orthogonal with respect to the bilinear form induced by $M$ (see this question).

If this is true for the whole set, that means \begin{align} v_i^T M v_j = \delta_{ij} \omega_i. \end{align} Does that automatically mean that the $v_i$ are eigenvectors of $M$? The comments under this answer suggest that this is actually the case, but don't confirm it.

My reasoning is the following: obviously $Mv_i$ is orthogonal to every other $v_j$, and since the other $v_j$ span the complete space of vectors orthogonal to $v_i$, $Mv_i$ can only have a component in the $v_i$ direction.

The problem here is: that would mean that if a complete set of vectors is $M$-orthogonal, it automatically is orthogonal. Does this make sense?

  • They are indeed eigenvectors for $M$, with eigenvalues $\lambda_i,$ but only when $\lambda_i\ne\lambda_j$ can we be sure that $v_i\perp v_j.$ – Anne Bauval May 15 '23 at 22:15
  • The answer is No. Do you know what matrix congruence is? Consider $V=\left[\begin{matrix}1 & -2\\ 1 & 1\end{matrix}\right]$ and $M= \left[\begin{matrix}1 & 0\\ 0 & 2\end{matrix}\right]$; then $V^TMV = 3M$, but the columns of $V$ are not eigenvectors of $M$ (see the numerical check after these comments). The fallacy in your reasoning comes from not explicitly writing out the algebra -- suppose $Mv_1 = \sum_{k=1}^n \alpha_k v_k$... your sentence seems to give a nod to left multiplication by $v_j^T$, but the $v_k$ need not form a mutually orthogonal set under the standard inner product, so $v_j^T Mv_1 = 0$ doesn't say much about the $\alpha_k$. – user8675309 May 15 '23 at 22:52
  • Consider the special case where $M=(VV^T)^{-1}$ for some square matrix $V$. Then the columns of $V$ are $M$-orthogonal because $V^TMV=I$. However, if the columns of $V$ are eigenvectors of $M$, we must have $MV=VD$ for some diagonal matrix $D$. Therefore $V^TV=D^{-1}$ is diagonal, meaning that the columns of $V$ must be mutually orthogonal with respect to the usual inner product on $\mathbb R^n$. So, if the columns of $V$ are linearly independent but not orthogonal to each other w.r.t. the usual inner product, you get a counterexample (this, too, is checked numerically after these comments). – user1551 May 15 '23 at 23:25
  • @user1551 I wasn't precise in my question. When I said that the $v_i$ span the entire space, I didn't write it down, but I also assumed that they are orthogonal. I added this information to the question. – Quantumwhisp May 16 '23 at 06:14
  • Your post now says "I assume those vectors to be orthogonal (they form an orthogonal set)... The problem here is: that would mean that if a complete set of vectors is $M$-orthogonal, it automatically is orthogonal," which is a non sequitur. The vectors are orthogonal under the standard inner product because you assumed them to be -- it really has nothing to do with being $M$-orthogonal. – user8675309 May 16 '23 at 15:35
  • @Quantumwhisp If you assume that the $v_j$s form an orthogonal basis with respect to the standard inner product, they clearly are eigenvectors of $M$: your condition that $v_i^TMv_j=\delta_{ij}\omega_i$ means that $Mv_j$ is orthogonal (with respect to the standard inner product) to $v_i$ for every $i\ne j$. Therefore $Mv_j$ must be parallel to $v_j$, i.e., $v_j$ is an eigenvector of $M$. – user1551 May 17 '23 at 07:23
  • @user8675309 Yes, I see that if I assume that the vectors are already orthogonal, then it doesn't follow. I was confused. In the end I can draw the following conclusions: either the vectors are orthogonal from the beginning, in which case their also being $M$-orthogonal means that they are all eigenvectors of $M$; or they are not orthogonal from the beginning, and then I can't conclude that they are eigenvectors. – Quantumwhisp May 17 '23 at 19:25
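A minimal numerical sketch of the two counterexamples from the comments above, assuming NumPy is available (variable names are illustrative):

```python
# Numerical check of the counterexamples above.
import numpy as np

# user8675309's example: V^T M V is diagonal (equal to 3M), so the columns
# of V are M-orthogonal, yet they are not eigenvectors of M.
V = np.array([[1.0, -2.0],
              [1.0,  1.0]])
M = np.diag([1.0, 2.0])

print(V.T @ M @ V)        # [[3. 0.] [0. 6.]] == 3*M, i.e. diagonal

v1 = V[:, 0]
print(M @ v1)             # [1. 2.] is not a multiple of v1 = [1. 1.]

# The projection argument breaks down because the columns of V are not
# orthogonal under the standard inner product:
print(V[:, 0] @ V[:, 1])  # -1.0, not 0

# user1551's construction: M2 = (V V^T)^{-1} gives V^T M2 V = I for any
# invertible V, so the columns are M2-orthogonal -- but again not eigenvectors.
M2 = np.linalg.inv(V @ V.T)
print(V.T @ M2 @ V)       # identity matrix (up to rounding)
print(M2 @ v1)            # [0.333... 0.666...], not a multiple of v1
```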

1 Answer


You have the right idea: $Mv_j$ being orthogonal to every $v_i$ with $i \neq j$ does imply that it must itself be a multiple of $v_j$. The following calculation proves this.

The vectors $v_1,\dots,v_n$ form an orthogonal basis of the vector space. In particular, any vector $v$ is equal to the sum of its projections onto the $v_i$: $$ v = \sum_i \frac{\langle v,v_i\rangle}{\langle v_i,v_i\rangle}v_i.$$

Applying this to $Mv_j$ and using the symmetry of $M$, we find

\begin{align*} Mv_j &= \sum_i \frac{\langle Mv_j,v_i\rangle}{\langle v_i,v_i\rangle}v_i \\ &= \sum_i \frac{\langle v_i,Mv_j\rangle}{\langle v_i,v_i\rangle}v_i \\ &= \sum_i \frac{v_i^TMv_j}{\langle v_i,v_i\rangle}v_i \\ &= \sum_{i} \frac{\delta_{ij}\omega_i}{\langle v_i,v_i\rangle}v_i \\ &= \frac{\omega_j}{\langle v_j,v_j\rangle}v_j. \end{align*} So yes, they are eigenvectors, with eigenvalues $\omega_j/\langle v_j,v_j\rangle$.
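For concreteness, here is a minimal numerical sketch of this result, assuming NumPy. The construction $M = V^{-T}\operatorname{diag}(\omega)V^{-1}$ is just one illustrative way to produce a symmetric matrix for which a given orthogonal basis is $M$-orthogonal:

```python
# Sketch: build a random orthogonal (not orthonormal) basis V, construct a
# symmetric M with V^T M V diagonal, and check that each column of V is an
# eigenvector of M with eigenvalue omega_j / <v_j, v_j>, as derived above.
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Orthogonal basis with non-unit lengths: orthonormal columns from QR,
# each rescaled by a random positive factor.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
V = Q * rng.uniform(0.5, 2.0, size=n)

# M = V^{-T} diag(omega) V^{-1} is symmetric and satisfies V^T M V = diag(omega).
omega = rng.uniform(1.0, 3.0, size=n)
Vinv = np.linalg.inv(V)
M = Vinv.T @ np.diag(omega) @ Vinv

assert np.allclose(M, M.T)                        # M is symmetric
assert np.allclose(V.T @ M @ V, np.diag(omega))   # columns are M-orthogonal

for j in range(n):
    v = V[:, j]
    lam = omega[j] / (v @ v)                      # predicted eigenvalue
    assert np.allclose(M @ v, lam * v)            # v is indeed an eigenvector

print("each column of V is an eigenvector of M")
```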

"The problem here is: that would mean that if a complete set of vectors is $M$-orthogonal, it automatically is orthogonal. Does this make sense?"

In your post you initially assumed that the vectors $v_i$ were both orthogonal and $M$-orthogonal. I'm not sure how this last statement is related to the question of eigenvectors.

Digitallis