99

Let $ \sigma(A)$ be the set of all eigenvalues of $A$. Show that $ \sigma(A) = \sigma\left(A^T\right)$ where $A^T$ is the transpose matrix of $A$.

Matcha Latte (4,665) · Zizo (1,901)

5 Answers

155

The matrix $(A - \lambda I)^{T}$ is the same as the matrix $\left(A^{T} - \lambda I\right)$, since the identity matrix is symmetric.

Thus:

$$\det\left(A^{T} - \lambda I\right) = \det\left((A - \lambda I)^{T}\right) = \det (A - \lambda I)$$

Hence $A$ and $A^{T}$ have the same characteristic polynomial, and therefore the same eigenvalues.
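
A quick numerical sanity check of this conclusion (a minimal sketch using numpy; the $5\times 5$ random matrix and the seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))  # a generic (non-symmetric) real matrix

# Eigenvalues of A and A^T, sorted so they can be compared elementwise.
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_At = np.sort_complex(np.linalg.eigvals(A.T))

assert np.allclose(eig_A, eig_At)  # the spectra agree up to rounding error
```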

Matcha Latte (4,665) · fretty (11,482)
  • 1
    Do they have the same minimal polynomial? – Rising Star Oct 06 '15 at 10:49
  • 14
    Any polynomial satisfied by $A$ is also satisfied by $A^T$ so yeah. – fretty Oct 06 '15 at 10:51
  • I suppose that @Invisible is saying that because $A^T$ is similar to $A$, considering a special similarity transformation on Jordan blocks. That way it is possible to explicitly biject the eigenvalues and eigenvectors of $A$ with those of $A^T$. The analysis follows this link: https://math.stackexchange.com/questions/94599/a-matrix-is-similar-to-its-transpose. Quite interesting to know that. I have never heard of that result in my linear algebra courses. – R. W. Prado Jul 30 '21 at 21:05
36

I'm going to work a little bit more generally.

Let $V$ be a finite dimensional vector space over some field $K$, and let $\langle\cdot,\cdot\rangle$ be a nondegenerate bilinear form on $V$.

Then for every linear endomorphism $A$ of $V$ there is a unique endomorphism $A^*$ of $V$ such that $$\langle Ax,y\rangle=\langle x,A^*y\rangle$$ for all $x$ and $y\in V$.

The existence and uniqueness of such an $A^*$ requires some explanation, but I will take it for granted.

Proposition: Given an endomorphism $A$ of a finite dimensional vector space $V$ equipped with a nondegenerate bilinear form $\langle\cdot,\cdot\rangle$, the endomorphisms $A$ and $A^*$ have the same set of eigenvalues.

Proof: Let $\lambda$ be an eigenvalue of $A$, and let $v$ be an eigenvector of $A$ corresponding to $\lambda$ (in particular, $v$ is nonzero). Let $w\in V$ be arbitrary. We then have: $$\langle v,\lambda w\rangle=\langle\lambda v,w\rangle=\langle Av,w\rangle=\langle v,A^*w\rangle$$ This implies that $\langle v,\lambda w-A^*w\rangle =0$ for all $w\in V$. Now either $\lambda$ is an eigenvalue of $A^*$ or not. If it isn't, the operator $\lambda I -A^*$ is an automorphism of $V$, since $\lambda I-A^*$ being singular is equivalent to $\lambda$ being an eigenvalue of $A^*$. In particular, $\lambda w-A^*w$ then ranges over all of $V$ as $w$ does, so $\langle v, z\rangle = 0$ for all $z\in V$. But since $\langle\cdot,\cdot\rangle$ is nondegenerate, this implies that $v=0$, a contradiction. So $\lambda$ must have been an eigenvalue of $A^*$ to begin with. Thus every eigenvalue of $A$ is an eigenvalue of $A^*$; the other inclusion can be derived similarly.

How can we use this in your case? I believe you're working over a real vector space with the dot product as your bilinear form. Consider the endomorphism $T$ of $\Bbb R^n$ given by $T(x)=Ax$ for some $n\times n$ matrix $A$. Then $T^*(y)=A^t y$ for all $y\in\Bbb R^n$, since $\langle Ax,y\rangle=(Ax)^T y=x^T A^t y=\langle x,A^t y\rangle$. Since $T$ and $T^*$ have the same eigenvalues, so do $A$ and $A^t$.
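
To make that last identification concrete, here is a small numerical check (a sketch using numpy; the dimension and seed are arbitrary) that the adjoint of $x\mapsto Ax$ with respect to the dot product is $y\mapsto A^t y$:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))
x = rng.standard_normal(4)
y = rng.standard_normal(4)

# For the dot product on R^n: <Ax, y> = (Ax).y = x.(A^T y) = <x, A^T y>.
assert np.isclose(np.dot(A @ x, y), np.dot(x, A.T @ y))
```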

  • 2
    For an explanation on the things I took for granted, I suggest you read these excellent lecture notes: https://www.dpmms.cam.ac.uk/study/IB/LinearAlgebra/2008-2009/bilinear-08.pdf –  Jul 24 '14 at 23:54
  • I have a question: first, why did you assume surjectivity of $A$? – Our Oct 05 '17 at 08:16
  • Plus, where did we use the fact that $V$ is finite dimensional? (I'm only interested in the fact that $A$ and $A^*$ have the same eigenvalues.) – Our Oct 05 '17 at 10:00
  • @onurcanbektas Where have I used surjectivity? The finite dimension is used in my reference to construct $A^*$. If you have $A^*$ to start with, construction isn't required. I would have to double check whether $\lambda I-A^*$ is an automorphism (bounded) because I don't have infinite dimensional facts at my fingertips. But you might. –  Oct 05 '17 at 13:12
  • Why do you write $⟨v,\lambda w⟩=⟨\lambda v,w⟩$? In a general case we have $⟨v,\bar\lambda w⟩=⟨\lambda v,w⟩$ and your computation should be corrected accordingly. – Dmitry Apr 14 '20 at 11:13
  • @Dmitry I use the $\langle\cdot,\cdot\rangle$ symbols to represent any non-degenerate bilinear form over any base field (which the complex dot product is not). But this is superfluous. The argument generalizes to show that the eigenvalues of $A^*$ would be the conjugates of the eigenvalues of $A$, which is a different result. –  Apr 14 '20 at 14:01
  • OK, I see, bilinear is the key word here. – Dmitry Apr 14 '20 at 15:48
23

$$ \operatorname{det}(A-tI) = \operatorname{det}\left((A-tI)^T\right) = \operatorname{det}(A^T-tI)$$ A matrix and its transpose have the same determinant, and $(A-tI)^T = A^T - tI$ by the properties of transposition, so $A$ and $A^T$ have the same characteristic polynomial, hence the same eigenvalues.
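
The equality of characteristic polynomials can also be observed numerically (a minimal sketch with numpy; note that np.poly computes the coefficients from the eigenvalues, so this is a floating-point check rather than a proof):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((4, 4))

# np.poly(M) returns the coefficients of the characteristic polynomial
# det(tI - M) of a square matrix M, highest degree first.
assert np.allclose(np.poly(A), np.poly(A.T))
```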

ViktorStein (5,024) · alpha.Debi (1,094)
7

Here is another proof: Suppose that $v$ is an eigenvector of $A$ with eigenvalue $\lambda$, i.e. $Av = \lambda v$. Then $v^T A^T = (Av)^T = \lambda v^T$, which means that $v^T(A^T - \lambda I) = 0$; that is, $v^T$ is a left eigenvector of $A^T$. If $A^T - \lambda I$ were invertible, then multiplying from the right by the inverse would give $v=0$, a contradiction. Hence $A^T - \lambda I$ is singular and $\lambda$ is an eigenvalue of $A^T$.
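
This left-eigenvector relation is easy to test numerically (a sketch with numpy; the matrix, the seed, and the choice of eigenpair are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))

lam, V = np.linalg.eig(A)  # columns of V are (right) eigenvectors of A
v = V[:, 0]                # one eigenpair: A v = lam[0] v

# v^T A^T = (A v)^T = lam v^T, i.e. v^T is a left eigenvector of A^T.
assert np.allclose(v @ A.T, lam[0] * v)
```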

0

As per the well-known definition of an eigenvalue: if $Ax = \lambda x$, where $A \in \mathbb{R}^{n\times n}$, $\lambda \in \mathbb{R}$, and $x \in \mathbb{R}^{n}$, then $\lambda$ is an eigenvalue of $A$ and $x$ is an eigenvector.

As per that, $$Ax = \lambda x \implies (Ax)^T = (\lambda x)^T \implies x^T A^T = \lambda x^T.$$

Now pre-multiplying both sides of the above equation by $x$, one has

$$x x^T A^T = \lambda x x^T \implies (x x^T) A^T x = \lambda (x x^T) x.$$

Now pre-multiplying both sides of the above equation by $(x x^T)^{-1}$ (provided $x \neq 0$, i.e. a non-trivial solution), one has

$A^{T}x = \lambda{x}$.

Hence the eigenvalues of $A$ and $A^{T}$ are the same.

  • I don't see clearly the invertibility of $xx^{T}$. Isn't it a rank-one matrix? Or what is the meaning of "pre-multiplying"? – Czylabson Asa May 04 '25 at 15:08