3

Can someone help me?

I've been thinking about this question for a while and got stuck. At first I only found the Identity transformation ($I$) and the anti-Identity transformation ($-I$). But then I realized that every reflection is also an involution. The only relevant information I got about the transformation's matrix is that $A^{-1} = A$ and, of course, $(A - I)\cdot(A + I) = 0$.
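For concreteness, here is a small numerical check (my addition, not part of the original question): a reflection across the $x$-axis squares to the identity, so it is an involution besides $\pm I$.

```python
# Verify that a reflection matrix is an involution: A @ A equals the
# identity, equivalently A is its own inverse.
import numpy as np

# Reflection across the x-axis in R^2
A = np.array([[1.0,  0.0],
              [0.0, -1.0]])

assert np.allclose(A @ A, np.eye(2))     # A^2 = I, so A is an involution
assert np.allclose(np.linalg.inv(A), A)  # equivalently, A^{-1} = A
```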

Zev Chonoles
  • 1
    I think (and I could be wrong) any idempotent matrix is also an involution, since $A^{-1}=A\implies{A}^3=A$. But of course, I'm not sure if that means all linear involutions are idempotent... – rurouniwallace Aug 03 '13 at 22:20
  • The problem is that if A is idempotent, it's singular (with the exception of the identity matrix). – Pedro Amorim Aug 03 '13 at 22:27
  • 2
    A reasonable start is to look for diagonal matrices that define involutions. (In particular, what can the eigenvalues be?) It's also not difficult to prove that an involution is diagonalizable, i.e., its eigenspaces span $E$. These observations pretty much answer the question. – Andrew D. Hwang Aug 03 '13 at 22:28
  • 1
    @Ataxaria and Anyone Else (for that matter): your comment raises a good point in and of itself, which is not directly addressed in the question, viz., just what is an involution. Unless Pedro Amorim, OP of this question, tells us to the contrary, I take it to be a linear map $A$ such that $A^2 = I$, which holds if and only if $A^{-1} = A$. Thus I believe that involutions must be invertible, which isn't true for general idempotents: $A^2 = A$ allows for the eigenvalue zero to occur. – Robert Lewis Aug 03 '13 at 22:34

3 Answers

3

You are looking for an $n\times n$ matrix $A$ whose square is the identity. This means the matrix is annihilated by the polynomial $x^2-1$, so its minimal polynomial must be $x-1$, $x+1$, or $x^2-1$. The first two cases give the two matrices you've already found, namely $I$ and $-I$.

Now, let's suppose the minimal polynomial is $x^2-1$. Notice this factors completely over $\mathbb{R}$ into distinct linear factors, which shows that $A$ is diagonalizable. This means there is an invertible matrix $P$ such that:

$$PAP^{-1}=D$$

where $D$ is a diagonal matrix whose diagonal entries are the eigenvalues of $A$. In our case, the eigenvalues are $1$ and $-1$ (the roots of the minimal polynomial). Since the Jordan form is unique up to permutation of the Jordan blocks, we only need to know the algebraic multiplicities of the eigenvalues, in other words, how many $1$'s and how many $-1$'s are on the diagonal of $D$. Let $D_i$ be the $n\times n$ diagonal matrix with $i$ entries equal to $1$ and the remaining $n-i$ equal to $-1$, where $0\le i\le n$. Notice that $D_0=-I$ and $D_n=I$. Then all such involutions $A$ have the form:

$$A=P^{-1}D_iP$$
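As a sanity check of this construction (a sketch I'm adding, with $n$, $i$, and $P$ chosen arbitrarily for illustration), conjugating $D_i$ by any invertible $P$ produces an involution other than $\pm I$:

```python
import numpy as np

n, i = 3, 2                                # illustrative choices
D = np.diag([1.0] * i + [-1.0] * (n - i))  # D_i: i ones, n - i minus ones
P = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 1.0]])            # any invertible P works
A = np.linalg.inv(P) @ D @ P               # A = P^{-1} D_i P

assert np.allclose(A @ A, np.eye(n))       # A is an involution
# ...and it is neither I nor -I, since D has both eigenvalues:
assert not np.allclose(A, np.eye(n)) and not np.allclose(A, -np.eye(n))
```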

Jared
3

Geometrically, an involution $f:E \to E$ corresponds to a direct sum decomposition $E = E_1 \oplus E_{-1}$ into the eigenspaces of $f$.

In a bit more detail, if $E = V \oplus W$ is an arbitrary direct sum decomposition of $E$, there exists a unique involution $f:E \to E$ whose restriction to $V$ is the identity and whose restriction to $W$ is minus the identity.

Conversely, if $f:E \to E$ is an involution, then every vector $v$ in $E$ may be written $$v = \tfrac{1}{2}\bigl(v + f(v)\bigr) + \tfrac{1}{2}\bigl(v - f(v)\bigr),$$ and these summands are easily checked to be eigenvectors of $f$ with respective eigenvalues $1$ and $-1$. (This identity gives an explicit proof that $f$ is diagonalizable on $E$: every vector in $E$ is a sum of eigenvectors of $f$.)
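This decomposition is easy to verify numerically; a minimal sketch of my own, using the coordinate-swap map on $\mathbb{R}^2$ as the involution:

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # swap involution on R^2
v = np.array([3.0, 5.0])

v_plus  = 0.5 * (v + A @ v)         # component in the +1 eigenspace
v_minus = 0.5 * (v - A @ v)         # component in the -1 eigenspace

assert np.allclose(A @ v_plus,  v_plus)    # eigenvector, eigenvalue  1
assert np.allclose(A @ v_minus, -v_minus)  # eigenvector, eigenvalue -1
assert np.allclose(v_plus + v_minus, v)    # the decomposition recovers v
```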

  • Could you provide an example where using this decomposition in your answer reveals something interesting? – Poitou-Tate May 18 '24 at 07:29
  • @Pont Every square matrix is uniquely the sum of a symmetric and a skew-symmetric matrix. (Those are the $+1$ and $-1$ eigenspaces of the transpose operator.) Every real-valued function of one variable whose domain has the form $[-a, a]$ for some real $a$ is uniquely the sum of an even function and an odd function. (Those are the $+1$ and $-1$ eigenspaces of "domain reflection", $Tf(x) = f(-x)$, on the space of all functions.) Is that the type of example you seek? – Andrew D. Hwang May 18 '24 at 12:49
  • @Andrew D. Hwang Can we apply this to this problem? https://math.stackexchange.com/questions/1829332/prove-that-if-v-is-finite-dimensional-then-v-is-even-dimensional – Poitou-Tate May 19 '24 at 20:28
  • Let $V$ be an $\Bbb{R}$-vector space with an operator $f$ satisfying $f^2=-1$. Decompose $V$ into $V^+$ and $V^-$ as you do. Do you think we can prove $V^+$ and $V^-$ have the same parity? – Poitou-Tate May 19 '24 at 20:30
  • 1
    @Pont On one hand, $f^2=-1$ is a completely different situation than $f^2=1$: There are no real eigenvalues at all. On the other hand, complexifying $V$ (tensoring with $\mathbf{C}$ over the reals) does lead to a similar decomposition into $\pm i$ eigenspaces, and these do have the same dimension (e.g., because complex conjugation is a real-linear isomorphism between them). – Andrew D. Hwang May 19 '24 at 21:47
  • Thank you very much, I should have written $V^i, V^{-i}$. Could you elaborate on why the dimensions of $V^i$ and $V^{-i}$ are the same? – Poitou-Tate May 19 '24 at 22:16
  • @Pont Does the answer here help? If not, maybe post a new question...? – Andrew D. Hwang May 19 '24 at 22:44
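The symmetric/skew-symmetric example mentioned in the comments above can be checked directly; a minimal sketch (the matrix $M$ is an arbitrary choice of mine):

```python
import numpy as np

# Transpose is an involution on square matrices; its +1 eigenspace is the
# symmetric matrices and its -1 eigenspace is the skew-symmetric matrices.
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
S = 0.5 * (M + M.T)   # symmetric part  (+1 eigenspace of transpose)
K = 0.5 * (M - M.T)   # skew part       (-1 eigenspace of transpose)

assert np.allclose(S, S.T)     # S is symmetric
assert np.allclose(K, -K.T)    # K is skew-symmetric
assert np.allclose(S + K, M)   # the decomposition recovers M
```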
1

Hint: If $T=T^{-1}$ then $T^2-1=0$. So, the minimal polynomial of $T$ divides $x^2-1$, and thus is separable and splits over $\mathbb{R}$. So, $T$ is diagonalizable.
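A quick numerical illustration of the hint (my addition; the matrix $T$ below is an arbitrary non-diagonal involution): its eigenvalues are roots of $x^2-1$, hence $\pm 1$, and it is diagonalizable.

```python
import numpy as np

T = np.array([[ 2.0,  3.0],
              [-1.0, -2.0]])        # a non-obvious involution: T @ T = I
assert np.allclose(T @ T, np.eye(2))

# Eigenvalues are roots of x^2 - 1, so they are +1 and -1, and T is
# diagonalizable: T = V diag(w) V^{-1}.
w, V = np.linalg.eig(T)
assert np.allclose(np.sort(w), [-1.0, 1.0])
assert np.allclose(V @ np.diag(w) @ np.linalg.inv(V), T)
```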

Alex Youcis