
$\newcommand{\GL}{\operatorname{GL}}$

Let $V$ be a real $n$-dimensional vector space. For $1<k<n$ we have a natural representation of $\GL(V)$ via the $k$-th exterior power:

$\rho:\GL(V) \to \GL(\bigwedge^kV)$, given by $\rho(A)=\bigwedge^k A$. I am trying to show $\rho$ is an irreducible representation. Let $0\neq W \le \bigwedge^kV$ be a subrepresentation. If we can show $W$ contains a non-zero decomposable element, we are done.

Indeed, suppose $W \subsetneq \bigwedge^kV$. Since the decomposable elements span $\bigwedge^kV$, there exists a decomposable element $\sigma=v_1 \wedge \dots \wedge v_k \neq 0$ such that $\sigma \notin W$. By assumption, $W$ contains a non-zero decomposable element $\sigma'=u_1 \wedge \dots \wedge u_k \neq 0$. Since $\sigma'$ and $\sigma$ are non-zero, the $u_i$ are linearly independent and so are the $v_i$; hence we can define $A \in \GL(V)$ by extending the assignment $u_i \mapsto v_i$ to a bijection between bases. Then

$$\rho(A) (\sigma')=\bigwedge^k A(u_1 \wedge \dots \wedge u_k)=\sigma \notin W,$$

while $\sigma' \in W$, contradicting the fact that $W$ is invariant under $\rho(A)$. Hence $W=\bigwedge^kV$.
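
For a concrete instance of the map $A$ used above (an illustration I am adding, not part of the original argument): take $V=\mathbb{R}^3$, $k=2$, $\sigma'=e_1\wedge e_2\in W$ and $\sigma=v_1\wedge v_2\notin W$. Pick any $v_3$ completing $v_1,v_2$ to a basis and let $A\in\GL(V)$ send $e_1,e_2,e_3$ to $v_1,v_2,v_3$; then

$$\Big(\textstyle\bigwedge^2 A\Big)(e_1\wedge e_2)=Ae_1\wedge Ae_2=v_1\wedge v_2=\sigma.$$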

So, the question reduces to the following: Why does every non-zero subrepresentation contain a non-zero decomposable element?

I asked an even more naive question here: whether or not every subspace of dimension greater than $1$ contains a non-zero decomposable element.

Asaf Shachar
    A nice conceptual way to work with this is to first decompose the representation as a module for the group of diagonal matrices (into so-called weight spaces). Then note what happens to the weight of a vector in one of these subspaces when one acts by suitable upper triangular unipotent matrices. – Tobias Kildetoft Jan 03 '19 at 09:09
  • Thanks. Unfortunately, I really know almost nothing about the machinery of representation theory. Can you please elaborate on this or give me a reference? (I don't know what a weight of a vector is, and naive googling only found something in the context of representations of Lie algebras, not Lie groups). – Asaf Shachar Jan 03 '19 at 09:17
  • The definition is essentially the same. The representation decomposes as a sum of $1$-dimensional subspaces, and a vector in such a subspace will be acted on via a scalar. This scalar depends on the element acting, giving a linear character of the subgroup of diagonal matrices, and this character is what is called the weight of the vector. It may be a bit much to get into if none of this is familiar, but I would still advise you to try writing it up explicitly for $k=1$ when $\dim(V) = 2$ to get a feel for what happens. – Tobias Kildetoft Jan 03 '19 at 09:43
  • A reference that answers your question is problem 7 on page 340 of Knapp's book "Lie groups - beyond an introduction". – the_lar Nov 27 '20 at 20:35
  • @the_lar Thank you; is there a full proof there, or an outline of a proof, or just the statement? – Asaf Shachar Nov 29 '20 at 08:27
  • @AsafShachar well it is a problem so no full solution. But there are hints which are very suggestive if you read that section. – the_lar Nov 30 '20 at 00:48

2 Answers


Pick a basis $e_1, \dots, e_n$ of $V$ so that we can identify $GL(V)$ with $GL_n(F)$ (we'll start out working with an arbitrary base field $F$ and then restrict $F$ later). Write $T$ for the subgroup of $GL_n(F)$ consisting of diagonal matrices. An element of $T$ is determined by its diagonal entries $(t_1, \dots, t_n)$ and acts on $\Lambda^k(V)$ by sending each $e_i$ to $t_i e_i$ and extending multiplicatively.

What this means is that each pure tensor $e_{i_1} \wedge e_{i_2} \wedge \dots \wedge e_{i_k} \in \Lambda^k(V)$ is a simultaneous eigenvector for every element of $T$; said another way, it spans a $1$-dimensional (hence simple) subrepresentation of $\Lambda^k(V)$, considered as a representation of $T$. (These are the "weight spaces" of this representation.) Since $\Lambda^k(V)$ is the direct sum of these $1$-dimensional subspaces, it follows that $\Lambda^k(V)$ is semisimple as a representation of $T$.
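
To see these weight vectors concretely, here is a minimal numerical sketch I am adding (the helper `wedge_power`, the dimensions, and the specific entries are my own choices, not from the answer): it builds the matrix of $\Lambda^k(A)$ in the basis $\{e_{i_1} \wedge \cdots \wedge e_{i_k}\}$ out of $k \times k$ minors and checks that a diagonal matrix acts diagonally, with eigenvalue $t_{i_1} \cdots t_{i_k}$ on each basis vector.

```python
import itertools
import numpy as np

def wedge_power(A, k):
    """Matrix of wedge^k(A) in the ordered basis {e_S : S a k-element subset of {0, ..., n-1}}."""
    n = A.shape[0]
    subsets = list(itertools.combinations(range(n), k))
    M = np.zeros((len(subsets), len(subsets)))
    for col, S in enumerate(subsets):            # source basis vector e_S
        for row, R in enumerate(subsets):        # coefficient of e_R in wedge^k(A) e_S
            M[row, col] = np.linalg.det(A[np.ix_(R, S)])   # k x k minor with rows R, columns S
    return M, subsets

n, k = 4, 2
t = np.array([2.0, 3.0, 5.0, 7.0])
D, subsets = wedge_power(np.diag(t), k)
expected = np.diag([np.prod(t[list(S)]) for S in subsets])
print(np.allclose(D, expected))                  # True: each e_S is an eigenvector of every diagonal matrix
```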

The significance of semisimplicity is that any $GL(V)$-subrepresentation of $\Lambda^k(V)$ is also a $T$-subrepresentation, and subrepresentations of semisimple representations are semisimple; they must also have the same simple components, in the same or smaller multiplicities. Moreover, if $F$ is any field except $\mathbb{F}_2$ (over $\mathbb{F}_2$, unfortunately, $T$ is the trivial group), the different $1$-dimensional representations above are all nonisomorphic. The conclusion from here is that any $GL(V)$-subrepresentation of $\Lambda^k(V)$ must be a direct sum of weight spaces.

But now we're done (again, for any field $F$ except $\mathbb{F}_2$), for example because $GL(V)$ acts transitively on these weight spaces.
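
To spell out that last step (my own elaboration of the answer's final sentence): if $S = \{a_1 < \dots < a_k\}$ and $S' = \{b_1 < \dots < b_k\}$ are two index sets, then any $g \in GL(V)$ with $g e_{a_j} = e_{b_j}$ for all $j$ (a permutation matrix will do) satisfies

$$\Big(\textstyle\bigwedge^k g\Big)(e_{a_1} \wedge \dots \wedge e_{a_k}) = e_{b_1} \wedge \dots \wedge e_{b_k},$$

so a non-zero $GL(V)$-subrepresentation, being a direct sum of weight lines, contains one of the weight vectors $e_{i_1} \wedge \dots \wedge e_{i_k}$ and therefore all of them, i.e. it is all of $\Lambda^k(V)$.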

Qiaochu Yuan
  • Thanks. Can you please elaborate on your use of the fact that the different $1$-dimensional representations are all nonisomorphic? How do you conclude from that that any $GL(V)$-subrepresentation is a direct sum of some of the $1$-dimensional weight spaces? – Asaf Shachar Jan 08 '19 at 11:38

Consider a basis $e_1$, $\ldots$, $e_n$ of $V$.

There exists an $n\times n$ diagonal matrix $D=\operatorname{diag}(t_1, \ldots, t_n)$ such that all of the products $t_A := \prod_{i \in A} t_i$ are distinct, for $A\subset \{1,\ldots, n\}$, $|A|=k$ (no problem if the field $F$ of definition is infinite).

Note we have $$(\wedge^k D) (e_{i_1} \wedge \cdots \wedge e_{i_k}) = t_{i_1} \cdots t_{i_k} e_{i_1} \wedge \cdots \wedge e_{i_k}$$ that is $$(\wedge^k D) e_A = t_A \cdot e_A$$

Now if $W$ is a subspace invariant under $\wedge^k D$ and $\sum_{|A|=k} c_A e_A \in W$, then every $c_A e_A \in W$ (just apply $\wedge^k D$ several times, obtain a big Vandermonde system, and solve it). That should do it.
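
A quick numerical sanity check of this argument (my own sketch, not part of the answer; the variable names and the choice of the $t_i$ as small primes are assumptions for illustration, and the example is kept small so the floating-point Vandermonde solve stays well conditioned). Distinct primes make the products $t_A$ pairwise distinct by unique factorization.

```python
import itertools
import numpy as np

n, k = 3, 2
t = np.array([2.0, 3.0, 5.0])                            # distinct primes => distinct products t_A
subsets = list(itertools.combinations(range(n), k))
m = len(subsets)                                         # m = C(n, k)
t_A = np.array([np.prod(t[list(S)]) for S in subsets])   # eigenvalue of e_A under wedge^k(D): 6, 10, 15

rng = np.random.default_rng(0)
c = rng.standard_normal(m)                               # coordinates of w = sum_A c_A e_A in the e_A basis

# Row j of M holds the coordinates of (wedge^k D)^j w for j = 0, ..., m-1,
# computed via the eigenvalues t_A; all of these vectors lie in W.
M = np.array([t_A**j * c for j in range(m)])
vand = np.vander(t_A, increasing=True).T                 # Vandermonde matrix, vand[j, A] = t_A^j
components = np.linalg.solve(vand, M)                    # row A is the coordinate vector of c_A e_A
print(np.allclose(components, np.diag(c)))               # True: each c_A e_A is a combination of the iterates
```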

orangeskid