
This is related to the Cartan-Dieudonné theorem.

In a Clifford algebra over an $n$-dimensional real vector space $V$ with a positive-semidefinite quadratic form ($a\cdot a\geq0$ for all $a\in V$), can any product of vectors $a_1,a_2,\cdots,a_m\in V$ be written as

$$a_1a_2\cdots a_m=a_1'a_2'\cdots a_k'$$

for some $k\leq n$ and $a_1',a_2',\cdots,a_k'\in V$?

Due to the $\mathbb Z_2$-grading on the algebra, necessarily $k\equiv m\bmod2$. It suffices to prove that a product of $n+1$ vectors can be reduced to a product of $n-1$ vectors: applying this repeatedly shortens any longer product until at most $n$ factors remain.


For $n=1$ dimension, a product of $2$ vectors is a scalar: $ab=a\cdot b+a\wedge b=a\cdot b+0$, since $a$ and $b$ are linearly dependent. The empty product of $0$ vectors is $1$, not generally $a\cdot b$; so we allow the product to carry a scalar in front. This makes no difference in higher dimensions, where the scalar can be absorbed into one of the vectors.

For $n=2$, a product of $3$ vectors is

$$abc=(a\cdot b+a\wedge b)c=(a\cdot b)c+(a\wedge b)\cdot c+(a\wedge b)\wedge c$$

$$=(a\cdot b)c+a(b\cdot c)-(a\cdot c)b+a\wedge b\wedge c$$

$$=(a\cdot b)c-(a\cdot c)b+(b\cdot c)a+0$$

which is a sum of vectors and thus a single vector $a'$.
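
As a quick sanity check (not part of the argument), here's a minimal hand-rolled multivector multiplier in Python, confirming numerically that a product of three random vectors in the Euclidean plane is grade 1. The helper names `blade_mul`, `mv_mul`, `vec` are my own toy code, not from any library:

```python
import random

def blade_mul(A, B, sig):
    """Product of basis blades A, B (sorted index tuples); returns (scalar, blade)."""
    s, lst = 1, list(A) + list(B)
    for _ in range(len(lst)):              # bubble sort; each swap of distinct
        for j in range(len(lst) - 1):      # generators flips the sign
            if lst[j] > lst[j + 1]:
                lst[j], lst[j + 1] = lst[j + 1], lst[j]
                s = -s
    out, i = [], 0
    while i < len(lst):                    # contract repeated indices: e_i^2 = sig[i]
        if i + 1 < len(lst) and lst[i] == lst[i + 1]:
            s *= sig[lst[i]]
            i += 2
        else:
            out.append(lst[i])
            i += 1
    return s, tuple(out)

def mv_mul(x, y, sig):
    """Geometric product of multivectors stored as {blade: coefficient}."""
    out = {}
    for A, ca in x.items():
        for B, cb in y.items():
            s, C = blade_mul(A, B, sig)
            out[C] = out.get(C, 0.0) + s * ca * cb
    return {k: v for k, v in out.items() if abs(v) > 1e-12}

def vec(coords):
    """Grade-1 multivector from a list of coordinates."""
    return {(i,): float(c) for i, c in enumerate(coords) if c != 0}

sig = [1, 1]                               # n = 2, positive-definite
a, b, c = (vec([random.gauss(0, 1) for _ in range(2)]) for _ in range(3))
abc = mv_mul(mv_mul(a, b, sig), c, sig)
assert all(len(blade) == 1 for blade in abc)   # only grade-1 terms survive
print(abc)                                 # a single vector a'
```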

Now assume that $V$ is positive-definite, and, for induction, that we've proven the result in all dimensions less than $n$. We have $n+1$ vectors $a_i$ in $n$ dimensions, which are thus linearly dependent:

$$c_1a_1+c_2a_2+\cdots+c_{n-1}a_{n-1}+c_na_n+c_{n+1}a_{n+1}=0$$

for some scalars $c_i$ not all $0$. Use this to define a vector

$$v=c_1a_1+\cdots+c_{n-1}a_{n-1}=-c_na_n-c_{n+1}a_{n+1}.$$

If $a_n,a_{n+1}$ are dependent, then their product is a scalar, and $a_1\cdots a_{n-1}a_na_{n+1}=a_1\cdots a_{n-1}(a_n\cdot a_{n+1})$, and the problem is solved. So assume $a_n,a_{n+1}$ are independent.

Similarly, if $a_1,\cdots,a_{n-1}$ are dependent, then they're contained in a subspace of dimension $\leq n-2$, and by induction we've solved the problem in that case: $a_1\cdots a_{n-1}=a_1'\cdots a_{n-3}'$, so $a_1\cdots a_{n+1}=a_1'\cdots a_{n-3}'a_na_{n+1}$ is a product of $n-1$ vectors. So assume $a_1,\cdots,a_{n-1}$ are also independent.

If $v=0$ then these independences give $c_1=\cdots=c_{n-1}=0$ and $c_n=c_{n+1}=0$, a contradiction. Thus $v\neq0$, and by positive-definiteness $v$ has an inverse $(v\cdot v)^{-1}v$. Then we can write

$$a_1\cdots a_{n-1}a_na_{n+1}=(a_1\cdots a_{n-1}v^{-1})(va_na_{n+1});$$

the factor on the left is a product of $n$ vectors lying in the $(n-1)$-dimensional span of $a_1,\cdots,a_{n-1}$, which by induction (and the parity constraint) is a product of at most $n-2$ vectors; and the factor on the right is a product of $3$ vectors lying in the $2$-dimensional span of $a_n,a_{n+1}$, which is a single vector; so we get a product of $n-2+1=n-1$ vectors. This completes the proof.
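
To see the pivot trick concretely (again a numerical illustration only, reusing the `vec` and `mv_mul` helpers from the sketch above, plus two more toy helpers `prod` and `mv_close` of my own): for $n=3$, four random vectors are automatically dependent, and inserting $v^{-1}v$ splits the product exactly as claimed:

```python
import numpy as np

def prod(factors, sig):
    """Geometric product of a list of multivectors."""
    out = {(): 1.0}
    for f in factors:
        out = mv_mul(out, f, sig)
    return out

def mv_close(x, y, tol=1e-9):
    """Compare two multivectors up to numerical noise."""
    return all(abs(x.get(k, 0) - y.get(k, 0)) < tol for k in set(x) | set(y))

sig = [1, 1, 1]
A = np.random.randn(3, 4)              # columns a_1..a_4: four vectors in 3D
c = np.linalg.svd(A)[2][-1]            # dependency coefficients: A @ c ≈ 0
vco = A[:, :2] @ c[:2]                 # v = c_1 a_1 + c_2 a_2 = -c_3 a_3 - c_4 a_4
v = vec(vco)
vinv = {k: co / (vco @ vco) for k, co in v.items()}   # v^{-1} = (v.v)^{-1} v

a = [vec(A[:, j]) for j in range(4)]
lhs = prod(a, sig)
rhs = mv_mul(prod([a[0], a[1], vinv], sig), prod([v, a[2], a[3]], sig), sig)
assert mv_close(lhs, rhs)
# the right factor v a_3 a_4 lies in a 2D subspace, hence is a single vector:
assert all(len(k) == 1 for k in prod([v, a[2], a[3]], sig))
```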

Is the result still true when $V$ is only positive-semidefinite?


1 Answer


No, it's not true. Take $V$ to have signature $+00$, that is, an orthogonal basis with $e_1^2=1$, $e_2^2=0$, $e_3^2=0$. Then the product of four unit vectors

$$e_1(e_1+e_2)(e_1+e_3)(e_1-e_2+e_3)=1+e_2e_3$$

cannot be written as a product of two vectors: if $1+e_2e_3=ab$, then $a\wedge b$ would have to be $e_2e_3$, so $a$ and $b$ would lie in the span of $e_2,e_3$; but then their dot product would be $0$, not $1$.
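
This product is easy to check mechanically; here's a sketch reusing the toy `vec`/`mv_mul`/`prod` helpers from the question above (my own code, not a library):

```python
sig = [1, 0, 0]                        # e1^2 = 1, e2^2 = e3^2 = 0
factors = [vec([1, 0, 0]), vec([1, 1, 0]),
           vec([1, 0, 1]), vec([1, -1, 1])]
print(prod(factors, sig))              # {(): 1.0, (1, 2): 1.0}, i.e. 1 + e2 e3
```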

This phenomenon requires a 2D degenerate subspace. If $V$ has signature of the form ${++}\cdots{+0}$, with just a 1D degenerate subspace, the result is true.


Consider two vectors $a,b$. If $a$ is null ($a^2=0$) and $b$ is not, then we can move the null vector to the right in the product:

$$ab=b(b^{-1}ab)=ba'$$

where $a'^2=a^2=0$.

If $a$ and $b$ are both null, then they both lie in the 1D degenerate subspace, say $a=\alpha e_n$ and $b=\beta e_n$, and their product is $ab=\alpha\beta e_n^2=0$.

So in the product $a_1a_2\cdots a_{n+1}$ any null factors can be moved to the right side, and the product vanishes if there's more than one such factor.
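
Here's a quick check of this conjugation rule with the same toy helpers, in signature ${+}{+}0$: the null factor moves across an invertible one, and the result still squares to $0$:

```python
sig = [1, 1, 0]                        # signature ++0; e3 spans the degenerate line
a, b = vec([0, 0, 1]), vec([1, 1, 0])  # a is null, b*b = 2
binv = {k: co / 2.0 for k, co in b.items()}
ap = mv_mul(mv_mul(binv, a, sig), b, sig)                # a' = b^{-1} a b
assert mv_close(mv_mul(a, b, sig), mv_mul(b, ap, sig))   # ab = ba'
assert mv_mul(ap, ap, sig) == {}                         # a'^2 = 0: still null
print(ap)                              # -e3 in this example
```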

Suppose there's exactly one null factor $a_{n+1}\propto e_n$. All of the other vectors can be written in the form $a_i=a_i^++a_i^\circ e_n$, where $a_i^+$ is in the span of the other basis vectors $e_1,\cdots,e_{n-1}$ and $a_i^\circ$ is a scalar. Then $a_ia_{n+1}=a_i^+a_{n+1}$, and, anticommuting $a_{n+1}$ past the nearer factors to reach the farther ones (the two signs from moving it left and back right cancel), the product simplifies to

$$a_1a_2\cdots a_na_{n+1}=(a_1^+a_2^+\cdots a_n^+)a_{n+1}.$$

The expression in parentheses is a product of $n$ vectors in a positive-definite space of dimension $n-1$, which the OP showed is a product of $n-2$ vectors. Including $a_{n+1}$ makes a product of $n-1$ vectors.
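
Numerically, with the same helpers, $n=3$ and signature ${+}{+}0$: dropping the $e_3$-components of the non-null factors doesn't change the product:

```python
import random

sig = [1, 1, 0]
raw = [[random.gauss(0, 1) for _ in range(3)] for _ in range(3)]
a = [vec(r) for r in raw]                  # three generic non-null factors
aplus = [vec(r[:2] + [0]) for r in raw]    # same factors with e3-parts dropped
null = vec([0, 0, 1])                      # the single null factor, proportional to e3
assert mv_close(prod(a + [null], sig), mv_mul(prod(aplus, sig), null, sig))
```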

So in what follows we can assume that there are no null factors, i.e. that each $a_i^+\neq0$.


Consider two vectors $a,b$, neither null. As before, write $a=a^++a^\circ e_n$, where $a^+$ is in the span of $e_1,\cdots,e_{n-1}$ and $a^\circ$ is a scalar, and similarly $b=b^++b^\circ e_n$. We want $a=a^+$, that is, $a^\circ=0$; if $a^\circ\neq0$ then we'll look for vectors $x,y$ such that $ab=xy$ and $x^\circ=0$.

Since $a$ and $b$ are invertible, so is $ab$, and thus $x$ and $y$ must be invertible too; so $ab=xy$ is equivalent to $x^{-1}ab=y$. And since $y$ is otherwise an arbitrary vector, this is equivalent to $x^{-1}ab$ having a vanishing trivector part: $x^{-1}\wedge a\wedge b=0$. Because $x^{-1}=(x\cdot x)^{-1}x$ is a scalar multiple of $x$, this is further equivalent to $x$ being in the span of $a,b$ (assuming they're independent; if they're dependent then trivially $ab=a\cdot b=\big((a\cdot b)e_1\big)e_1$). Thus, for some scalars $c_1,c_2$,

$$x=c_1a+c_2b=(c_1a^++c_2b^+)+(c_1a^\circ+c_2b^\circ)e_n$$

and the coefficient of $e_n$ must vanish. This has a solution

$$x=b^\circ a-a^\circ b=b^\circ a^+-a^\circ b^+$$

which is non-zero because $a^\circ\neq0$ and $a,b$ are independent; since also $x^\circ=0$, we get $x^2=(x^+)^2>0$, so $x$ is invertible as required.
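
Checking the pair rewrite with the same helpers: pick $a,b$ with $a^\circ\neq0$, form $x=b^\circ a-a^\circ b$ and $y=x^{-1}ab$, and confirm that $x^\circ=0$, that $y$ is a vector, and that $xy=ab$:

```python
sig = [1, 1, 0]
ra, rb = [0.3, -1.2, 0.7], [0.5, 0.4, -0.9]   # a has nonzero e3-component 0.7
a, b = vec(ra), vec(rb)
xc = [rb[2] * p - ra[2] * q for p, q in zip(ra, rb)]  # x = b_circ * a - a_circ * b
assert xc[2] == 0                       # the e3-component of x vanishes exactly
x = vec(xc)
xinv = {k: co / sum(t * t for t in xc[:2]) for k, co in x.items()}
y = mv_mul(mv_mul(xinv, a, sig), b, sig)              # y = x^{-1} a b
assert all(len(k) == 1 for k in y)                    # y is a vector
assert mv_close(mv_mul(x, y, sig), mv_mul(a, b, sig)) # xy = ab
print(x, y)
```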

So, working from left to right in the product $a_1a_2\cdots a_{n+1}$, any pair $a_ia_{i+1}$ with $a_i^\circ\neq0$ can be rewritten so that $a_i'^\circ=0$; this leaves only $a_{n+1}'$ possibly not of that form. Now we have, as before, a product of $n$ vectors $a_1'a_2'\cdots a_n'$ in a positive-definite space of dimension $n-1$, which reduces to a product of $n-2$ vectors, and the remaining factor $a_{n+1}'$ makes it a product of $n-1$ vectors.
