I'm trying to build intuition for the orientation of a set of vectors independently of the well-known definition of the determinant. My intuition goes something like this: any set of vectors can be transformed into the standard basis by a sequence of elementary transformations; shears and positive rescalings should not change the sign of the set of vectors, while swaps and negative rescalings flip it.
Let $v_1, \ldots, v_N \in \mathbb{R}^N$. Let $V$ be the matrix whose $i$-th column is $v_i$. I would like to define a sign function which maps a square matrix $V$ to the set $\{-1, 0, +1\}$. I'll give two candidates for such a function, which I'll call $S_1(V)$ and $S_2(V)$. The hope is that $S_{1,2}(V) = \text{sgn}(\text{det}(V))$, but I would like to establish this without relying on any well-known formulas for $\text{det}(V)$ such as
$$ \text{det}(V) = \sum_{\sigma \in S_N} \text{sgn}(\sigma)\, v_{1, \sigma(1)} \ldots v_{N, \sigma(N)} = \sum_{i_1, \ldots, i_N=1}^N \epsilon_{i_1\ldots i_N} v_{1, i_1}\ldots v_{N, i_N} $$
I define $S_{1,2}(V)$ as follows.
If $V$ is not invertible then $S_{1,2}(V) = 0$.
If $V$ is invertible we proceed as follows. There are three types of elementary matrices.
- $E^1_i(c)$ is the identity but with $c$ at the $(i,i)$ position instead of 1.
- $E^2_{ij}(c)$, with $i \neq j$, is the identity but with $c$ at the $(i, j)$ position instead of 0.
- $E^3_{ij}$ swaps rows $i$ and $j$.
If $V$ is an elementary matrix, define: if $V=E_i^1(c)$ then $S_{1,2}(V) = \text{sgn}(c)$; if $V=E_{ij}^2(c)$ then $S_{1,2}(V)=+1$; if $V=E_{ij}^3$ then $S_{1,2}(V)=-1$.
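To make the three types concrete, here is a small NumPy sketch; the names `E1`, `E2`, `E3`, and `S_elementary` are my own, not from any library.

```python
import numpy as np

def E1(N, i, c):
    """Identity with c at (i, i) instead of 1: rescales row i by c."""
    E = np.eye(N)
    E[i, i] = c
    return E

def E2(N, i, j, c):
    """Identity with c at (i, j), i != j: a shear."""
    E = np.eye(N)
    E[i, j] = c
    return E

def E3(N, i, j):
    """Identity with rows i and j swapped."""
    E = np.eye(N)
    E[[i, j]] = E[[j, i]]
    return E

def S_elementary(kind, c=None):
    """The sign assigned above to each elementary type."""
    if kind == 1:
        return int(np.sign(c))   # sgn(c) for E^1_i(c)
    if kind == 2:
        return +1                # shears preserve orientation
    return -1                    # swaps reverse it
```

For example, `E3(3, 0, 2)` applied to a vector swaps its first and last components, and `S_elementary(3)` is $-1$.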
If $V$ is invertible but not elementary my definitions for $S_1(V)$ and $S_2(V)$ differ.
For $S_1(V)$: if $V$ is invertible but not elementary, then $V$ can be decomposed into elementary matrices by a fixed, deterministic Gauss-Jordan elimination procedure, which makes the decomposition unique:
$$ V = E_1\ldots E_K $$
In this case let
$$ S_1(V) = S_1(E_1)\ldots S_1(E_K) $$
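This recipe can be sketched numerically. The code below is my own (with an arbitrary tolerance `tol` standing in for an exact rank test): it runs a fixed partial-pivoting Gauss-Jordan sweep and multiplies the signs of the row operations it applies. Since each elementary matrix and its inverse have the same sign ($E^1_i(c)^{-1} = E^1_i(1/c)$, $E^2_{ij}(c)^{-1} = E^2_{ij}(-c)$, $E^3_{ij}$ is its own inverse), tracking the operations that reduce $V$ to $I$ gives the same product as $S_1(E_1)\cdots S_1(E_K)$, with no determinant formula in sight.

```python
import numpy as np

def S1(V, tol=1e-12):
    """Sign of V via a fixed Gauss-Jordan sweep: reduce V to the identity
    with elementary row operations, multiplying the signs of the
    operations as we go."""
    A = np.array(V, dtype=float)
    N = A.shape[0]
    sign = 1
    for col in range(N):
        # Fixed pivoting rule: largest-magnitude entry on or below the diagonal.
        p = col + int(np.argmax(np.abs(A[col:, col])))
        if abs(A[p, col]) < tol:
            return 0                     # not invertible
        if p != col:
            A[[col, p]] = A[[p, col]]    # an E^3 swap: contributes -1
            sign = -sign
        if A[col, col] < 0:              # an E^1 with negative c: contributes -1
            sign = -sign
        A[col] /= A[col, col]            # rescaling to pivot 1; sign handled above
        for r in range(N):
            if r != col:
                A[r] -= A[r, col] * A[col]   # E^2 shears: sign unchanged
    return sign
```

For instance, `S1(np.eye(3))` is $+1$, a single row swap gives $-1$, and a singular matrix gives $0$.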
For the second candidate, we define $S_2$ to have the property that
$$ S_2(AB) = S_2(A)S_2(B) $$
For $S_1$, it is clear that $S_1(V)$ is a well-defined function on all matrices, but it is not clear to me how to prove that $S_1(AB)=S_1(A)S_1(B)$.
For $S_2$, it is not clear to me how to prove that such a function exists. For example, $V$ can be decomposed into elementary matrices in multiple different ways:
$$ V = E_{1,1}\ldots E_{1,k_1} = E_{2,1}\ldots E_{2, k_2} $$
How does one prove that
$$ S_2(E_{1,1})\ldots S_2(E_{1,k_1}) = S_2(E_{2,1})\ldots S_2(E_{2, k_2})? $$
It is clear that if $S_2$ exists then $S_2(V) = S_1(V)$. If we relax the constraint of avoiding explicit expressions for $\text{det}(V)$, then all of the above can be proven and $S_1(V)=S_2(V) = \text{sgn}(\text{det}(V))$, but using those expressions is exactly what I'm trying to avoid.
So my questions are:
- How to show that $S_1(AB) = S_1(A)S_1(B)$?
- How to show that $S_2(V)$ exists?
- Is there another way to get at what I am seeking?