
I'm trying to build intuition for the orientation of a set of vectors independently of the well-known definition of the determinant. My intuition wants to go something like this: any basis can be transformed into the standard basis by a sequence of elementary transformations; shears and positive rescalings should not change the sign of the set of vectors, while swaps and negative rescalings should flip it.
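For example, in $\mathbb{R}^2$ a shear and a swap act on the standard basis as

$$ \begin{pmatrix} 1 & c \\ 0 & 1 \end{pmatrix} : (e_1, e_2) \mapsto (e_1, c\,e_1 + e_2), \qquad \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} : (e_1, e_2) \mapsto (e_2, e_1), $$

and geometrically the shear keeps the pair counterclockwise while the swap makes it clockwise.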

Let $v_1, \ldots, v_N \in \mathbb{R}^N$. Let $V$ be the matrix whose $i$-th column is $v_i$. I would like to define a sign function which maps a square matrix $V$ to the set $\{-1, 0, +1\}$. I'll give two candidates for a sign function, which I'll call $S_1(V)$ and $S_2(V)$. The hope is that $S_{1,2}(V) = \text{sgn}(\text{det}(V))$, but I would like to establish this without relying on any well-known formulas for $\text{det}(V)$ such as

$$ \text{det}(V) = \sum_{\sigma \in S_N} \text{sgn}(\sigma)\, v_{1, \sigma(1)} \cdots v_{N, \sigma(N)} = \sum_{i_1, \ldots, i_N=1}^N \epsilon_{i_1\ldots i_N}\, v_{1, i_1}\cdots v_{N, i_N} $$

I define $S_{1,2}(V)$ as follows.

If $V$ is not invertible then $S_{1,2}(V) = 0$.

If $V$ is invertible we proceed as follows. There are three types of elementary matrices.

  • $E^1_i(c)$ is the identity but with $c$ at the $(i,i)$ position instead of $1$ (a rescaling, $c \neq 0$).
  • $E^2_{ij}(c)$, $i \neq j$, is the identity but with $c$ at the $(i, j)$ position instead of $0$ (a shear).
  • $E^3_{ij}$, $i \neq j$, is the identity with rows $i$ and $j$ swapped (a swap).

If $V$ is an elementary matrix, define: if $V = E_i^1(c)$, then $S_{1,2}(V) = \text{sgn}(c)$; if $V = E_{ij}^2(c)$, then $S_{1,2}(V) = +1$; if $V = E_{ij}^3$, then $S_{1,2}(V) = -1$.
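For concreteness, in the $2 \times 2$ case these are

$$ E^1_1(c) = \begin{pmatrix} c & 0 \\ 0 & 1 \end{pmatrix}, \qquad E^2_{12}(c) = \begin{pmatrix} 1 & c \\ 0 & 1 \end{pmatrix}, \qquad E^3_{12} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, $$

with signs $\text{sgn}(c)$, $+1$, and $-1$ respectively.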

If $V$ is invertible but not elementary my definitions for $S_1(V)$ and $S_2(V)$ differ.

For $S_1(V)$: if $V$ is invertible but not elementary, then $V$ can be decomposed into a product of elementary matrices, and the decomposition is unique once we fix a well-defined Gauss-Jordan elimination procedure:

$$ V = E_1\ldots E_K $$

In this case let

$$ S_1(V) = S_1(E_1)\ldots S_1(E_K) $$
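Here is a minimal numerical sketch of $S_1$, assuming NumPy (the function name `S1`, the pivoting rule, and the tolerance `tol` are my own illustrative choices, not canonical). It runs a Gauss-Jordan pass and only tracks the signs of the elementary operations it applies:

```python
import numpy as np

def S1(V, tol=1e-12):
    """Sketch of S_1: reduce V to the identity by a fixed Gauss-Jordan
    procedure and multiply the signs of the elementary row operations used
    (swap -> -1, rescale by c -> sgn(c), shear -> +1). Since the inverse of
    each elementary matrix is one of the same type with the same sign, this
    equals the sign product over the decomposition V = E_1 ... E_K."""
    A = np.array(V, dtype=float)
    n = A.shape[0]
    sign = 1
    for k in range(n):
        # Partial pivoting: take the largest-magnitude pivot in column k.
        p = k + int(np.argmax(np.abs(A[k:, k])))
        if abs(A[p, k]) < tol:
            return 0                   # V is (numerically) singular
        if p != k:
            A[[k, p]] = A[[p, k]]      # E^3 swap: contributes -1
            sign = -sign
        if A[k, k] < 0:
            sign = -sign               # E^1 rescaling: contributes sgn(c)
        A[k] /= A[k, k]
        for r in range(n):
            if r != k:
                A[r] -= A[r, k] * A[k] # E^2 shear: contributes +1
    return sign

print(S1([[0, 1], [1, 0]]))  # -1: one swap away from the identity
```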

For the second candidate, we keep the values of $S_2$ on elementary matrices given above and additionally require $S_2$ to have the property that

$$ S_2(AB) = S_2(A)S_2(B) $$

For $S_1$, it is clear that $S_1(V)$ is a well-defined function on all matrices, but it is not clear to me how to prove that $S_1(AB) = S_1(A)S_1(B)$.

For $S_2$, it is not clear to me how to prove that such a function exists, because $V$ can be decomposed into elementary matrices in multiple different ways:

$$ V = E_{1,1}\ldots E_{1,k_1} = E_{2,1}\ldots E_{2, k_2} $$

How can one prove that

$$ S_2(E_{1,1})\ldots S_2(E_{1,k_1}) = S_2(E_{2,1})\ldots S_2(E_{2, k_2})? $$
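As a concrete instance of the consistency that has to be proven: the same matrix can be written as a rescaling times a swap in two different orders,

$$ \begin{pmatrix} 0 & 2 \\ 1 & 0 \end{pmatrix} = E^1_1(2)\, E^3_{12} = E^3_{12}\, E^1_2(2), $$

and both decompositions give the same sign, $(+1)(-1) = (-1)(+1) = -1$.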

It is clear that if $S_2$ exists then $S_2(V) = S_1(V)$. And if we relax the constraint that we avoid explicit expressions for $\text{det}(V)$, then all of the above can be proven and $S_1(V) = S_2(V) = \text{sgn}(\text{det}(V))$; but that is of course exactly what I'm trying to avoid.

So my questions are:

  • How can one show that $S_1(AB) = S_1(A)S_1(B)$? (A numerical sanity check is sketched below.)
  • How can one show that $S_2(V)$ exists?
  • Is there another way to get at what I am seeking?
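In case it is useful while probing the first question: with the `S1` sketch from above in scope (plus NumPy), multiplicativity at least survives a quick random test, though of course this proves nothing:

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    A = rng.standard_normal((4, 4))  # almost surely invertible
    B = rng.standard_normal((4, 4))
    assert S1(A @ B) == S1(A) * S1(B)
print("S1(AB) == S1(A) * S1(B) held for 1000 random pairs")
```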
  • There's a definition of the determinant (not just its sign) along the lines of what you're proposing. You can define it as a multilinear function of the columns (or rows) of a matrix that maps $I$ to $1$ and is alternating (in the sense that swapping two of the arguments negates the result). Such a function is provably unique, and thus must be the determinant. For an elementary matrix $E$ and square matrix $A$, you can use the properties to show $\det(EA) = \det(E)\det(A)$, which leads by induction to $\det(BA) = \det(B)\det(A)$ for invertible $B$. – Theo Bendit Dec 05 '21 at 01:56
  • @TheoBendit yes, I'm familiar with that definition of the determinant. The only proof I've seen for the existence of the normalized, alternating, multilinear function of $N$ vectors relies on a constructive definition of the function via formulas like the ones I gave in the question, which I'd like to avoid if possible. Maybe there is an alternative existence proof I haven't seen. – Jagerber48 Dec 05 '21 at 02:01
  • Another result that might be of interest to you: two bases have the same orientation if and only if each can be continuously deformed into the other (while remaining a basis at each intermediate stage). – blargoner Dec 05 '21 at 02:07
  • @blargoner yes, thank you! That fact establishes an equivalence relation on invertible matrices that could be used to define $S_3(V)$ in relation to the standard basis (represented by the matrix $I$). I think I'd want to see a proof that any basis can be deformed into either $I$ or $E_{12}^3$ as defined above, and to ensure that the proof can be worked out without relying on an explicit formula for the determinant. I'll think more about this. – Jagerber48 Dec 05 '21 at 02:11
  • Here is a thread exploring the suggestion from @blargoner. https://math.stackexchange.com/questions/2199509/intuition-for-bases-having-the-same-orientation – Jagerber48 Dec 05 '21 at 03:29
  • In researching this question I came across some critical references. "Down with Determinants!" by Sheldon Axler https://www.maa.org/sites/default/files/pdf/awards/Axler-Ford-1996.pdf and the book by the same author: "Linear Algebra Done Right" https://www.axler.net/. I believe this book has the answers to my questions, and once I work it out I will post answers here. – Jagerber48 Dec 05 '21 at 07:00
  • Everything here is sensible, but the annoying question of how to prove that $S_1$ and $S_2$ agree for any elementary decomposition of a matrix is basically the reason that we prefer to define the determinant in a more invariant way, say by a sum over permutations, or by a particular alternating multilinear function. It's quite easy to use one of these definitions to show that everything you've said here works, but quite difficult to go the other way. There is a common theme in maths: use an abstract definition to prove that more "computational" methods work, then use those methods to compute. – Joppy Dec 06 '21 at 05:33
