
Consider a ground field $k$ of characteristic zero, and the category of super vector spaces $\mathrm{SVect}_k$, whose objects are graded vector spaces $V = V_0 \oplus V_1$ and whose morphisms are the homomorphisms $f: V \to W$ satisfying $f(V_i) \subseteq W_i$.

Equivalently, consider a $k$-linear involution $J: V \to V$, $J^2 = 1$, which gives the eigenspace decomposition $V = V_0 \oplus V_1$ with $J|_{V_i} = (-1)^i$; the requirement that $f: V \to W$ respects the grading becomes $f \circ J_V = J_W \circ f$. This correspondence yields the equivalence of categories $\mathrm{Mod}_{\mathfrak{C}} \simeq \mathrm{Rep}_k(\mathbb{Z}/2\mathbb{Z}) \simeq \mathrm{SVect}_k$, where $\mathfrak{C} = k[\mathbb{Z}/2\mathbb{Z}] \cong k[j]/(j^2 - 1) = \mathfrak{C}_0 \oplus \mathfrak{C}_1 \cong k \oplus k$ is the paracomplex algebra over $k$ (a semisimple Artinian commutative ring), and $\mathfrak{C}_0, \mathfrak{C}_1$ are its two maximal ideals.
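Concretely (if I've set this up right), the grading comes from the orthogonal idempotents of $\mathfrak{C}$:

$$e_0 = \tfrac{1}{2}(1 + j), \qquad e_1 = \tfrac{1}{2}(1 - j), \qquad e_0 + e_1 = 1, \quad e_0 e_1 = 0, \quad e_i^2 = e_i,$$

so a $\mathfrak{C}$-module $V$ splits as $V = e_0 V \oplus e_1 V = V_0 \oplus V_1$, with $\mathfrak{C}_i = e_i \mathfrak{C} \cong k$.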

In particular, finitely generated modules ($\mathrm{FinMod}_\mathfrak{C} \simeq \mathrm{FinSVect}_k$) have a very nice description as $k^{m|n} \simeq \mathfrak{C}_0^m \oplus \mathfrak{C}_1^n$, which is neat because it highlights the fact that paracomplex structures don't have to be even-dimensional, as long as we're willing to give up free modules.

It's a pretty nice equivalence, all in all: it makes it really obvious that $\mathrm{FinSVect}_k$ is a fusion category, the free super vector space is just $FV = V \otimes_k \mathfrak{C}$, there's also $\{1, \Pi\} = \mathrm{Aut}_k(\mathfrak{C})$, and other cool stuff like that.

What I'm struggling with is understanding how the tensor product and the internal Hom work under this equivalence. I managed to calculate $U \otimes_{\mathfrak{C}} V = (U \otimes_k V) / (u_0 \otimes v_1 - u_1 \otimes v_0 \mid u \in U, v \in V)$. But I'm kind of stuck here: either I've made a mistake, or I just lack the necessary commutative algebra skills to really understand what this tensor product actually looks like and why the braiding homomorphism is the way it is.

I could use some hints, direction, textbook references, anything. For context, I'm a hobbyist with a rather spotty background in pure maths. Please help :)

1 Answer


Let's work generally to start with, because some features of this situation are so special that they're confusing. First, if $A$ is just a $K$-algebra, $\text{Mod}(A)$ is not equipped with a tensor product at all. If $A$ is commutative, $\text{Mod}(A)$ is equipped with a tensor product $\otimes_A$ over $A$, but when $A = K[j]/(j^2 - 1)$ this does not produce the super tensor product; instead it gives the "pointwise" tensor product, whose even part is $V_0 \otimes W_0$ and whose odd part is $V_1 \otimes W_1$. This is not the one we want.
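To make the "pointwise" claim explicit (a quick check using the idempotents $e_0 = \tfrac{1}{2}(1+j)$, $e_1 = \tfrac{1}{2}(1-j)$): in $V \otimes_A W$ the relation $a v \otimes w = v \otimes a w$ forces, for homogeneous $v, w$ of opposite parity,

$$v \otimes w = e_{|v|} v \otimes w = v \otimes e_{|v|} w = 0,$$

so only the matching-parity pieces survive and $V \otimes_A W \cong (V_0 \otimes_K W_0) \oplus (V_1 \otimes_K W_1)$.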

Instead the super tensor product can be thought of as coming from a Hopf algebra structure on $A$. Generally, if $G$ is a group, the group algebra $K[G]$ acquires a Hopf algebra structure with comultiplication given by

$$\Delta(g) = g \otimes g$$

and the comultiplication allows us to write down the tensor product of representations of $G$; if $V, W$ are two representations then the tensor product $V \otimes_K W$ over $K$ acquires a $G$-action where $g$ acts by $\Delta(g)$. This recipe works more generally for any Hopf algebra. There is also a structure on a Hopf algebra called the antipode, which here is $S(g) = g^{-1}$, and using the antipode produces an action on the space $[V, W]$ of linear maps $V \to W$ which is responsible for the existence of the internal hom.
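Concretely, in the group algebra case the two actions read

$$g \cdot (v \otimes w) = gv \otimes gw, \qquad (g \cdot f)(v) = g\, f(S(g) v) = g\, f(g^{-1} v) \quad \text{for } f \in [V, W],$$

so that the $G$-invariants of $[V, W]$ are precisely the $G$-equivariant maps $V \to W$.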

So, $A = K[j]/(j^2 - 1)$ has a Hopf algebra structure given by $\Delta(j) = j \otimes j$ and $S(j) = j^{-1} = j$, and you can check that this reproduces the tensor product and internal hom of super vector spaces. Concretely this boils down to checking that the tensor product of the $1$-dimensional purely even and purely odd vector spaces $1, -1$ satisfies $1 \otimes 1 \cong 1, 1 \otimes (-1) \cong (-1)$, and so forth, and similarly for the homs.
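Spelling that check out a bit: under $\Delta(j) = j \otimes j$, the element $j$ acts on $U \otimes_K V$ by $j_U \otimes j_V$, so on homogeneous elements

$$j \cdot (u \otimes v) = (ju) \otimes (jv) = (-1)^{|u| + |v|}\, u \otimes v,$$

which makes the even part $U_0 \otimes V_0 \oplus U_1 \otimes V_1$ and the odd part $U_0 \otimes V_1 \oplus U_1 \otimes V_0$, exactly the super grading. Similarly, using the antipode, $j$ acts on $f \in [V, W]$ by $j_W \circ f \circ j_V$, which is $(-1)^{|f|} f$ on homogeneous maps.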

This generalizes as follows: if $G$ is a finite abelian group and $A = K[G]$ is its group algebra over a field $K$ of characteristic not dividing $|G|$ and such that $K$ has all $|G|$-th roots of unity (for example, $K = \mathbb{C}$ always works), then the category of representations of $G$ over $K$, or equivalently of $K[G]$-modules, equipped with the tensor product of representations as above, is equivalent, as a monoidal category, to the category of $\hat{G}$-graded vector spaces, where $\hat{G} = \text{Hom}(G, K^{\times})$ is the Pontryagin dual of $G$, or equivalently the group of $1$-dimensional representations of $G$. This is a group non-canonically isomorphic to $G$ itself. For example, here $G = \mathbb{Z}/2$ has two $1$-dimensional representations over a field $K$ of characteristic $\neq 2$, namely the trivial representation $1$ and the sign representation $-1$, and these correspond to even vs. odd super vector spaces respectively.
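A sketch of why that works (under the stated hypotheses on $K$): every representation decomposes into isotypic components indexed by characters,

$$V = \bigoplus_{\chi \in \hat{G}} V_\chi, \qquad V_\chi = \{ v \in V : g \cdot v = \chi(g)\, v \text{ for all } g \in G \},$$

and $V_\chi \otimes W_\psi \subseteq (V \otimes W)_{\chi \psi}$, which is exactly the rule for tensoring $\hat{G}$-graded vector spaces.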

However, now we need to slow down and notice something very important:

Nothing we've said so far reproduces the braiding!

A Hopf algebra $H$ is cocommutative if the comultiplication $\Delta : H \to H \otimes H$ is invariant under switching the two factors of $H$. This is always true for group algebras, and when it's true the monoidal structure defined above is naturally symmetric monoidal. However, this does not produce the correct braiding on super vector spaces; it produces the "trivial" braiding, with no signs. All of the interesting features of super vector spaces have to do with the braiding.
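Explicitly, the braiding coming from cocommutativity is just the plain swap

$$c_{V, W}(v \otimes w) = w \otimes v,$$

whereas the braiding on super vector spaces is $v \otimes w \mapsto (-1)^{|v||w|}\, w \otimes v$ on homogeneous elements; that Koszul sign is exactly what the Hopf algebra structure by itself doesn't see.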

There is a general theory of what extra structure you have to give a Hopf algebra to equip its monoidal category of representations with a braiding; see quasitriangular Hopf algebra for more on this. However, I am not convinced this is a good way to think about the super braiding. Personally the way I think about the super braiding is that it's an approximation to the braiding on spectra (see the nLab for more on this), which is "God-given" in some sense and has nothing to do with a Hopf algebra of any kind (as far as I know). Of course this won't be everyone's cup of tea but I think it's the "correct explanation" for why the super braiding shows up everywhere, e.g. in cohomology, differential forms, etc.
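For what it's worth (with the caveat that I may be off by a normalization), the quasitriangular structure on $A = K[j]/(j^2 - 1)$ that does produce the super braiding is given by the $R$-matrix

$$R = \tfrac{1}{2}\left( 1 \otimes 1 + 1 \otimes j + j \otimes 1 - j \otimes j \right),$$

which acts on a homogeneous $v \otimes w$ by the scalar $(-1)^{|v||w|}$, so that $c_{V,W} = \tau \circ R$ (the swap applied after acting by $R$) is the Koszul-sign braiding.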

Said another way, the way you construct the algebra $A = K[j]/(j^2 - 1)$ abstractly from $\text{SVect}$ is by considering the forgetful functor $F : \text{SVect} \to \text{Vect}$ sending a super vector space $(V_0, V_1)$ to $V_0 \oplus V_1$. Then $A = \text{End}(F)$ is the endomorphism algebra of this functor. This is potentially useful for getting a basic handle on how $\text{SVect}$ works, at least as a monoidal category, but note that the functor $F$ is monoidal but not symmetric monoidal (because of the signs in the braiding). So this functor $F$ is not capturing the most interesting feature of the situation! Without discussing $F$ or $A$ at all you can define $\text{SVect}$ as a category as just the direct sum $\text{Vect} \oplus \text{Vect}$ of two copies of $\text{Vect}$ (even and odd). This tells you exactly how $\text{SVect}$ works as a category without any complications.
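As a quick sanity check on the $A = \text{End}(F)$ description: a natural endomorphism of $F$ is determined by a scalar on each of the two simple objects (the purely even and purely odd lines), so

$$\text{End}(F) \cong K \times K \cong K[j]/(j^2 - 1),$$

with $j$ corresponding to the natural transformation that acts by $+1$ on $V_0$ and by $-1$ on $V_1$.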

Qiaochu Yuan
  • Thanks! I have a follow-up question about the tensor product: I checked that the action of $\mathfrak{C}$ on $U \otimes V$ through $\mathfrak{C} \stackrel{\Delta}{\longrightarrow} \mathfrak{C} \otimes \mathfrak{C}$, given by $j(u \otimes v) = (ju) \otimes (jv)$, reproduces the twisted tensor product. What I don't quite understand is how it's compatible with $k$-linearity: am I correct in assuming that $bj(u \otimes v) = (bju) \otimes (jv)$, for example? – Aleksei Averchenko Jul 24 '24 at 17:28
  • @Aleksei: yes, scalar multiplication commutes with everything in sight. It's also equal to $(ju) \otimes (bjv)$ because we are taking the tensor product over $K$. (My personal preference is to use the capital $K$ for fields because I often want to use lowercase $k$ for an index.) – Qiaochu Yuan Jul 24 '24 at 17:55