
Recently I discovered that many of the basic results of linear algebra can be generalised to modules over a division ring, instead of modules over a field (i.e. vector spaces).

Because I don't know a good source that treats this topic in detail, and for the excitement of figuring out the results on my own, I tried to develop the basic results myself.

From now on let $S$ be such a division ring and $V$ an $S$-module ($S$-module is supposed to mean left $S$-module, just as $S$-algebra is supposed to mean left $S$-algebra). I started with the interesting question of classification and figured out: Zorn's lemma can be used just as in linear algebra, which means $V$ is free; dimension is an invariant/unique; and being a maximal linearly independent set, a minimal generating set, or a basis are all equivalent.

I continued with the question "Matrices = linear maps?" and figured out that this is also true (more precisely, $\text{End}_S(S^n) \cong S^{n\times n}$ as $S$-algebras), but you have to define matrix multiplication slightly differently:

Let $f: S^n \to S^m$ be $S$-linear and $e_i$ the standard basis vectors; the representing matrix $A_f$ is defined as $A_f:= (f(e_1),\dots, f(e_n))$. Note that this is actually a matrix, since $f(e_i) \in S^m$. The matrix-vector multiplication now has to be: $$A\cdot x= \begin{pmatrix} \sum_{j=1}^nx_ja_{1j} \\ \vdots \\ \sum_{j=1}^nx_ja_{mj} \end{pmatrix},$$ which can differ from the standard matrix multiplication, since $S$ does not have to be commutative. This leads to the matrix-matrix multiplication: $$A\in S^{m\times n},\ B\in S^{k\times m}:\quad B\cdot A := (B(Ae_1),\dots, B(Ae_n))=\begin{pmatrix} \sum_{l=1}^ma_{l1}b_{1l} & \cdots & \sum_{l=1}^ma_{ln}b_{1l} \\ \vdots && \vdots \\ \sum_{l=1}^ma_{l1}b_{kl} & \cdots & \sum_{l=1}^ma_{ln}b_{kl}\end{pmatrix}$$

That is, the $(i,j)$-th entry is $$(BA)_{ij} = \sum_{l=1}^ma_{lj}b_{il},$$ which is the usual formula except that the two factors in each summand are multiplied in the opposite order.
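As a sanity check, here is a minimal sketch (mine, purely for illustration; the names `Quaternion`, `apply_mat`, `mod_mul` and `std_mul` are made up) verifying over the quaternions, a noncommutative division ring, that this modified product matches composition of $S$-linear maps while the ordinary product does not:

```python
# Sanity check over the quaternions H (a noncommutative division ring):
# the modified product composes left-linear maps, the ordinary one does not.
# All names here (Quaternion, apply_mat, mod_mul, std_mul) are illustrative.

class Quaternion:
    """a + bi + cj + dk with integer coefficients."""
    def __init__(self, a, b, c, d):
        self.q = (a, b, c, d)
    def __add__(self, other):
        return Quaternion(*(x + y for x, y in zip(self.q, other.q)))
    def __mul__(self, other):
        a1, b1, c1, d1 = self.q
        a2, b2, c2, d2 = other.q
        return Quaternion(a1*a2 - b1*b2 - c1*c2 - d1*d2,
                          a1*b2 + b1*a2 + c1*d2 - d1*c2,
                          a1*c2 - b1*d2 + c1*a2 + d1*b2,
                          a1*d2 + b1*c2 - c1*b2 + d1*a2)
    def __eq__(self, other):
        return self.q == other.q

def qsum(terms):
    # sum a nonempty iterable of quaternions
    terms = list(terms)
    total = terms[0]
    for t in terms[1:]:
        total = total + t
    return total

def apply_mat(A, x):
    # (A.x)_i = sum_j x_j a_ij  -- the left-module convention above
    return [qsum(x[j] * A[i][j] for j in range(len(x))) for i in range(len(A))]

def mod_mul(B, A):
    # modified product: (BA)_ij = sum_l a_lj b_il
    return [[qsum(A[l][j] * B[i][l] for l in range(len(A)))
             for j in range(len(A[0]))] for i in range(len(B))]

def std_mul(B, A):
    # ordinary product: (BA)_ij = sum_l b_il a_lj
    return [[qsum(B[i][l] * A[l][j] for l in range(len(A)))
             for j in range(len(A[0]))] for i in range(len(B))]

i, j = Quaternion(0, 1, 0, 0), Quaternion(0, 0, 1, 0)
A, B, x = [[i]], [[j]], [i]       # 1x1 matrices already show the difference

lhs = apply_mat(B, apply_mat(A, x))            # B(A(x))
print(lhs == apply_mat(mod_mul(B, A), x))      # True
print(lhs == apply_mat(std_mul(B, A), x))      # False, since ij != ji
```

In the $1\times 1$ case this just says that applying $x\mapsto xa$ and then $x\mapsto xb$ gives $x\mapsto x(ab)$, not $x\mapsto x(ba)$.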

Now I get the beautiful linear algebra results $$f \in \text{Aut}_S(S^n) \iff A_f \text{ is invertible } \iff (A_fe_1,\dots, A_fe_n) \text{ is a basis/ lin. indep./ generating set},$$ left inverse $=$ right inverse, and so on.

Now the question:

The linked Wikipedia article mentions that, in general, left invertibility and right invertibility (and likewise the concrete left/right inverse matrices) do not coincide. But somehow it seems to use the same idea I did, or at least isn't precise enough. While browsing for an answer I found different results on this topic, so I hope someone experienced can help me out if I state exactly what I mean.

Did I make a mistake somewhere in this process (probably with matrix multiplication / composition of $S$-linear functions)? Because I can't find any. Or is it as I guess: they mean a different matrix multiplication, the canonical one, which usually does not give an $S$-linear map, and things get a lot more complicated if we don't adapt the matrix multiplication to this purpose.

Any help, idea or source/book would be highly appreciated.

rschwieb
  • 160,592
Tina
  • 1,170
  • @rschwieb yes, you're right, I forgot to mention that $S$-module is supposed to mean left $S$-module, just as $S$-algebra is supposed to mean left $S$-algebra – Tina Sep 25 '24 at 11:36
  • Great, thanks for elaborating. Incidentally, I'll mention there's not really any common definition of "left $S$-algebra." One could try "a ring $R$ such that $_SR$ is a left $S$-module," but a lot is lacking if the module action does not commute with the multiplication operation in $R$. – rschwieb Sep 25 '24 at 13:07
  • @rschwieb thank you for your kind answer! The definition I thought of was, similarly to $F$-algebras, a ring that is a left module and whose multiplication is $S$-bilinear. And to be honest, I don't see why this doesn't work here... – Tina Sep 25 '24 at 16:15
  • For this reason, $S$-algebras are almost always (in my experience) assumed to be over a commutative ring. – rschwieb Sep 25 '24 at 17:11
  • @rschwieb I see now, thanks a lot. I must have missed something somewhere when I "proved" that $\text{End}_S(V)$ is such an algebra, i.e. that composition is $S$-bilinear.

    By the way, I think what you wrote would be a good answer. It doesn't totally answer the question, but it helped me, and it's the most useful thing I got in the process of solving it.

    – Tina Sep 25 '24 at 18:07
  • OK, I moved them to a solution and even expanded it a little. – rschwieb Sep 25 '24 at 20:23

1 Answer


Here are a couple of references, although they are not much. Another one is mentioned in the comments (Gertrude Ehrlich's Fundamental Concepts of Abstract Algebra).

Regular matrix multiplication should be totally fine. For any ring $R$ (with identity) and $F=\oplus_{i=1}^nR$, we have $\text{End}(F_R)\cong M_n(R)$, using the matrices on the left of column vectors. Or, if you prefer, $\text{End}(_RF)\cong M_n(R)$ with matrices operating on the right side of row vectors. You see, you need to make a choice between left modules and right modules when commutativity is not present.
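For instance, here is a self-contained sketch of this point (mine, added for illustration; the helper names are invented, and I use $R=M_2(\mathbb{Z})$ simply as a convenient noncommutative ring, since only a ring is needed here): with matrices acting on the right of row vectors, the ordinary product already realizes composition, i.e. $(x\cdot A)\cdot B = x\cdot(AB)$.

```python
# Illustration (not from the answer; helper names are mine): with matrices
# acting on the RIGHT of row vectors over a noncommutative ring R, the
# ORDINARY matrix product satisfies (x.A).B == x.(AB).  As R I take 2x2
# integer matrices, a convenient noncommutative ring.
import numpy as np

rng = np.random.default_rng(0)

def rand_elem():
    """A random element of R = M_2(Z)."""
    return rng.integers(-3, 4, (2, 2))

def rand_mat(rows, cols):
    """A rows x cols matrix with entries in R."""
    return [[rand_elem() for _ in range(cols)] for _ in range(rows)]

def row_action(x, A):
    # (x.A)_j = sum_i x_i a_ij : the row vector x hit by A from the right.
    # Note x -> x.A is left-linear: (s x).A = s (x.A), since s multiplies
    # each coordinate of x from the left.
    return [sum(x[i] @ A[i][j] for i in range(len(A)))
            for j in range(len(A[0]))]

def ordinary_mul(A, B):
    # the usual product (AB)_ik = sum_j a_ij b_jk, no modification needed
    return [[sum(A[i][j] @ B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

x = [rand_elem() for _ in range(2)]       # a row vector in R^2
A, B = rand_mat(2, 3), rand_mat(3, 2)

lhs = row_action(row_action(x, A), B)     # (x.A).B
rhs = row_action(x, ordinary_mul(A, B))   # x.(AB)
print(all(np.array_equal(u, v) for u, v in zip(lhs, rhs)))   # True
```

If I am not mistaken, the modified product from the question is then just the ordinary product with the entries multiplied in the opposite ring $S^{\text{op}}$, which is one way to see the usual statement that $\text{End}(_SS^n)\cong M_n(S^{\text{op}})$ when maps are written on the left.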

The definition I thought of [for an algebra over $S$] was, similarly to $F$-algebras, a ring that is a left module and whose multiplication is $S$-bilinear. And to be honest, I don't see why this doesn't work here...

Let $r,s\in S$ and $a,b\in A$, the algebra. By "$S$-bilinear" I think you mean $r(a,b)=(ra,b)=(a,rb)$. Using this definition, you'd have $(ra,sb)=r(a,sb)=rs(a,b)$ and also $(ra,sb)=s(ra,b)=sr(a,b)$. One usually also wants the algebra to satisfy $s(1,1)=(s,1)$, for example by requiring $S$ to be a subring. If that's true, then we'd have $rs=sr$ for all $r,s\in S$.

There's also the question of what $((a,b), s(c,d))$ is. You can't multiply $(a,b)$ on the right by $s$, because that's not defined. If you commute $s$ past it, you'll wind up with the commuting you see above.

It is interesting to ask what the balanced bilinear maps are (this means $(ar,b)=(a,rb)$ for all $r\in S$). Given an $(R,S)$-bimodule $M$ and an $(S,T)$-bimodule $N$, the balanced bilinear maps on $M\times N$ are precisely what one wants to talk about when discussing the tensor product $_R(M\otimes_S N)_T$ as a bimodule. You could do this for $(S,S)$-bimodules $M$ and $N$, but you would still lack some of the commutativity afforded by the ordinary definition of algebras.

rschwieb
  • 160,592
  • I have one more question: when you say "$s(1,0)=(s,0)$", didn't you rather mean "$s(1,1)=(s\cdot 1, 1)$"? Because if you multiply by $0$, the statement is trivially always true... – Tina Sep 26 '24 at 07:22
  • @Tina Yes, you're right. I will fix it! – rschwieb Sep 26 '24 at 11:21