Recently I discovered that many of the basic results of linear algebra generalise to modules over a division ring, instead of modules over a field (i.e. vector spaces).
Because I don't know a good source that treats this topic in detail, and out of the excitement of figuring out the results on my own, I tried to develop the basic results myself.
From now on let $S$ be such a division ring and $V$ an $S$-module (where $S$-module means left $S$-module, just as $S$-algebra means left $S$-algebra). I started with the interesting question of classification and found: Zorn's lemma can be used just as it is in linear algebra, so $V$ is free; the dimension is an invariant; and being a maximal linearly independent set, a minimal generating set, or a basis are all equivalent.
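The key step, which works verbatim over a division ring, can be sketched as follows (a standard argument, written out here for completeness; note that only the existence of inverses in $S$ is used, never commutativity):

```latex
\begin{proof}[Sketch]
Let $M \subseteq V$ be maximal among linearly independent sets
(such an $M$ exists by Zorn's lemma, since the union of a chain of
linearly independent sets is again linearly independent).
If some $v \in V$ were not in $\operatorname{span}(M)$, then
$M \cup \{v\}$ would still be linearly independent: any relation
$s v + \sum_i s_i m_i = 0$ with $s \neq 0$ gives
$v = -s^{-1} \sum_i s_i m_i \in \operatorname{span}(M)$,
using only that $s$ is invertible in $S$.
This contradicts maximality, so $M$ spans $V$ and is a basis.
\end{proof}
```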
I continued with the question "Matrices = linear maps?" and found that this is also true (more precisely, $\text{End}_S(S^n) \cong S^{n\times n}$ as $S$-algebras), but you have to define matrix multiplication slightly differently:
Let $f: S^n \to S^m$ be $S$-linear and let $e_i$ denote the standard basis vectors; the representing matrix $A_f$ is defined as $A_f := (f(e_1),\dots, f(e_n))$. Note that this is actually a matrix, since $f(e_i) \in S^m$. The matrix-vector multiplication now has to be $$A\cdot x= \begin{pmatrix} \sum_{j=1}^n x_j a_{1j} \\ \vdots \\ \sum_{j=1}^n x_j a_{mj} \end{pmatrix},$$ which can differ from the standard matrix-vector product, since $S$ does not have to be commutative. This leads to the matrix-matrix multiplication: for $A\in S^{m\times n}$, $B\in S^{k\times m}$, $$B\cdot A := (B(Ae_1),\dots, B(Ae_n))=\begin{pmatrix} \sum_{l=1}^m a_{l1}b_{1l} & \cdots & \sum_{l=1}^m a_{ln}b_{1l} \\ \vdots && \vdots \\ \sum_{l=1}^m a_{l1}b_{kl} & \cdots & \sum_{l=1}^m a_{ln}b_{kl}\end{pmatrix}$$
I.e. the $(ij)$-th entry $$(BA)_{ij} = \sum_{l=1}^ma_{lj}b_{il}$$
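To sanity-check this over a concrete noncommutative division ring, here is a small Python sketch over the quaternions (the class and helper names are mine, not from any library). It verifies that the twisted product $(BA)_{ij} = \sum_l a_{lj}b_{il}$ matches composition of the left-linear maps, while the standard product does not:

```python
# Minimal quaternions: a noncommutative division ring (i*j = k, j*i = -k).
class Q:
    def __init__(self, a, b, c, d): self.v = (a, b, c, d)
    def __add__(self, o): return Q(*(x + y for x, y in zip(self.v, o.v)))
    def __mul__(self, o):
        a1, b1, c1, d1 = self.v; a2, b2, c2, d2 = o.v
        return Q(a1*a2 - b1*b2 - c1*c2 - d1*d2,
                 a1*b2 + b1*a2 + c1*d2 - d1*c2,
                 a1*c2 - b1*d2 + c1*a2 + d1*b2,
                 a1*d2 + b1*c2 - c1*b2 + d1*a2)
    def __eq__(self, o): return self.v == o.v
    def __repr__(self): return str(self.v)

zero, one = Q(0, 0, 0, 0), Q(1, 0, 0, 0)
i, j = Q(0, 1, 0, 0), Q(0, 0, 1, 0)

def apply(A, x):
    # (A x)_i = sum_j x_j * a_{ij}   (coefficients act from the left)
    m, n = len(A), len(A[0])
    return [sum((x[jj] * A[ii][jj] for jj in range(n)), zero) for ii in range(m)]

def twisted(B, A):
    # (BA)_{ij} = sum_l a_{lj} * b_{il}   (the multiplication defined above)
    k, m, n = len(B), len(A), len(A[0])
    return [[sum((A[l][jj] * B[ii][l] for l in range(m)), zero)
             for jj in range(n)] for ii in range(k)]

def standard(B, A):
    # usual (BA)_{ij} = sum_l b_{il} * a_{lj}
    k, m, n = len(B), len(A), len(A[0])
    return [[sum((B[ii][l] * A[l][jj] for l in range(m)), zero)
             for jj in range(n)] for ii in range(k)]

A, B, x = [[i]], [[j]], [one]          # 1x1 matrices over the quaternions
comp = apply(B, apply(A, x))           # the composition B∘A applied to x

print(apply(twisted(B, A), x) == comp)   # True: twisted product = composition
print(apply(standard(B, A), x) == comp)  # False: standard product disagrees
```

Already in the $1\times 1$ case the two products differ, because $ij = k$ but $ji = -k$.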
Now I get the beautiful linear algebra results $$f \in \text{Aut}_S(S^n) \iff A_f \text{ is invertible} \iff (Ae_1,\dots, Ae_n) \text{ is a basis / lin. indep. / a generating set},$$ left inverse $=$ right inverse, and so on.
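For completeness, the "left inverse $=$ right inverse" claim follows from a dimension count, which survives over a division ring because dimension is well-defined there; a sketch:

```latex
\[
BA = I_n
\;\Longrightarrow\; x \mapsto Ax \text{ is injective}
\;\Longrightarrow\; \dim A(S^n) = n
\;\Longrightarrow\; A(S^n) = S^n,
\]
so $x \mapsto Ax$ is bijective with some inverse $C$, i.e.\ $CA = AC = I_n$.
Then $B = B(AC) = (BA)C = C$, hence $AB = I_n$.
```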
Now the question:
The linked Wikipedia article mentions that left invertibility is not equivalent to right invertibility in general, and likewise the concrete left and right inverse matrices can differ. But somehow it seems to use the same idea I did, or at least it isn't precise enough. While browsing for an answer I found different results on this topic, so I hope someone experienced can help me out if I formulate exactly what I mean.
Did I make a mistake somewhere in this process (probably with the matrix multiplication / composition of $S$-linear functions)? Because I can't find any. Or is it as I guess: they mean a different matrix multiplication, the canonical one, which usually does not give an $S$-linear map, and things get a lot more complicated if we don't adapt the matrix multiplication to this purpose.
Any help, idea or source/book would be highly appreciated.
By the way, I think what you wrote would be a good answer. It doesn't totally answer the question, but it helped me, and it's the most useful response I got in the process of solving it.
– Tina Sep 25 '24 at 18:07