
So, I was reviewing my first course in linear algebra and got curious about the reason behind defining the inverse of a matrix in the following way (from Wikipedia):

In linear algebra, an $n$-by-$n$ square matrix $A$ is called invertible (also nonsingular or nondegenerate) if there exists an $n$-by-$n$ square matrix $B$ such that $$ AB=BA=I $$

Now, in that course I had an exercise to prove that if $AB=I$, then $BA=I$. So what is the reason for putting both equalities in the definition? Is it just tradition, or is there some specific reason I'm not aware of?

I'd be happy if someone could help me out.

Thanks in advance!

  • It is a priori possible that only a one-sided inverse exists, or that the inverse is somehow restricted. – Integrand Aug 04 '20 at 13:01
  • It is not bad to mention the important property that the matrices commute in this case (which is not true in general). Of course, "$AB=I$" would already be sufficient. – Peter Aug 04 '20 at 13:06
  • For square matrices, the existence of a right (left) inverse implies the existence of a left (right) inverse, and the two inverses are the same. – Vincenzo Tibullo Aug 04 '20 at 13:08
  • There are rings where left inverses may not be right inverses. But in matrix rings over a commutative ring, they do happen to be. – Angina Seng Aug 04 '20 at 13:30

2 Answers


We want to require $AB=I$ and $BA=I$ for any linear operators $A$ and $B$. The second is redundant for finite-dimensional spaces but not in general.

Say $V$ is the space of all one-sided sequences $x=(x_1,x_2,\dots)$. Define $A,B:V\to V$ by $$Ax=(x_2,x_3,\dots),$$ $$Bx=(0,x_1,x_2,\dots).$$Then $AB=I$ but $BA\ne I$.
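The two shift operators above can be sketched concretely in code. Here is a minimal illustration (my own, not part of the original answer), representing a sequence with finitely many nonzero terms by the list of its leading entries:

```python
def A(x):
    """Left shift: (x1, x2, x3, ...) -> (x2, x3, ...)."""
    return x[1:]

def B(x):
    """Right shift: (x1, x2, ...) -> (0, x1, x2, ...)."""
    return [0] + x

x = [1, 2, 3]
print(A(B(x)))  # [1, 2, 3]  -- AB = I: B loses no information
print(B(A(x)))  # [0, 2, 3]  -- BA != I: the first entry is destroyed
```

The asymmetry is visible immediately: $B$ is injective but not surjective, $A$ is surjective but not injective, which is impossible for a linear map on a finite-dimensional space.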

So: the condition $BA=I$ is genuinely needed in the infinite-dimensional case, and it is kept in the finite-dimensional definition as well so that the definition reads the same in every vector space.

quid
  • I have had only a basic introduction to linear algebra, and I'm not comfortable enough with it to understand your answer. For instance, I don't know which linear operators you are referring to in your first sentence. It would be helpful if you could attach some readings so I can better appreciate your answer. Thanks! – Abhinav Dhawan Aug 04 '20 at 13:48
  • @AbhinavDhawan "reading" would be a book on general linear algebra. Or you could just tell us what part of the answer you don't follow... – David C. Ullrich Aug 04 '20 at 13:56
  • I understood the answer and it solves my problem. I have some minor queries. First, can we write $A$ and $B$ explicitly as matrices (obviously with infinitely many entries) when dealing with infinite-dimensional vector spaces? In this example I can find $A$ and $B$ as matrices easily, but does this work for any linear operator? Second, in your example, $A$ and $B$ would not be square matrices in the usual sense. I am not sure how to define square matrices in infinite dimensions (maybe requiring the row and column index sets to have the same cardinality works). So, for square matrices in that sense, will $AB=I \Rightarrow BA=I$? – Abhinav Dhawan Aug 05 '20 at 01:35
  • @AbhinavDhawan They may be useful just to give you a sense of security if you're only used to the finite-dimensional case, but "officially" you should forget about infinitely large matrices; they're not going to work out nearly as well as you'd hope, and no, they can't be used to represent an arbitrary linear map. – David C. Ullrich Aug 05 '20 at 13:19
  • Okay, thanks. This cleared up a lot! – Abhinav Dhawan Aug 05 '20 at 13:48

Even for groups, I have seen the inverse defined by $ab=ba=e$, i.e., requiring both a left and a right inverse. This is quite convenient, and not every definition has to be "minimal".

Of course, in a group a left inverse is automatically also a right inverse, so it would be enough to require only, say, a left inverse – see for example here:

Any set with Associativity, Left Identity, Left Inverse, is a Group. - Fraleigh p.49 4.38

Right identity and Right inverse implies a group

In particular, this holds for the group $G=GL_n(K)$.
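For completeness, here is a sketch of the standard argument behind the linked results (assuming associativity, a left identity $e$ with $ea=a$ for all $a$, and for each $a$ a left inverse $a'$ with $a'a=e$): a left inverse is automatically a right inverse, since

$$aa' = e(aa') = \big((a')'a'\big)(aa') = (a')'\big((a'a)a'\big) = (a')'(ea') = (a')'a' = e,$$

where $(a')'$ denotes a left inverse of $a'$.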

Dietrich Burde