
In a proof I'm reading, the author says "As $A$ is finite dimensional, a descending chain of left ideals must stabilize."

The context is that $A$ is a finite-dimensional simple $k$-algebra, i.e. it contains no nontrivial two-sided ideals.

Could somebody explain why being finite dimensional means that $A$ contains a minimal left ideal? I feel like I'm missing something obvious.

Thanks for any replies.

Lammey
  • Hint: Those left ideals are also vector spaces over $k$. And the largest left ideal, $A$ itself, is finite dimensional to begin with. The dimensions of the ideals in the sequence thus form a descending sequence of... – Jyrki Lahtonen May 02 '14 at 15:50
  • @Jyrki So you're saying that if $\dots \subseteq L_2 \subseteq L_1 \subseteq A$ are left ideals, then $L_{i+1} \subset L_i$ (proper subset) if and only if $L_{i+1}$'s dimension as a $k$-vector space is less than $L_i$'s. Therefore either the chain stabilizes at one of the $L_i$'s or at $k$. Is this correct? I might have misinterpreted it! – Lammey May 02 '14 at 16:44
  • About right. I would rather say that the chain stabilizes at one of the $L_i$s or at $0$. After all, $k$ is not a left ideal. – Jyrki Lahtonen May 02 '14 at 17:32
  • Ah yeah ok yeah my bad! Thanks for the help – Lammey May 02 '14 at 17:55
  • Don't worry. We all have blind spots occasionally. – Jyrki Lahtonen May 02 '14 at 19:19
  • Related, but I guess not really a duplicate of this: https://math.stackexchange.com/q/3083920/29335 – rschwieb Jan 23 '19 at 15:41
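For reference, the dimension argument sketched in the comments can be written out in full (this is my summary of the hint, not an answer posted in the thread):

```latex
% Sketch of the dimension argument from the comments above.
% Let $L_1 \supseteq L_2 \supseteq L_3 \supseteq \cdots$ be a descending
% chain of left ideals of $A$. Each $L_i$ is a $k$-subspace of $A$, so
\[
  \dim_k A \;\geq\; \dim_k L_1 \;\geq\; \dim_k L_2 \;\geq\; \cdots \;\geq\; 0
\]
% is a non-increasing sequence of non-negative integers, hence eventually
% constant. Since a proper inclusion $L_{i+1} \subsetneq L_i$ of subspaces
% forces $\dim_k L_{i+1} < \dim_k L_i$, the chain itself must stabilize.
%
% For the existence of a minimal left ideal: among all nonzero left ideals
% of $A$, pick one, say $L$, of smallest $k$-dimension. Any nonzero left
% ideal $L' \subseteq L$ has $\dim_k L' \leq \dim_k L$, so by minimality
% $\dim_k L' = \dim_k L$ and therefore $L' = L$. Hence $L$ is minimal.
```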
