
This question is from Hungerford's Algebra.

Let $R$ be a ring with identity and $S$ the ring of all $n\times n$ matrices over $R$. Then $J$ is an ideal of $S$ iff $J$ is the ring of all $n\times n$ matrices over $I$ for some ideal $I$ in $R$. [Hint: Given $J$, let $I$ be the set of all those elements of $R$ that appear as the row $1$-column $1$ entry of some matrix in $J$. Use the matrices $E_{r,s}$, where $1\le r\le n$, $1\le s \le n$, and $E_{r,s}$ has $1_R$ as the row $r$-column $s$ entry and $0$ elsewhere. Observe that for a matrix $A=(a_{ij})$, $E_{p,r}AE_{s,q}$ is the matrix with $a_{r,s}$ in the row $p$-column $q$ entry and $0$ elsewhere.]

Here's my attempt:

Let $I=\{x\in R \mid a_{11}=x \ \text{for some} \ A=(a_{ij})\in J \}$. Suppose $a,b \in I$. Then $a=a_{11}$ for some $A=(a_{ij}) \in J$ and $b=b_{11}$ for some $B=(b_{ij})\in J$. Then $A-B \in J$ since $J$ is an ideal, so $(A-B)_{11}=a-b\in I$.

I'm having trouble proving the other condition.

Let $r \in R$ and $a \in I$. Let $R'$ denote the matrix in $S$ that has $r$ as the row $1$-column $1$ entry and $0$ everywhere else, and let $A$ denote a matrix in $J$ for which $a=a_{11}$. Then $R'A$ has $ra$ as its row $1$-column $1$ entry, and it must be in $J$ since $J$ is an ideal, so $ra \in I$.
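As a quick sanity check on the entry computation (just a symbolic experiment, not a proof), here is what sympy gives with the entries declared noncommutative, since $R$ need not be commutative:

```python
from sympy import Matrix, Symbol, zeros

n = 3
# generic matrix A = (a_ij) with noncommutative symbolic entries
A = Matrix([[Symbol(f'a{i}{j}', commutative=False) for j in range(1, n + 1)]
            for i in range(1, n + 1)])
r = Symbol('r', commutative=False)

Rprime = zeros(n, n)
Rprime[0, 0] = r                  # R' has r in the row 1-column 1 entry

print((Rprime * A)[0, 0])         # r*a11, i.e. ra sits in the corner of R'A
print((A * Rprime)[0, 0])         # a11*r, so the same trick handles ar as well
```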

Is this correct? It can't be because I haven't used the fact that $R$ has an identity. Also, where do I use the hint?

Any hints, ideas are greatly appreciated.

user264745

2 Answers


There are really two things to prove:

(1) Given an ideal $I \subseteq R$, the subset $M_n(I) \subseteq M_n(R)$ is also an ideal.

(2) Given an ideal $J \subseteq S = M_n(R)$, we have $J = M_n(I)$ for some ideal $I \subseteq R$.

In these statements, ideal means two-sided ideal.

There is really a map $$ \begin{align} \{ \text{ideals of } R \} \quad & \overset{F}{\to} \quad \{ \text{ideals of } S \} \\ I \quad &\mapsto \quad M_n(I) = \{ (a_{ij}) \mid a_{ij} \in I \text{ for all } i,j \}. \end{align} $$ Proving (1) is equivalent to showing that this map is well-defined, while proving (2) is equivalent to showing that this map is onto. You actually have to show both of these, but (1) is more-or-less a straightforward application of the definition of ideals.

For (2), you have the right idea, but it's incomplete. First of all, the inclusion $$ \{ a_{11} \mid (a_{ij}) \in J \} \hookrightarrow \{ a_{ij} \mid (a_{ij}) \in J \} $$ is actually a bijection, provided that $J$ is an ideal. With the matrix units $E_{ij} \in S$ and any matrix $A = (a_{ij}) \in J$, $$ E_{1i} A E_{j1} = a_{ij}E_{11}, $$ which is a matrix of all zeroes except for the $(1, 1)$-entry, which has the value $a_{ij}$. (This is a special case of Hungerford's hint.) The consequence is that if $x$ is any entry of a matrix in $J$, then it is also specifically the $(1, 1)$-entry of a (possibly different) matrix in $J$.
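If it helps to see this concretely, here is a small sympy check of the matrix unit computation (the entries are noncommutative symbols, since $R$ need not be commutative):

```python
from sympy import Matrix, Symbol, zeros

n = 3
A = Matrix([[Symbol(f'a{i}{j}', commutative=False) for j in range(1, n + 1)]
            for i in range(1, n + 1)])

def E(r, s):
    """Matrix unit: 1_R in row r, column s (1-indexed), 0 elsewhere."""
    M = zeros(n, n)
    M[r - 1, s - 1] = 1
    return M

# E_{1i} A E_{j1} moves a_{ij} to the (1,1) corner
print(E(1, 2) * A * E(3, 1))   # a23 in the (1,1) entry, zeros elsewhere
# Hungerford's general hint: E_{p,r} A E_{s,q} has a_{rs} in the (p,q) entry
print(E(2, 1) * A * E(3, 2))   # a13 in the (2,2) entry, zeros elsewhere
```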

You lose nothing by only looking at the corner entry! (I must emphasize that this is demonstrably false if $J$ is not a $2$-sided ideal, since then the matrix unit "trick" is not available.) And you gain a more elegant way to find preimages of the map $F$ above.

Now, given an ideal $J \subseteq S$, define $$ I = \{ a_{11} \mid (a_{ij}) \in J \}. $$ You must show that this is an ideal. You have correctly argued that it's closed under subtraction, so it's an additive subgroup. To show that it's closed under scalars from $R$, let $x \in I$ and $r,r' \in R$.

In order to show that $rxr' \in I$, let $(a_{ij}) \in J$ be such that $x = a_{11}$. Let $(\delta_{ij})$ denote the $n \times n$ identity matrix in $S$. Then the scalar multiple $r(\delta_{ij}) = (r\delta_{ij})$ has $(1, 1)$-entry equal to $r$, and analogously, $(r'\delta_{ij})$ has $(1, 1)$-entry $r'$. Furthermore, $$ (r a_{ij} r') = (r\delta_{ij}) (a_{ij}) (r'\delta_{ij}) \in SJS = J, $$ and so its $(1, 1)$-entry, namely $r x r'$, is in $I$.
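The same kind of sympy check confirms the corner entry of this product; I build the scalar matrices explicitly so that the noncommutative order of the factors is preserved:

```python
from sympy import Matrix, Symbol

n = 3
A = Matrix([[Symbol(f'a{i}{j}', commutative=False) for j in range(1, n + 1)]
            for i in range(1, n + 1)])
r  = Symbol('r',  commutative=False)
rp = Symbol('rp', commutative=False)   # rp stands for r'

# the scalar matrices (r delta_ij) and (r' delta_ij)
rI  = Matrix(n, n, lambda i, j: r  if i == j else 0)
rpI = Matrix(n, n, lambda i, j: rp if i == j else 0)

print((rI * A * rpI)[0, 0])   # r*a11*rp, i.e. the (1,1) entry is r x r'
```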

Sammy Black

Proof: $(\Rightarrow)$ Suppose $J$ is an ideal of $S$. Define the maps $$\begin{align} \pi_{ij}: J &\longrightarrow R\\(a_{kl})&\longmapsto a_{ij} \end{align}$$ and set $I=\bigcup_{1\leq i,j\leq n}\pi_{ij}(J)$. Let $a,b\in I$. Then $a\in \pi_{pq}(J)$ and $b\in \pi_{rs}(J)$ for some indices $p,q,r,s$, so $\exists X,Y\in J$ such that $\pi_{pq}(X)=a$ and $\pi_{rs}(Y)=b$.

Let $U$ be the matrix obtained by interchanging the $p$th and $r$th rows of the identity matrix $I_n$, and let $V$ be the matrix obtained by interchanging the $q$th and $s$th rows of $I_n$. Observe that if $A\in S$, then $UA$ is the matrix obtained by interchanging the $p$th and $r$th rows of $A$; similarly, $AV$ is the matrix obtained by interchanging the $q$th and $s$th columns of $A$. Since $J$ is an ideal in $S$, we have $UXV\in J$ and $UXV+Y\in J$. Clearly $\pi_{rs}(UXV)=a$. Thus $a+b= \pi_{rs}(UXV+Y)\in \pi_{rs}(J)\subseteq I$, so $a+b\in I$.
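Here is a quick sympy illustration of the row/column swap (purely a sanity check of the bookkeeping; the indices below are one concrete choice):

```python
from sympy import Matrix, Symbol, eye

n = 4
X = Matrix([[Symbol(f'x{i}{j}', commutative=False) for j in range(1, n + 1)]
            for i in range(1, n + 1)])

def swap(k, l):
    """Identity matrix with rows k and l (1-indexed) interchanged."""
    M = eye(n)
    M.row_swap(k - 1, l - 1)
    return M

p, q, r, s = 1, 3, 2, 4           # pretend a = x_{pq} = x13
U, V = swap(p, r), swap(q, s)
print((U * X * V)[r - 1, s - 1])  # x13: the (p,q) entry has moved to (r,s)
```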

Since $J$ is an ideal in $S$, we have $(-I_n)Y\in J$, and clearly $\pi_{rs}[(-I_n)Y]=-1_Rb=-b$. Thus $-b\in I$. It is easy to check that $r'a, ar'\in I$ for every $r'\in R$ (multiply $X$ on the left or right by the scalar matrix $r'I_n$). Therefore $I$ is an ideal of $R$.

Now we show $M_n(I)=J$. The inclusion $J\subseteq M_n(I)$ is trivial. Let $A=(a_{ij})\in M_n(I)$. Then for all $i,j\in \{1,\dots,n\}$ there exists $X_{ij}\in J$ such that $\pi_{ij}(X_{ij})=a_{ij}$. Let $E_{ii}$ be the matrix with $(i,i)$ entry $1_R$ and zero everywhere else. Note that $E_{ii}X_{ij}$ is the matrix whose $i$th row is the $i$th row of $X_{ij}$ and whose other rows are zero, and $(E_{ii}X_{ij})E_{jj}$ is the matrix with $(i,j)$ entry $a_{ij}$ and zero everywhere else, i.e. it equals $a_{ij}E_{ij}$. Since $J$ is an ideal, we have $E_{ii}X_{ij}E_{jj}\in J$, and hence $$\sum_{1\leq i,j\leq n} E_{ii}X_{ij}E_{jj}=A\in J.$$ Thus $M_n(I)\subseteq J$, and so $M_n(I)=J$. $(\Leftarrow)$ This direction is a routine check from the definitions.
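The reassembly step can also be checked symbolically; below, each $X_{ij}$ is a completely generic matrix playing the role of an element of $J$ whose $(i,j)$ entry is the one we want:

```python
from sympy import Matrix, Symbol, zeros

n = 2
# one generic "witness" matrix X_{ij} for every position (i, j)
Xs = {(i, j): Matrix([[Symbol(f'x{i}{j}_{k}{l}', commutative=False)
                       for l in range(1, n + 1)] for k in range(1, n + 1)])
      for i in range(1, n + 1) for j in range(1, n + 1)}

def E(r, s):
    M = zeros(n, n)
    M[r - 1, s - 1] = 1
    return M

total = zeros(n, n)
for (i, j), X in Xs.items():
    total += E(i, i) * X * E(j, j)   # keeps only the (i, j) entry of X_{ij}

print(total)   # Matrix([[x11_11, x12_12], [x21_21, x22_22]])
```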


When we take $I=\bigcup_{1\leq i,j\leq n}\pi_{ij}(J)$ instead of $\{a_{11}\mid (a_{ij})\in J\}$, we get $J\subseteq M_n(I)$ for free, but the trade-off is that showing $I$ is closed under addition becomes relatively hard. I introduce the matrices $U$ and $V$ because such matrices are used many times in building up the notion of the determinant. We don't really need the general fact $E_{pr}AE_{sq}=a_{rs}E_{pq}$ given in the hint.

user264745