3

I'm thinking about the isomorphism theorems in linear algebra, and I can't come up with a result that relies on them. What are some applications of these theorems? I don't quite understand why they're important.


Theorem (The first isomorphism theorem). Let $\tau: V \rightarrow W$ be a linear transformation. Then the linear transformation $\tau^{\prime}: V / \operatorname{ker}(\tau) \rightarrow W$ defined by $$ \tau^{\prime}(v+\operatorname{ker}(\tau))=\tau(v) $$ is injective, and $$ \frac{V}{\operatorname{ker}(\tau)} \approx \operatorname{im}(\tau). $$

Theorem (The second isomorphism theorem). Let $V$ be a vector space and let $S$ and $T$ be subspaces of $V$. Then $$ \frac{S+T}{T} \approx \frac{S}{S \cap T}. $$

random0620
  • 3,325
@MarianoSuárez-Álvarez "a bit over the head of anyone who's encountered them for the first time" does not answer the question and is a bit condescending – random0620 Jan 22 '24 at 01:16

3 Answers

5

The first isomorphism theorem implies the rank-nullity theorem (spelled out just after the theorem below), which gives a super nice corollary about linear transformations between finite-dimensional spaces of equal dimension:

Theorem.

Let $V,W$ be vector spaces of equal finite dimension over a field $\Bbb{F}$. For any linear transformation $T:V\to W$, the following statements are equivalent:

  • $T$ is injective
  • $T$ is surjective
  • $T$ is bijective (and hence a linear isomorphism).
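To connect the dots (a short derivation added here; it uses the fact, discussed in the comments below, that $\dim(V/K)=\dim V-\dim K$ for a subspace $K$ of a finite-dimensional space): taking dimensions in $V/\operatorname{ker}(T)\approx\operatorname{im}(T)$ gives rank-nullity, and rank-nullity gives the equivalence. Explicitly, if $\dim V=\dim W=n$, then $$ \dim\operatorname{im}(T)=\dim\left(\frac{V}{\operatorname{ker}(T)}\right)=n-\dim\operatorname{ker}(T), $$ so $T$ is injective ($\dim\operatorname{ker}(T)=0$) exactly when $\dim\operatorname{im}(T)=n$, i.e. exactly when $T$ is surjective.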

This is a fundamental fact about linear transformations which is not true for other types of functions (for instance, $x\mapsto e^x$ is an injective but not surjective function $\mathbb{R}\to\mathbb{R}$), nor for linear maps between spaces of different or infinite dimension. It can be phrased in terms of matrices as well, in a variety of equivalent ways.

One of the nice consequences is that to check invertibility, you only need a one-sided inverse; the other follows immediately. So, for example, if $A,B$ are $n\times n$ matrices, then $AB=I$ automatically implies $BA=I$ and hence $A=B^{-1}$ and $B=A^{-1}$. Note very carefully that in general, for two functions, $f\circ g=\text{id}$ does not imply $g\circ f=\text{id}$.
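To spell out the reasoning, with a counterexample of my own for the non-square case: $AB=I$ says $x\mapsto Ax$ is surjective, since $A(Bx)=x$ for every $x$; by the equivalence above it is bijective, and multiplying $AB=I$ on the left by $A^{-1}$ gives $B=A^{-1}$, hence $BA=I$. The equal-dimension (square) hypothesis is essential: $$ A=\begin{pmatrix}1&0\end{pmatrix},\quad B=\begin{pmatrix}1\\0\end{pmatrix},\qquad AB=\begin{pmatrix}1\end{pmatrix}=I_1,\quad BA=\begin{pmatrix}1&0\\0&0\end{pmatrix}\neq I_2. $$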

This is of course a simple consequence, but it is used repeatedly.

Also, the rank-nullity theorem itself is super important in linear algebra: it precisely and concisely formalizes the intuitive notion that if you have $m$ equations in $n$ unknowns and only $r$ of those equations are ‘independent’, then you are left with $n-r$ many “free” variables. Or more geometrically, the solution set of $m$ many linear equations of which $r$ are independent is an $(n-r)$-dimensional solution space. As a super concrete example, a scalar equation \begin{align} a_1x_1+\dots +a_nx_n&=0, \end{align} where not all the $a_i$’s are $0$, has an $(n-1)$-dimensional hyperplane as its solution set.
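To make the last example fully concrete (numbers mine): the map $T\colon\mathbb{R}^3\to\mathbb{R}$ given by $T(x_1,x_2,x_3)=x_1+x_2+x_3$ is linear and surjective, so rank-nullity gives $\dim\operatorname{ker}(T)=3-1=2$; the solution set of $$ x_1+x_2+x_3=0 $$ is the plane through the origin spanned by $(1,-1,0)$ and $(0,1,-1)$.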

Of course, there are tons of other examples of the utility of the first isomorphism theorem, but this I think is the ‘most immediately important’ consequence.

peek-a-boo
  • 65,833
granted, one doesn’t need to use the first isomorphism theorem to prove rank-nullity, but still, I think that’s a nice ‘conceptual’ proof. – peek-a-boo Jan 22 '24 at 04:30
  • Are you proving rank-nullity by multiplying both sides by $\operatorname{ker}(\tau)$? – Vivaan Daga Jan 22 '24 at 08:41
@MarianoSuárez-Álvarez Of course I mean taking the direct product. The issue is that for that you need that vector spaces split. So really what you need is that $\dim(V/H) = \dim V - \dim H$. – Vivaan Daga Jan 22 '24 at 09:54
@VivaanDaga Yeah, essentially. Take a basis for the subspace, extend it to the whole space, and then see that the projections of the last few vectors form a basis for the quotient. – peek-a-boo Jan 22 '24 at 18:12
    Similarly, the second isomorphism theorem gives an inclusion-exclusion type equation $\dim(S+T) = \dim(S) + \dim(T) - \dim(S \cap T)$. As a simple example, that makes it easy to prove that two 2-dimensional subspaces $S, T$ of $\mathbb{R}^3$ must have nontrivial intersection, since $\dim(S+T) \le 3$ so $\dim(S \cap T) \ge 1$. – Daniel Schepler Jan 22 '24 at 20:13
3

A simple but instructive application is just that the isomorphism theorems can save you some tedious work (this is true for the isomorphism theorems in all contexts, not just in the vector space setting). Suppose you are given a vector space $V$ with a subspace $W$, and you wish to describe $V / W$ in more familiar terms.

Often you can make an educated guess about a vector space $Z$ such that $V / W \cong Z$. But proving that the isomorphism holds may be tedious as you have to set up an explicit isomorphism $V / W \to Z$ and verify that it is well defined, i.e. not dependent on choice of coset representative.

Instead, it's typically easier to guess a surjective linear map $\phi \colon V \to Z$ such that $\ker \phi = W$, then the first isomorphism theorem does the work for you.

To illustrate, let $V = \mathbb{R}^2$, $W = \{(x,0) \mid x \in \mathbb{R}\}$. We might guess $V / W \cong \mathbb{R}$. To verify that, define $\phi \colon V \to \mathbb{R}$ by $\phi(x,y) = y$. It's clear that $\phi$ is linear and surjective, and $\ker \phi = W$. So without further ado, the first isomorphism theorem assures us that $V/W \cong \mathbb{R}$ without any tedious checking of well-definedness.

That's a very simple example, but you can play around with others to convince yourself that this kind of use of the isomorphism theorems can save you a lot of effort.
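Here is one more in the same spirit to try (an example I'm adding, not from the answer above): let $V=M_n(\mathbb{R})$ be the space of $n\times n$ real matrices and $W=\{A\in V\mid\operatorname{tr}(A)=0\}$ the subspace of trace-zero matrices. The trace map does all the work: $$ \operatorname{tr}\colon M_n(\mathbb{R})\to\mathbb{R} $$ is linear and surjective with $\ker(\operatorname{tr})=W$, so the first isomorphism theorem gives $M_n(\mathbb{R})/W\cong\mathbb{R}$ with no coset bookkeeping.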

David M
  • 1,204
Basically, the first isomorphism theorem allows you to prove many results without needing to check whether the objects you’re working with satisfy the conclusions of the first isomorphism theorem. – Divide1918 Jan 22 '24 at 05:17
2

The first one I can think of is this:

Let $A$ and $B$ be two linear subspaces of $E$ such that $A \oplus B = E$. Then $$ E/A \simeq B. $$
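A quick proof sketch via the first isomorphism theorem (added for completeness): every $v\in E$ decomposes uniquely as $v=a+b$ with $a\in A$, $b\in B$, so the projection onto $B$ along $A$, $$ \pi\colon E\to B,\qquad\pi(a+b)=b, $$ is well defined, linear, and surjective, with $\ker(\pi)=A$; hence $E/A\simeq B$.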

Adam
  • 88
  • https://math.stackexchange.com/questions/1696267/intuition-about-the-first-isomorphism-theorem?rq=1 and https://math.stackexchange.com/questions/1738334/intuition-about-the-second-isomorphism-theorem?rq=1 might be helpful to you. – Adam Jan 21 '24 at 23:45
  • Thanks this is a great example! – random0620 Jan 22 '24 at 01:16