
Notation.

  • $\mathcal{L}(V,W)$ is the set of linear maps from a vector space $V$ to a vector space $W$.
  • $\mathbf{F}^{m,n}$ is the set of $m$-by-$n$ matrices whose entries are either real or complex numbers.

Result 3.72.

Suppose $V$ and $W$ are finite-dimensional. Then $\mathcal{L}(V,W)$ is finite-dimensional and $$\text{dim} \ \mathcal{L}(V,W) = (\text{dim} \ V)(\text{dim} \ W)$$


Source.

Linear Algebra Done Right, Sheldon Axler, 4th edition.


Question.

It's relatively easy to show $\text{dim} \ \mathcal{L}(V,W) = (\text{dim} \ V)(\text{dim} \ W)$ by using a previous result that there's an isomorphism between $\mathcal{L}(V,W)$ and $\mathbf{F}^{m,n}$, where $\text{dim} \ \mathbf{F}^{m,n} = mn$. But I don't think that shows $\mathcal{L}(V,W)$ is finite-dimensional.

Is the isomorphism between $\mathcal{L}(V,W)$ and $\mathbf{F}^{m,n}$ enough to justify that $\mathcal{L}(V,W)$ is finite-dimensional?

Or do I need to show it's finite-dimensional by finding a list of linear maps that spans $\mathcal{L}(V,W)$? I have a feeling this is not so easy to do. That said, I have found a standard basis of $\mathbf{F}^{m,n}$ before; is the strategy to define linear maps in $\mathcal{L}(V,W)$ on a basis of $V$ so that they correspond to the standard basis of $\mathbf{F}^{m,n}$?

Paul Ash
  • If $V_1$ and $V_2$ are isomorphic and $V_1$ has dimension $n$, how many linearly independent vectors can $V_2$ have? – Kavi Rama Murthy Jan 29 '24 at 23:11
  • Vector spaces are isomorphic if and only if they have the same dimension; see here and here – J. W. Tanner Jan 29 '24 at 23:12
  • @J.W.Tanner the way that result is stated in Axler's text is: "Two finite-dimensional vector spaces over $\mathbf{F}$ are isomorphic if and only if they have the same dimension." It already assumes they are both finite-dimensional. – Paul Ash Jan 29 '24 at 23:17
  • @geetha290krm I'm assuming at most $n$, but that's what I want to show, isn't it? – Paul Ash Jan 29 '24 at 23:19
  • In Exercise 3.7, Axler asks you to show that if $v_1,\ldots,v_n$ span $V$ and $T\colon V\to W$ is linear and surjective, then $T(v_1),\ldots,T(v_n)$ span $W$. Axler defines "finite dimensional" as "there is a [finite] list of vectors that spans the space" (Chapter 2, Section "Span and Linear Independence.") These two show that if there is an isomorphism between $V$ and $W$, and at least one of them is finite dimensional, then they both are. – Arturo Magidin Jan 29 '24 at 23:31
  • @ArturoMagidin that sounds familiar to me, but I can't find that exercise in the book; there's a few sections in chapter 3. That being said, what you wrote makes sense to me, so thank you! – Paul Ash Jan 29 '24 at 23:44
  • I'm looking at the second edition, so your mileage may vary for a different edition. – Arturo Magidin Jan 29 '24 at 23:46

Answer.


Yes, isomorphic vector spaces have the same dimension (finite or infinite; for infinite, assume all relevant spaces have bases, or assume the Axiom of Choice).

Theorem. Let $V$ and $W$ be vector spaces over $\mathbf{F}$. Let $T\colon V\to W$ be a linear transformation.

  1. If $T$ is one-to-one, and $\gamma\subseteq V$ is linearly independent, then $T(\gamma)$ is linearly independent.
  2. If $T$ is surjective, and $\gamma$ spans $V$, then $T(\gamma)$ spans $W$.
  3. As a consequence, if $T$ is an isomorphism, then $\beta\subseteq V$ is a basis for $V$ if and only if $T(\beta)$ is a basis for $W$. In particular, $\dim(V)=\dim(W)$.

Note. In the second edition of Axler's book, Item 1 appears as Exercise 3.5, restricted to finite $\gamma$ but not to finite-dimensional spaces. Item 2 appears as Exercise 3.7, again restricted to finite $\gamma$ but with no a priori assumption of finite-dimensionality.

Proof. For item 1, let $x_1,\ldots,x_n\in \gamma$ be pairwise distinct, and let $\alpha_1,\ldots,\alpha_n$ be scalars such that $$\alpha_1T(x_1)+\cdots + \alpha_nT(x_n) = \mathbf{0}_W.$$ Since $T$ is linear, this means that $$\mathbf{0}_W = \alpha_1T(x_1)+\cdots + \alpha_nT(x_n) = T(\alpha_1x_1+\cdots + \alpha_nx_n).$$ Since $T$ is one-to-one, this means that $$\alpha_1x_1+\cdots + \alpha_nx_n = \mathbf{0}_V.$$ Since $\gamma$ is linearly independent, this implies that $\alpha_1=\cdots=\alpha_n=0$, as desired. Thus, $T(\gamma)$ is linearly independent. This proves 1.

For item 2, let $w\in W$. Since $T$ is surjective, there exists $v\in V$ such that $T(v)=w$. Since $\gamma$ spans $V$, there exist $v_1,\ldots,v_n\in\gamma$ and scalars $\alpha_1,\ldots,\alpha_n$ such that $v=\alpha_1v_1+\cdots+\alpha_nv_n$. Then $$w = T(v) = T(\alpha_1v_1+\cdots+\alpha_nv_n) = \alpha_1T(v_1)+\cdots+\alpha_nT(v_n)\in\mathrm{span}(T(\gamma)).$$ Thus, $W\subseteq\mathrm{span}(T(\gamma))$, proving that $T(\gamma)$ spans $W$. This proves 2.

If $\beta$ is a basis of $V$, then by item 1 we have that $T(\beta)$ is linearly independent, and by item 2 we have that $T(\beta)$ spans $W$; therefore, $T(\beta)$ is a basis for $W$. For the converse, apply the result to $T(\beta)$ and the isomorphism $T^{-1}$. Since $T$ is bijective, $|\beta|=|T(\beta)|$, so $$\dim(V) = |\beta| = |T(\beta)| = \dim(W).$$ This proves 3 and the theorem. $\Box$
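As a concrete (if informal) sanity check of items 1 and 2, one can model linear maps between $\mathbf{R}^3$ and $\mathbf{R}^4$ as matrices in coordinates and test independence and spanning via rank. This is a numeric sketch assuming NumPy, not part of the proof:

```python
import numpy as np

rng = np.random.default_rng(0)

# Model an injective T : R^3 -> R^4 as a 4x3 matrix of full column rank.
T = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [0., 0., 1.],
              [1., 1., 1.]])
assert np.linalg.matrix_rank(T) == 3      # full column rank <=> one-to-one

# Item 1: images of a linearly independent list remain independent.
gamma = rng.standard_normal((3, 3))       # columns: 3 independent vectors in R^3
assert np.linalg.matrix_rank(gamma) == 3
images = T @ gamma                        # columns: T applied to each vector
assert np.linalg.matrix_rank(images) == 3

# Item 2: model a surjective S : R^4 -> R^3 as a 3x4 matrix of full row rank;
# the images of a spanning list of R^4 then span R^3.
S = rng.standard_normal((3, 4))
assert np.linalg.matrix_rank(S) == 3      # full row rank <=> onto
spanning = np.eye(4)                      # columns span R^4
assert np.linalg.matrix_rank(S @ spanning) == 3
```

Of course, a rank computation on a sample proves nothing in general; it just makes the statements tangible.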

In fact, here is a nice exercise:

Let $T\colon V\to W$ be a linear transformation.

  1. $T$ is one-to-one if and only if for every $S\subseteq V$, if $S$ is linearly independent, then $T(S)$ is linearly independent.
  2. $T$ is surjective if and only if for every $S\subseteq V$, if $S$ spans $V$, then $T(S)$ spans $W$.
  3. $T$ is bijective if and only if for every $S\subseteq V$, $S$ is a basis for $V$ if and only if $T(S)$ is a basis for $W$.

Finding an explicit basis for $\mathcal{L}(V,W)$ when $V$ and $W$ are finite dimensional is actually not hard to do, when you think about the important result that says that if $v_1,\ldots,v_n$ is a basis for $V$, and $w_1,\ldots,w_n$ are any vectors of $W$, then there is exactly one linear transformation $T\colon V\to W$ such that $T(v_i)=w_i$.
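That uniqueness result has a concrete coordinate form: with the standard basis of $\mathbf{R}^n$, the unique $T$ with $T(v_i)=w_i$ is the matrix whose $i$-th column is $w_i$. A small sketch, assuming NumPy (the vectors $w_1,w_2$ are made up for illustration):

```python
import numpy as np

# With v_1, v_2 the standard basis of R^2, the unique linear T : R^2 -> R^3
# with T(v_i) = w_i is the 3x2 matrix whose i-th column is w_i.
w1 = np.array([1., 2., 3.])
w2 = np.array([4., 5., 6.])
T = np.column_stack([w1, w2])

e1, e2 = np.eye(2)                # rows of the identity: the standard basis
assert np.allclose(T @ e1, w1)    # T(v_1) = w_1
assert np.allclose(T @ e2, w2)    # T(v_2) = w_2

# Linearity then pins down T everywhere: T(a v_1 + b v_2) = a w_1 + b w_2.
a, b = 2.0, -1.5
assert np.allclose(T @ (a * e1 + b * e2), a * w1 + b * w2)
```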

Using this result, fix a basis $\beta=(v_1,\ldots,v_n)$ for $V$, and a basis $\gamma=(w_1,\ldots,w_m)$ for $W$. Let $\delta_{ij}\colon V\to W$ be the unique linear transformation such that $$\delta_{ij}(v_k) = \left\{\begin{array}{lcl} w_j &&\text{if }k=i;\\ 0&&\text{otherwise.} \end{array}\right.$$

A linear transformation $T\colon V\to W$ is completely determined by what it does to $v_1,\ldots,v_n$. And each $T(v_i)$ can be expressed as a vector in $\gamma$-coordinates, $[T(v_i)]_{\gamma}=(\alpha_{i1},\ldots,\alpha_{im})$. Then it is straightforward to verify that $$T = \sum_{i=1}^n\sum_{j=1}^m \alpha_{ij}\delta_{ij}$$ by comparing what the sum does to each basis vector $v_i$. Thus, the linear transformations $\delta_{ij}$ span $\mathcal{L}(V,W)$. It is now also easy to verify that they are linearly independent, since if a sum $$\sum_{i=1}^n\sum_{j=1}^m \alpha_{ij}\delta_{ij}$$ is the zero linear transformation, then evaluating it at a $v_{i_0}$ gives that $$0 = \alpha_{i_01}w_1+\cdots +\alpha_{i_0m}w_m,$$ and by the linear independence of $(w_1,\ldots,w_m)$ we conclude that $\alpha_{i_01}=\cdots=\alpha_{i_0m}=0$. This holds for each value of $i_0$, so all the coefficients are zero, as desired.
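In coordinates this computation is easy to replicate: with standard bases, $\delta_{ij}$ corresponds to the $m$-by-$n$ matrix unit with a $1$ in row $j$, column $i$, and the decomposition above can be checked numerically. A sketch assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 3, 2   # dim V = 3, dim W = 2

# With standard bases, delta_ij (sends v_i to w_j, all other v_k to 0) is the
# m x n matrix unit with a 1 in row j, column i.
def delta(i, j):
    D = np.zeros((m, n))
    D[j, i] = 1.0
    return D

# An arbitrary linear map T : R^n -> R^m, as an m x n matrix.
T = rng.standard_normal((m, n))

# alpha_ij = j-th gamma-coordinate of T(v_i) = entry (j, i) of the matrix of T.
recomposed = sum(T[j, i] * delta(i, j) for i in range(n) for j in range(m))
assert np.allclose(recomposed, T)          # the delta_ij span L(V, W)

# Independence: the n*m matrix units are linearly independent, so
# dim L(V, W) = n * m = (dim V)(dim W).
units = np.stack([delta(i, j).ravel() for i in range(n) for j in range(m)])
assert np.linalg.matrix_rank(units) == n * m
```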

In short, a basis for $\mathcal{L}(V,W)$ when $V$ and $W$ are finite dimensional (in fact, it suffices for $V$ to be finite dimensional when $W$ has a basis) is obtained by fixing a basis $\beta$ for $V$, a basis $\gamma$ for $W$, and taking for each $v\in \beta$ and $w\in \gamma$ the linear transformation that sends $v$ to $w$ and all other vectors in $\beta$ to $0$.

Arturo Magidin