
I'm stuck on proving the following theorem:

"Let $B = \{u_1,u_2, . . . ,u_m\}$ and $B' = \{v_1, v_2, . . . , v_k\}$ be bases for a non-zero subspace $S$ of $\mathbb R^n$. Then $m = k$. That is, any two bases for a subspace have the same cardinality."

I have thought about doing a proof by contradiction, considering the cases $m \gt k$ and $m \lt k$. Assume that $B$ and $B'$ are bases of $S$, and suppose without loss of generality that $m \gt k$. Is it then true that either $B$ is linearly dependent or $B'$ does not span $S$? If so, how would you prove it? I've tried so many different ways and I can't seem to find one that works.

It's not enough to argue by extending or reducing one of the sets, since it is possible that $B \cap B' = \emptyset$.

Any help would be much appreciated. Thank you in advance.

Tim

2 Answers


Assume that $k\leqslant m$. Since $B$ is a basis, $B\setminus\{u_1\}$ doesn't span $S$, and therefore there is some $v_j$ which is not a linear combination of the elements of $B\setminus\{u_1\}$ (if every $v_j$ were such a combination, then $B\setminus\{u_1\}$ would span $S$, because $B'$ does). We can assume without loss of generality that $j=1$. So $\{v_1,u_2,u_3,\ldots,u_m\}$ is another basis of $S$. We can start again and deduce that $\{v_1,v_2,u_3,\ldots,u_m\}$ is a basis of $S$, and so on. After $k$ steps we deduce that $\{v_1,v_2,\ldots,v_k,u_{k+1},\ldots,u_m\}$ is a basis of $S$. But $B'$ is a basis of $S$ too, so if $m>k$ then $u_{k+1}$ would be a linear combination of $v_1,\ldots,v_k$, contradicting linear independence. Hence $m=k$.
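To spell out the exchange step (this is the usual exchange-lemma computation, written out here for the first step only): since $B$ spans $S$, we can write $v_1 = c_1 u_1 + c_2 u_2 + \cdots + c_m u_m$, and $c_1 \neq 0$ because $v_1$ is not a linear combination of $u_2, \ldots, u_m$. Solving for $u_1$,

$$u_1 = \frac{1}{c_1}\left(v_1 - c_2 u_2 - \cdots - c_m u_m\right),$$

so $u_1$ lies in the span of $\{v_1, u_2, \ldots, u_m\}$ and this set still spans $S$. It is also linearly independent: in any dependence relation the coefficient of $v_1$ must vanish (otherwise $v_1$ would be a combination of $u_2, \ldots, u_m$), and then the remaining coefficients vanish because $B$ is independent.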

The case in which $k\geqslant m$ is similar.


As you said, we can use a proof by contradiction.

Assume $m \neq k$. Without loss of generality we can assume $m < k$. Since every vector in $S$ can be written uniquely as a linear combination of the vectors in the basis $B$, we can identify vectors in $S$ with vectors in $\mathbb{R}^m$ (where each component is the coefficient of the corresponding vector in $B$). In particular, we can identify the vectors $\{v_1,v_2,\dots,v_k\}$ with column vectors $\{x_1,x_2,\dots,x_k\}$. Since $k > m$, these column vectors are linearly dependent, hence so are the $v_j$, and $B'$ cannot be a basis. This is the desired contradiction.
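To make the identification explicit: writing $v_j = a_{1j} u_1 + a_{2j} u_2 + \cdots + a_{mj} u_m$ (the coefficients are unique because $B$ is linearly independent), the corresponding column vector is

$$x_j = \begin{pmatrix} a_{1j} \\ a_{2j} \\ \vdots \\ a_{mj} \end{pmatrix} \in \mathbb{R}^m.$$

This coordinate map is linear and invertible on $S$, so any linear dependence among the $x_j$ gives the same dependence among the $v_j$, and conversely.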

The last fact can be seen as follows:

Suppose, as above, we have $k$ vectors $x_1, x_2, \dots, x_k$ in $\mathbb{R}^m$ with $k>m$.

Consider the vector equation

$$\alpha_1 x_1 + \alpha_2 x_2 + \dots + \alpha_k x_k = \underline 0$$

The vectors are linearly independent if and only if the only solution to this equation is $\alpha_1 = \alpha_2 = \dots = \alpha_k = 0$.

The vector equation corresponds to $m$ simultaneous linear equations (one for each of the $m$ entries) in $k$ unknowns. Since $k>m$, such a system has either no solutions or infinitely many. But $\alpha_1 = \alpha_2 = \dots = \alpha_k = 0$ is always a solution, so we must be in the case with infinitely many solutions. In particular there is a nonzero solution, and so the vectors are not linearly independent.
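As a small illustration (the numbers here are chosen only for the example): take $k = 3$ vectors in $\mathbb{R}^2$, say $x_1 = (1,0)^T$, $x_2 = (0,1)^T$, $x_3 = (1,1)^T$. The vector equation becomes two equations in three unknowns,

$$\alpha_1 + \alpha_3 = 0, \qquad \alpha_2 + \alpha_3 = 0,$$

whose solutions are $(\alpha_1, \alpha_2, \alpha_3) = (-t, -t, t)$ for $t \in \mathbb{R}$: infinitely many, and every $t \neq 0$ gives a nontrivial dependence.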

雨が好きな人