7

Say $v_1,v_2,\dots,v_k\in\mathbb{Q}^n$. Let $V$ be the subspace spanned by these vectors and let $W\subseteq\mathbb{R}^n$ be the vector subspace in $\mathbb{R}^n$ spanned by these vectors. Is it true that $\dim_\mathbb{Q} V=\dim_\mathbb{R} W$?

The equality seems obvious, and in fact it is easy to prove by induction on $n$:

If $n=1$, then the equality holds as either both $V$ and $W$ are $0$ or $V=\mathbb{Q}$ and $W=\mathbb{R}$.

For $n>1$, if $\dim_\mathbb{Q} V<n$, then after a $\mathbb{Q}$-linear change of coordinates the vectors lie in $\mathbb{Q}^{n-1}$, and the claim holds by induction. Hence I may assume that $\dim_\mathbb{Q} V=n$, so $V=\mathbb{Q}^n$. But in this case $W$ must also be all of $\mathbb{R}^n$, and hence the dimensions are equal.
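As a computational sanity check, the same Gaussian-elimination rank routine can be run over $\mathbb{Q}$ with exact fractions and over $\mathbb{R}$ with floats. A minimal sketch (the `rank` helper below is my own illustration, not part of the question):

```python
from fractions import Fraction

def rank(rows):
    """Rank via Gaussian elimination; the code is field-agnostic,
    so it works over Q (Fraction entries) or R (float entries)."""
    rows = [list(r) for r in rows]
    r = 0
    for c in range(len(rows[0]) if rows else 0):
        # find a pivot in column c at or below row r
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        # eliminate column c from every other row
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c] / rows[r][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

# v2 = 2*v1, so the span has dimension 2 over either field
vecs_Q = [[Fraction(1, 2), Fraction(1), Fraction(0)],
          [Fraction(1), Fraction(2), Fraction(0)],
          [Fraction(0), Fraction(0), Fraction(1)]]
vecs_R = [[float(x) for x in v] for v in vecs_Q]
print(rank(vecs_Q), rank(vecs_R))  # both give 2
```

The point is that the elimination steps never leave the field the entries started in.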

Is there a more natural/intuitive way to see why this equality holds?

user26857
  • 53,190
Levent
  • 5,240
  • 9
    Write the vectors as a $k\times n$ matrix. Reduce to row-echelon form. Then we have a matrix whose row space (either over $\Bbb Q$ or $\Bbb R$) is the same as the original, and whose dimensions over each field are manifestly the same. – Angina Seng Aug 02 '20 at 08:47
  • 2
    @AnginaSeng, this ought to be The Answer as it deals with the general field extension in an entirely elementary way. (I reckon it's not emphasised enough in courses that the operations we use in Gaussian reduction only use scalars in the smallest field where the original matrix entries live.) – ancient mathematician Aug 02 '20 at 10:00

3 Answers

4

WLOG, we may suppose the vectors are independent in $\mathbb{Q}^n$ (since if they aren't, we can just throw away vectors until we have a basis without changing $V$ or $W$).

What you're really proving is that if the vectors are linearly independent over $\mathbb{Q}$, then they are linearly independent over $\mathbb{R}$; the converse implication is clear.

The trick is to construct an orthogonal basis, as in the Gram–Schmidt procedure. This works in $\mathbb{Q}^n$ because the key step takes a vector $v$ and replaces it by $v' = v - \frac{v \cdot w}{|w|^2}\,w$. But if $v, w \in \mathbb{Q}^n$, then the replacement $v'$ is also in $\mathbb{Q}^n$.
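The rationality of each replacement step can be seen directly in code. Below is a minimal sketch (the `gram_schmidt` helper and the example vectors are mine, for illustration) in which every intermediate value stays an exact `Fraction`:

```python
from fractions import Fraction

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Orthogonalize a list of vectors. Each replacement
    v' = v - (v.w / w.w) w uses only field operations, so
    rational inputs produce rational outputs."""
    basis = []
    for v in vectors:
        for w in basis:
            coeff = dot(v, w) / dot(w, w)  # exact rational arithmetic
            v = [a - coeff * b for a, b in zip(v, w)]
        if any(a != 0 for a in v):  # keep only nonzero vectors
            basis.append(v)
    return basis

vs = [[Fraction(1), Fraction(1)], [Fraction(1), Fraction(3)]]
ortho = gram_schmidt(vs)
# the two output vectors are orthogonal, with Fraction entries throughout
```

Note that full Gram–Schmidt would also normalize, but normalization needs square roots and would leave $\mathbb{Q}^n$; orthogonalization alone suffices here and stays rational.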

Doctor Who
  • 3,471
  • Thank you for your answer. You definitely show what we want, but the theorem holds for any field extension so I'd be more happy with an argument that is more intuitive and basic. If there'll be no other answer, I'll accept yours. – Levent Aug 02 '20 at 09:45
  • The definition of $v'$ holds in any field extension. One can think of it as a column/row operation taking $v$ and subtracting a multiple of $w$. So it's a special case of Angina Seng's comment. – Chrystomath Aug 02 '20 at 14:34
3

If you collect the $\mathbf v_j$ into a matrix, you can make this a statement about determinants, which are polynomials in the matrix entries.

$\mathbf V:=\bigg[\begin{array}{c|c|c|c} \mathbf v_1 & \mathbf v_2 &\cdots & \mathbf v_{k}\end{array}\bigg]$

Working over $\mathbb Q$ (or any subfield of $\mathbb R$), we have $\operatorname{rank}\big(\mathbf V\big) = r$ for some $r$.

Now use the fact that a matrix has rank $r$ iff it has some $r\times r$ submatrix with nonzero determinant, while for every $m\gt r$ all $m\times m$ submatrices have zero determinant.

These determinants don't change when you consider $\mathbb R$, or any extension, hence the rank doesn't change either.
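This characterization of rank can be implemented directly. A minimal sketch (the `det` and `rank_via_minors` helpers are my own illustration, and cofactor expansion is only practical for tiny matrices): every minor is computed from the matrix entries alone, so its value is the same over $\mathbb{Q}$ and over any extension.

```python
from fractions import Fraction
from itertools import combinations

def det(m):
    """Determinant by cofactor expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    return sum((-1) ** j * m[0][j] * det([row[:j] + row[j + 1:] for row in m[1:]])
               for j in range(n))

def rank_via_minors(m):
    """rank = largest s such that some s x s submatrix has
    nonzero determinant; the minors are the same numbers over
    the base field and over any extension field."""
    k, n = len(m), len(m[0])
    for s in range(min(k, n), 0, -1):
        for rows in combinations(range(k), s):
            for cols in combinations(range(n), s):
                if det([[m[i][j] for j in cols] for i in rows]) != 0:
                    return s
    return 0

# rows 1 and 2 are proportional, so the rank is 2
m = [[Fraction(1), Fraction(2)],
     [Fraction(2), Fraction(4)],
     [Fraction(0), Fraction(1)]]
print(rank_via_minors(m))  # 2
```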

user8675309
  • 12,193
1

Method $1$: (merely elaborating on the comment)

Let $K$ be a field and $F\subseteq K$ a subfield (e.g. the rationals inside the reals), and let $1\leq k \leq n$. Given $v_1, \dots, v_k \in F^n$, store them as the rows of a $k\times n$ matrix $B$. Now, we can regard $B$ as an element of $M_{k\times n}(F)$ or of $M_{k\times n}(K)$, and your question really asks whether: \begin{align} \dim_F[\text{image}_F(B)] &\stackrel{?}{=} \dim_K[\text{image}_K(B)] \end{align} (hopefully it's clear from the field subscripts that your $V$ is just $\text{image}_F(B)$ etc.)

The answer is yes, because \begin{align} \dim_F[\text{image}_F(B)] &= \text{no. of non-zero rows in RREF of $B$ over $F$} \\ &= \text{no. of non-zero rows in RREF of $B$ over $K$} \\ &= \dim_K[\text{image}_K(B)] \end{align} The first (and third) equal signs are true because when we calculate the RREF of a matrix, all we do is perform elementary row operations (which amounts to multiplying $B$ on the left by an invertible matrix; i.e., on the level of transformations, we're composing with an appropriate isomorphism, and these of course preserve the dimension of all subspaces).

The second equal sign follows from the fact that the RREF of $B$, whether calculated over $F$ or over $K$, is the exact same matrix. To see why, just trace through the process of computing the RREF: the row operations (swapping rows, scaling a row, adding a multiple of one row to another) involve only arithmetic on the existing entries, so the whole computation can be carried out entirely within the smaller field $F$.
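The two computations can be compared directly. Below is a minimal sketch (the `rref` helper and the example matrix are mine, not from the answer) that runs the same row reduction over $\mathbb{Q}$ using exact `Fraction` arithmetic and over $\mathbb{R}$ using floats; the results agree entry by entry:

```python
from fractions import Fraction

def rref(rows):
    """Reduced row-echelon form via elementary row operations.
    The operations use only arithmetic on the existing entries,
    so the result over Q equals the result over R."""
    rows = [list(r) for r in rows]
    r = 0
    for c in range(len(rows[0])):
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue
        rows[r], rows[piv] = rows[piv], rows[r]
        rows[r] = [a / rows[r][c] for a in rows[r]]  # scale pivot row to pivot 1
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:
                f = rows[i][c]
                rows[i] = [a - f * b for a, b in zip(rows[i], rows[r])]
        r += 1
    return rows

B = [[Fraction(2), Fraction(4)], [Fraction(1), Fraction(3)]]
over_Q = rref(B)
over_R = rref([[float(x) for x in row] for row in B])
# the two RREFs agree entry by entry (here both are the identity matrix)
```

(With floats, round-off could in principle blur a pivot test on nastier inputs; the exact `Fraction` computation is the one that mirrors the argument in the answer.)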


The second approach is of course the one using determinants given in @user8675309's excellent answer. The determinant of a square matrix has an explicit formula involving only the entries of the matrix, and in this case all those entries come from the smaller field $F$. Therefore, the determinant of EVERY $s\times s$ submatrix of $B$ is the same number whether computed over $F$ or over $K$ (and is of course an element of $F$); hence the rank of $B$, whether calculated over $F$ or over $K$, is the same.

user26857
  • 53,190
user580918
  • 1,095