5

Does a basis for an $n$-dimensional vector space have to have $n$ vectors? For example, if I form a basis for $\mathbb{R}^n$, do I need at least $n$ vectors in my basis set?

In other words, can I form a basis for $\mathbb{R}^n$ using only $n-1$ or fewer vectors?

Note that this question concerns a basis for the whole vector space, not a basis for a subspace.

user642796
  • 53,641
Fan
  • 791
  • 7
    At first sight, this seems like a nonsensical question, but proving that all bases of a finite-dimensional linear space have the same number of elements is not completely trivial. The proof requires a fairly complicated induction argument. Luckily, almost every book on linear algebra gives a proof of this theorem, usually requiring a couple of pages of lemmas. E.g. MacLane/Birkhoff "Algebra", page 200, Serge Lang "Algebra", pages 140–141, Pinter "Abstract Algebra", page 287, and 10 other books in my list. – Alan U. Kennington Jul 05 '15 at 02:44
  • It's not obvious that a vector space can't have both a basis of size $ m $ and a basis of size $ n $, where $ m \neq n $, but this is proved in linear algebra books. (And arguably this is one of the deep insights of linear algebra, successfully defining the notion of "dimension".) If someone has a way of explaining this that makes it seem easy / obvious, I'd be interested in learning that. – littleO Jul 05 '15 at 02:49
  • 1
    Just for the record, here are my other references for this deep theorem. Ash, "Basic abstract algebra", pages 93–94, Cullen, "A book of abstract algebra", page 85, Seth Warner, "Modern algebra", page 640, Franklin, "Matrix algebra", pages 35–36, Shilov, "Linear algebra", pages 40–41, Schneider/Barker, "Matrices and linear algebra", pages 125–126, Stoll, "Linear algebra and matrix theory", page 40, Curtis, "Linear algebra", page 37, Hartley/Hawkes, "Rings, modules and linear algebra", page 101, Kaplan/Lewis, "Calculus and linear algebra", pages 117 and 677. – Alan U. Kennington Jul 05 '15 at 02:57

5 Answers

3

Yes, that is precisely the definition of the dimension: the number of vectors you need in a basis.

TY Mathers
  • 19,533
  • So if I have $n-1$ vectors, then in the best-case scenario they form a basis of a subspace of $\mathbb{R}^n$? Therefore a set of $n-1$ vectors can never be a basis for the entire $n$-dimensional vector space $\mathbb{R}^n$? – Fan Jul 05 '15 at 02:40
  • But, we must prove that "dimension" is well-defined. – littleO Jul 05 '15 at 02:44
  • Exactly. For example, if you try to make a basis of two vectors in $\mathbb{R}^3$, you'll just get a subspace that's isomorphic to $\mathbb{R}^2$ – TY Mathers Jul 05 '15 at 02:46
  • What do you mean? It's only a linear algebra question. And the example is R^n in this case. – Fan Jul 05 '15 at 02:46
  • @mathers101 Do we have any theorem to support this claim? – Fan Jul 05 '15 at 02:46
  • @Fan Well, there's a theorem that states that any two vector spaces with the same dimension are isomorphic. So yes. – TY Mathers Jul 05 '15 at 02:48
1

The dimension of a vector space is defined as the number of linearly independent vectors needed to span the space. A set of $n-1$ vectors may be linearly independent, but it cannot span an $n$-dimensional vector space.
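This can be checked concretely. The sketch below (plain Python; the two vectors $u$, $v$ and the range of coefficients are arbitrary choices for illustration) shows that no linear combination of two particular vectors in $\mathbb{R}^3$ ever reaches the third standard basis vector:

```python
# Toy check: two vectors in R^3 cannot span R^3.
# u and v are example vectors; e3 lies outside their span.
u = (1, 0, 0)
v = (0, 1, 0)
e3 = (0, 0, 1)

def combo(a, b):
    """Return the linear combination a*u + b*v."""
    return tuple(a * ui + b * vi for ui, vi in zip(u, v))

# The third coordinate of a*u + b*v is always 0, so e3 is never reached.
samples = [combo(a, b) for a in range(-3, 4) for b in range(-3, 4)]
assert all(c[2] == 0 for c in samples)
assert e3 not in samples
```

Of course, a finite sample of coefficients is not a proof; here the third coordinate is identically zero, which is exactly the "cannot span" claim for this pair of vectors.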

0

If each column of the $n\times n$ identity matrix were a linear combination of $n-1$ vectors $c_1$, $\ldots$, $c_{n-1}$, then by multilinearity and skew-symmetry the determinant of the identity matrix would be $0$, contradicting the fact that it is $1$.

Works for any commutative ring with $1$.
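The determinant argument can be illustrated numerically. In the sketch below (plain Python; the vectors $c_1$, $c_2$ and the coefficient range are arbitrary choices), every $3\times 3$ matrix whose columns are combinations of just two vectors has determinant $0$, whereas the identity has determinant $1$:

```python
# Sketch: any 3x3 matrix whose columns are combinations of just two
# vectors c1, c2 has determinant 0, so the identity matrix (det = 1)
# cannot have its columns expressed that way.
c1 = (1, 2, 3)   # example vectors, chosen arbitrarily
c2 = (4, 5, 6)

def col(a, b):
    """Return the combination a*c1 + b*c2."""
    return tuple(a * x + b * y for x, y in zip(c1, c2))

def det3(m):
    """Cofactor expansion of a 3x3 determinant; m is a list of columns."""
    (a, d, g), (b, e, h), (c, f, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

# Columns built from c1 and c2 with various coefficients: det is always 0.
dets = [det3([col(1, 0), col(0, 1), col(a, b)])
        for a in range(-2, 3) for b in range(-2, 3)]
assert all(dv == 0 for dv in dets)
```

Since only integer arithmetic is used, the zeros are exact, matching the claim that the argument works over any commutative ring with $1$.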

orangeskid
  • 56,630
  • Yes, but the properties of the determinant are proved in much the same way as the proof that bases of a finite-dimensional linear space have the same cardinality. In linear algebra, it's very easy to accidentally use a theorem to prove itself in a cyclic fashion. I know this because I'm writing a little book about it right now. – Alan U. Kennington Jul 05 '15 at 03:03
  • 1
    Yes, that's right. But where do the "basic properties" of determinants come from? They are proved by the same methods as the properties of bases of finite dimensional linear spaces. – Alan U. Kennington Jul 05 '15 at 03:22
0

There is a theorem about vector spaces which says: "If $V$ has a basis with $n$ elements, then every set of vectors in $V$ with more than $n$ elements is linearly dependent." So let $W$ be a basis of $V$ and let $S$ be a proper subset of $W$; then $W$ and $S$ are both linearly independent. We have to prove that $S$ can never span $V$. Suppose, for contradiction, that $S$ spans $V$. Then, by the definition of a basis, $S$ is a basis of $V$, and by the theorem above any set with more elements than $S$ is linearly dependent. Since $W$ has more elements than $S$, this contradicts the fact that $W$ is linearly independent. Hence a basis for an $n$-dimensional vector space must have $n$ vectors.
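The quoted theorem can be illustrated with a toy check (plain Python; the four example vectors are arbitrary, and the dependence relation was built in by hand so it can be verified explicitly): four vectors in $\mathbb{R}^3$ must be linearly dependent, i.e. some nontrivial combination of them is the zero vector.

```python
# Toy illustration: four vectors in R^3 are linearly dependent.
# w was deliberately constructed from v1, v2, v3, making the
# nontrivial vanishing combination explicit.
v1 = (1, 0, 0)
v2 = (0, 1, 0)
v3 = (0, 0, 1)
w  = (1, 2, 3)   # w = 1*v1 + 2*v2 + 3*v3

def add_scaled(vectors, coeffs):
    """Return the linear combination sum(c * v) coordinate-wise."""
    return tuple(sum(c * v[i] for c, v in zip(coeffs, vectors))
                 for i in range(3))

# A nontrivial combination (coefficients not all zero) equal to zero:
dependence = add_scaled([v1, v2, v3, w], [1, 2, 3, -1])
assert dependence == (0, 0, 0)
```

The theorem guarantees such coefficients exist for *any* four vectors in $\mathbb{R}^3$; this example merely exhibits one concrete instance.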

-1

The definition of a basis of $\Bbb R^m$ is a set of vectors that is both linearly independent and spans $\Bbb R^m$. Assume the set contains $n$ vectors. Those $n$ vectors in $\Bbb R^m$ form a matrix $[v_1\ v_2\ \cdots\ v_n]$. If the vectors are linearly independent, the matrix must have a pivot in each of its $n$ columns (if some column had no pivot, there would be a free variable, which leads to linear dependence).

That shows $n \leq m$ (if $n > m$, there would be columns without pivots, and again linear dependence). If you are not convinced, write down a $2 \times 3$ or $2 \times 4$ matrix and check that its columns are linearly dependent. Now consider the second requirement: the vectors must span $\Bbb R^m$.

In that case the matrix must have a pivot in each row (if some row of the echelon form were $[0, 0, \cdots, 0]$, the columns could not span $\Bbb R^m$). That gives $n \geq m$ (again, try some matrices with $n < m$ and check that their columns do not span $\Bbb R^m$). Combining the two conclusions, $n \leq m$ and $n \geq m$, so $n = m$. Therefore a basis of $\Bbb R^m$ must consist of exactly $m$ vectors, and the corresponding matrix is $m \times m$.
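The pivot counting can be sketched with exact arithmetic (a minimal Gaussian elimination in Python using `fractions.Fraction`; the example matrices below are arbitrary illustrations):

```python
from fractions import Fraction

def pivot_count(rows):
    """Count pivots via exact Gaussian elimination to row echelon form."""
    m = [[Fraction(x) for x in row] for row in rows]
    pivots = 0
    col = 0
    while pivots < len(m) and col < len(m[0]):
        # Find a row at or below the current position with a nonzero
        # entry in this column; if none, move to the next column.
        pivot_row = next((r for r in range(pivots, len(m))
                          if m[r][col] != 0), None)
        if pivot_row is None:
            col += 1
            continue
        m[pivots], m[pivot_row] = m[pivot_row], m[pivots]
        # Eliminate entries below the pivot.
        for r in range(pivots + 1, len(m)):
            factor = m[r][col] / m[pivots][col]
            m[r] = [a - factor * b for a, b in zip(m[r], m[pivots])]
        pivots += 1
        col += 1
    return pivots

# A 3x2 matrix: a pivot in each column (independent columns), but only
# 2 pivots for 3 rows, so the two columns cannot span R^3.
tall = [[1, 0], [0, 1], [1, 1]]
assert pivot_count(tall) == 2

# A 2x3 matrix: a pivot in each row, but only 2 pivots for 3 columns,
# so the three columns in R^2 are linearly dependent.
wide = [[1, 0, 1], [0, 1, 1]]
assert pivot_count(wide) == 2
```

Exact rational arithmetic avoids the floating-point rank ambiguities that a numerical elimination would introduce.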

Faust
  • 5,817