
Suppose I have a vector space $V$ of dimension $n$ with basis elements $a_1, \dots, a_n$.

If I take the direct product of $k$ copies of $V$ to get $V^k$, then I'm assuming (since I haven't actually seen this proved) that $\dim(V^k) = n^k$, and that we'd have as basis elements the $k$-tuples $$\left(a_{i_1}, \dots, a_{i_k}\right)$$ for every $k$-tuple $I = (i_1, \dots, i_k)$ of integers from the set $\{1, \dots , n\}$.

Now I want to believe the above, but here's an example where I think the conjecture fails. Take $V = \mathbb{R}$ over the field $\mathbb{R}$; then $V$ has basis $\{1\}$. However, $V^2 = \mathbb{R}^2$ has basis $\{(1, 0), (0, 1)\}$, so $\dim(V^2) = 2$, whereas the conjecture would give $\dim(V^2) = 1^2 = 1 \neq 2$, a contradiction.
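As a quick numerical sanity check of this counterexample (a plain-Python sketch; the `rank` helper below is my own small Gaussian-elimination routine, not a library function):

```python
def rank(rows):
    """Rank of a list of row vectors, computed by Gaussian elimination."""
    m = [list(map(float, r)) for r in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if abs(m[i][c]) > 1e-9), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and abs(m[i][c]) > 1e-9:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# V = R with basis {1} (n = 1), k = 2: the conjecture proposes the single
# tuple (1, 1) as a basis of V^2 = R^2, but it spans only a 1-dimensional line.
print(rank([(1.0, 1.0)]))  # 1, yet dim(R^2) = 2

# V = R^2 (n = 2), k = 2: the n^k = 4 tuples (e_i, e_j), flattened into R^4,
# are linearly dependent: (e1,e1) - (e1,e2) - (e2,e1) + (e2,e2) = 0.
e = [(1.0, 0.0), (0.0, 1.0)]
tuples = [ei + ej for ei in e for ej in e]
print(len(tuples), rank(tuples))  # 4 tuples, rank only 3
```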

Does my above conjecture ever hold? If so under what constraints?

Perturbative
    Rather $k\cdot n$, using $(a_1,0,...,0),...,(a_n,0,...,0),...,...,...,(0,...,0,a_n)$. –  Jul 19 '18 at 18:43
  • If $n$ is large enough, then your set of vectors contains $v_1=(a_1,a_2,...,a_2)$, $v_2=(a_2,a_2,...,a_2)$, $v_3=(a_1,a_3,...,a_3)$, and $v_4=(a_2,a_3,...,a_3)$. Then $v_1-v_2+v_3-v_4=0$ is a non-trivial combination of $4$ of them that results in the zero vector. Therefore, they are linearly dependent. –  Jul 19 '18 at 18:53

2 Answers


You can use the Grassmann formula, which states: $$\dim(U+V) = \dim(U)+\dim(V)-\dim(U \cap V)$$ If $U$ and $V$ are in direct sum with each other, then $$U\cap V = \{0\}$$ (the intersection is the zero subspace, not the empty set), so the formula becomes $$\dim(U+V)=\dim(U)+\dim(V)=\dim(U\oplus V)$$ This generalises as follows:

Let $V$ and $U$ be finite dimensional vector spaces over a field $K$ then $$\dim(U\oplus V)=\dim(U)+\dim(V)$$

In your example $$\dim(\mathbb{R}^n)=\dim(\mathbb{R}\oplus\mathbb{R}\oplus\dots\oplus\mathbb{R}) = n\dim(\mathbb{R}) = n$$
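The additivity of dimension under direct sums can also be checked concretely (a plain-Python sketch; `rank` is a small Gaussian-elimination helper written here for illustration, not a library routine):

```python
def rank(rows):
    """Rank of a list of row vectors, computed by Gaussian elimination."""
    m = [list(map(float, r)) for r in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if abs(m[i][c]) > 1e-9), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and abs(m[i][c]) > 1e-9:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Realise U (+) V inside R^{2+3}: a basis of the direct sum consists of the
# block vectors (u_i, 0) together with (0, v_j).
U = [(1.0, 0.0), (0.0, 1.0)]                              # basis of U, dim(U) = 2
V = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]  # basis of V, dim(V) = 3
zeros_U, zeros_V = (0.0,) * 2, (0.0,) * 3
basis = [u + zeros_V for u in U] + [zeros_U + v for v in V]
print(rank(basis))  # 5 = dim(U) + dim(V)
```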

For a more in-depth analysis, see this forum post


Hint: $\displaystyle\dim(V_1\times V_2\times\cdots\times V_n)=\dim(V_1)+\dim(V_2)+\cdots+\dim(V_n).$
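Applied to the situation in the question, with every factor equal to $\mathbb{R}$, the hint gives $$\dim(\underbrace{\mathbb{R}\times\cdots\times\mathbb{R}}_{k})=k\cdot\dim(\mathbb{R})=k,$$ so in general $\dim(V^k)=k\dim(V)=kn$ rather than $n^k$.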