
I am looking for a concrete example of expressions like $$ V_A\otimes V_B = V_C\oplus V_D $$ that shows explicitly what the basis elements actually look like. My attempt was the following: let's take $V_A$ as a subspace of $\mathbb R^3$, namely the 1-dimensional space that spans the "x-line", with basis $B_A=\{e_x\}$. Let $V_B$ be the y-z plane with basis $B_B=\{e_y,e_z\}$. Let the new product space be $$ V_{AB} = V_A\otimes V_B $$ The basis for this new space should have two elements and is obtained by taking all possible tensor products of the basis vectors of the component spaces, $$ B_{AB} = \{e_x\otimes e_y, \ e_x\otimes e_z \} $$ I hope everything is correct up to this point. Now comes the part that I don't know how to do: the decomposition of this two-dimensional space into two 1D spaces. Can we write $$ V_{AB} = V_{I} \oplus V_{II} $$ where $V_I, V_{II}$ are each 1D spaces? If so, what do the bases look like?

  • By writing down a basis for the tensor product, you've already answered the question as stated: take $V_I=\text{span}_{\Bbb{R}}(e_x\otimes e_y)$ and $V_{II}=\text{span}_{\Bbb{R}}(e_x\otimes e_z)$. Is that answer unsatisfying in some way? – Eric Nathan Stucky Dec 10 '21 at 13:07
  • @EricNathanStucky The following link shows that the symbol $\oplus$ is overloaded, and I am not sure which one is the correct one here. The basis functions in the link are extended with zeros. Should I do that or not? Which is why I'd like an explicit example that clears that up. https://math.stackexchange.com/questions/601238/dimension-of-direct-sum-of-same-vector-spaces?rq=1 I am also unsure whether $V_I$ is overloaded. Does the $V_I$ in the equation $V_{AB}=V_I\oplus V_{II}$ mean exactly the same as $V_I$ as a 1D space? – Hans Wurst Dec 10 '21 at 13:24
  • Relevant: https://math.stackexchange.com/questions/1223865/computing-bases-for-direct-wedge-tensor-products-etc-of-given-vector-spaces/1224039#1224039 – Travis Willse Dec 10 '21 at 14:18

1 Answer


This is a response to the OP's comment, which is too long for a comment.

Conventions

The two objects denoted by $\oplus$ are sometimes called the "internal" and "external" direct sums. External direct sums always make sense for any two vector spaces, and elements are literally ordered pairs whose elements come from the summands. Internal direct sums require that the summands both live in a common ambient vector space, and their elements are literally vectors in that ambient space. In case the internal sum makes sense, one can prove that the map $V\oplus_{ext} W \to V\oplus_{int} W$ sending $(v,w)$ to $v+w$ is an isomorphism (and is the "best" kind of isomorphism in any sense you might mean that, e.g. functorial), so they are for all intents and purposes the same object.
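As an aside, the isomorphism between the two notions can be illustrated numerically. Here is a small sketch of my own (using NumPy, which is not part of the original discussion), with $V$ and $W$ realized as the x-axis and the y-z plane inside $\Bbb{R}^3$:

```python
import numpy as np

# Internal direct sum: V and W are subspaces of a common ambient space R^3.
v = np.array([2.0, 0.0, 0.0])   # an element of V = span(e_x)
w = np.array([0.0, 3.0, 5.0])   # an element of W = span(e_y, e_z)

# External direct sum: an element is literally the ordered pair (v, w).
pair = (v, w)

# The isomorphism from the external to the internal sum sends (v, w) to v + w.
internal = pair[0] + pair[1]
print(internal)  # [2. 3. 5.]
```

The map is injective precisely because $V\cap W=\{0\}$: if $v+w=0$ with $v\in V$, $w\in W$, then $v=w=0$.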

In vector space decompositions such as yours, it is extremely common to use the symbol "=" to mean "isomorphic (in the best needed way)". This abuse of notation is very well-justified in practice, e.g. I may want to construct the tensor product as a set of matrices instead of writing down an abstract basis as you've done, and it's silly to let this "linguistic" difference get in the way.

But for the purposes of this question, it's clear that you mean we should both agree that $V\otimes W$ means $\text{span}_{\Bbb R} \{v\otimes w:v\in V, w\in W\}$, and that you mean "=" to mean "literally equal as sets". In this case, we must use the internal direct sum, since the left-hand side is not constructed set-theoretically as a direct sum (unless we have a very strange construction of the external direct sum).

Since you have constructed the ambient vector space $V_{AB}$, in which both $V_I$ and $V_{I\!I}$ live, this is not a problem. We simply need to find two subspaces of $V_{AB}$ with trivial intersection that span the space.


Construction

Literally speaking, $$ V_{AB} = \left\{ a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) + b\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes \begin{bmatrix}0\\0\\1\end{bmatrix}\right) : a,b\in \Bbb{R}\right\}$$

Thus, one possible choice for $V_I$ and $V_{I\!I}$ would be $$ V_{I} = \left\{ a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) : a\in \Bbb{R}\right\}$$ $$ V_{I\!I} = \left\{ b\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\0\\1\end{bmatrix}\right) : b\in \Bbb{R}\right\}.$$

The natural bases for these spaces are the obvious ones: just remove the coefficients.
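For readers who like to compute, the simple tensors above can be realized concretely as Kronecker products of coordinate vectors. The following is a sketch of my own (NumPy is my choice of tool, not part of the original discussion):

```python
import numpy as np

e_x = np.array([1.0, 0.0, 0.0])
e_y = np.array([0.0, 1.0, 0.0])
e_z = np.array([0.0, 0.0, 1.0])

# Realize the simple tensors e_x (x) e_y and e_x (x) e_z as Kronecker
# products; each lives in R^9, a concrete model of R^3 (x) R^3.
b1 = np.kron(e_x, e_y)
b2 = np.kron(e_x, e_z)

# A generic element of V_AB is a*b1 + b*b2, exactly as in the set above.
a, b = 2.0, -1.0
t = a * b1 + b * b2

# The two basis tensors are linearly independent, so V_AB is 2-dimensional.
rank = np.linalg.matrix_rank(np.stack([b1, b2]))
print(rank)  # 2
```

Here $V_I$ is all multiples of `b1` and $V_{I\!I}$ all multiples of `b2`; they intersect only in the zero vector and together span `V_AB`.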

This is of course not the only choice*, but to address the other question in your comment, it is not even necessary that $V_I$ has dimension 1. It could just as easily be the zero subspace or the full $V_{AB}$ (leaving $V_{I\!I}$ to be the other one). However, because this is "boring", it is sometimes called the trivial direct sum decomposition. So in that sense, the answer to your question is yes: in your example, all nontrivial decompositions will have both summands of dimension 1.

* I say "of course" in the sense that there is the usual freedom that one has in (direct) sum constructions. For instance, a different choice would be $V_{I\!I}$ as before, but $$ V_{I} = \left\{ 3a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\1\\0\end{bmatrix}\right) - 2a\left(\begin{bmatrix}1\\0\\0\end{bmatrix}\otimes\begin{bmatrix}0\\0\\1\end{bmatrix}\right) : a\in \Bbb{R}\right\},$$ and other such things.
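One can check numerically that this alternative pair still gives a direct sum decomposition: the two spanning vectors are linearly independent, so the subspaces intersect trivially and together span the 2-dimensional $V_{AB}$. Again a NumPy sketch of my own, not part of the original answer:

```python
import numpy as np

e_x = np.array([1.0, 0.0, 0.0])
e_y = np.array([0.0, 1.0, 0.0])
e_z = np.array([0.0, 0.0, 1.0])

b1 = np.kron(e_x, e_y)   # e_x (x) e_y as a vector in R^9
b2 = np.kron(e_x, e_z)   # e_x (x) e_z

u = 3 * b1 - 2 * b2      # spans the alternative choice of V_I
v = b2                   # spans V_II as before

# Linear independence of the spanning vectors is equivalent to trivial
# intersection plus full span, i.e. a genuine direct sum decomposition.
rank = np.linalg.matrix_rank(np.stack([u, v]))
print(rank)  # 2
```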

  • Parenthetically, it's worth noting that there is no "internal tensor product"; the fact that both $V_A$ and $V_B$ live in the same space in your example doesn't buy us anything. – Eric Nathan Stucky Dec 10 '21 at 14:20
  • This helps. I was unsure how to expand $e_x$, for example. In its original 1D space I would have denoted it as $\left[1\right]$ and not as $\begin{bmatrix}1\\0\\0\end{bmatrix}$. But that would have made little sense in expressions like $e_x\otimes e_y$ as a basis for our new space. I take it that expressions like $e_x\times 0 \times 0$ refer to this. So $e_x \rightarrow e_x\times 0 \times 0$ is equivalent to $\left[1\right] \rightarrow \begin{bmatrix}1\\0\\0\end{bmatrix}$. – Hans Wurst Dec 10 '21 at 15:17
  • @Hans Wurst: I think your thought process is still a bit awry here. In your question you constructed $V_A$ explicitly as a subspace of $\Bbb{R}^3$, so calling $e_x$ by the name $[1]$ is literally wrong (although an understandable abuse of notation). On the other hand, there's nothing wrong with mixed expressions in tensor products (this is part of what I was getting at with "no internal tensor products"). If you had constructed $V_A=\Bbb{R}^1$ and $V_B=\Bbb{R}^2$, with no $\Bbb{R}^3$ involved, then $[1]\otimes \begin{bmatrix}1\\0\end{bmatrix}$ is a perfectly fine element of $V_A\otimes V_B$. – Eric Nathan Stucky Dec 10 '21 at 16:41
  • And sorry, if this caused any confusion: in my first comment I should have said "...$V_A$ and $V_B$ are subspaces of the same space..."; they are of course different subspaces. – Eric Nathan Stucky Dec 10 '21 at 16:43
  • Indeed, I'm a chemist trying to get a proper handle on tensor product spaces, since they crop up immediately in quantum mechanics. Details like this were never mentioned in my lectures, so it's extremely murky what these expressions mean in detail. Your explanation is very welcome. – Hans Wurst Dec 10 '21 at 16:52