
From this entry in Wikipedia:

The tensor product of two vector spaces $V$ and $W$ over a field $K$ is another vector space over $K$. It is denoted $V\otimes_K W$, or $V\otimes W$ when the underlying field $K$ is understood.

If $V$ has a basis $e_1,\cdots,e_m$ and $W$ has a basis $f_1,\cdots,f_n$, then the tensor product $V\otimes W$ can be taken to be a vector space spanned by a basis consisting of all pairs $(e_i,f_j)$; each such basis element of $V\otimes W$ is denoted $e_i\otimes f_j$. For any vectors $v=\sum_i v_ie_i\in V$ and $w=\sum_j w_j f_j\in W$ there is a corresponding product vector $v\otimes w$ in $V\otimes W$ given by $\sum_{ij}v_iw_j(e_i\otimes f_j)\in V\otimes W.$ This product operation $\otimes:V\times W \rightarrow V\otimes W$ is quickly verified to be bilinear.
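In coordinates, the product $v\otimes w$ amounts to collecting the coefficients $v_iw_j$ into a matrix, which makes the bilinearity easy to check numerically. A minimal numpy sketch (the helper name `tensor` is just illustrative, not standard API):

```python
import numpy as np

# Model the coefficients of v ⊗ w as the matrix with entries v_i * w_j.
# ("tensor" is an illustrative helper name, not part of the excerpt.)
def tensor(v, w):
    return np.outer(v, w)

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([-1.0, 0.5, 4.0])
w  = np.array([2.0, 1.0, -3.0])
a  = 2.5

# Bilinearity: linear in the first slot (and, symmetrically, in the second).
assert np.allclose(tensor(a * v1 + v2, w), a * tensor(v1, w) + tensor(v2, w))
assert np.allclose(tensor(v1, a * w), a * tensor(v1, w))
```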

As an example, letting $V=W=\mathbb R^3$ (considered as a vector space over the field of real numbers) and considering the standard basis set $\{\hat x, \hat y,\hat z\}$ for each, the tensor product $V\otimes W$ is spanned by the nine basis vectors

$\{\hat x \otimes \hat x,\hat x \otimes \hat y,\hat x \otimes \hat z,\hat y \otimes \hat x,\hat y \otimes \hat y, \hat y \otimes \hat z ,\hat z\otimes \hat x,\hat z \otimes \hat y, \hat z \otimes \hat z \}$ and is isomorphic to $\mathbb R^9$.

For vectors $v=(1,2,3),\ w=(1,0,0)\in \mathbb R^3$ the tensor product is $$\bbox[10px, border:2px solid red]{v\otimes w= \hat x\otimes \hat x + 2\hat y\otimes \hat x+3\hat z\otimes \hat x}$$
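The boxed expression can be checked numerically: the coefficient of $e_i\otimes f_j$ is $v_iw_j$, so collecting all nine coefficients gives an outer product (a numpy sketch):

```python
import numpy as np

v = np.array([1, 2, 3])
w = np.array([1, 0, 0])

# Coefficient of the basis element e_i ⊗ f_j is v_i * w_j.
coeffs = np.outer(v, w)
print(coeffs)
# [[1 0 0]
#  [2 0 0]
#  [3 0 0]]   i.e. 1·x̂⊗x̂ + 2·ŷ⊗x̂ + 3·ẑ⊗x̂
```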

The above definition relies on a choice of basis, which cannot be done canonically for a generic vector space. However, any two choices of basis lead to isomorphic tensor product spaces (cf. the universal property described below). Alternatively, the tensor product may be defined in an expressly basis-independent manner as a quotient space of a free vector space over $V\times W$. This approach is described below.


QUESTION:

If we decide on the standard Euclidean orthonormal basis, what is the final expression of the $v\otimes w$ product in the red boxed expression? Do we eventually get rid of the vector expressions with hats (as well as the $\otimes$ symbols) to get a number as per the (approximate) idea of a tensor as a map from $V\times W\rightarrow \mathbb R?$

What if we change the bases from orthonormal to $\large\begin{bmatrix}\tilde x\\\tilde y\\\tilde z\end{bmatrix}=\begin{bmatrix}3&4&-1\\0&3&7\\1&3&0.5\end{bmatrix}\begin{bmatrix}\hat x\\\hat y\\\hat z\end{bmatrix}?$

  • The final expression is exactly what's in the box: the notation $\hat{x}, \hat{y}, \hat{z}$ was introduced as the standard basis. There is nothing to get rid of. Would you rather write $\hat{x}$ as $e_1$, $\hat{y}$ as $e_2$, and $\hat{z}$ as $e_3$? A tensor is not a (bilinear) map $V \times W \rightarrow \mathbf R$. That is a description that some authors use and it involves an abuse of duality. I find the way of describing a tensor product in the excerpt here to be hopelessly vague. – KCd Feb 24 '17 at 03:57
  • What if I modify the OP to give new bases? – Antoni Parellada Feb 24 '17 at 03:58
  • Do you mean something like (1.1) on http://www.math.uconn.edu/~kconrad/blurbs/linmultialg/tensorprod.pdf and comparing it to the computation 4 lines lower? You're not going to compute out a tensor product of two vectors in some basis and eventually get rid of the tensor product symbols. – KCd Feb 24 '17 at 04:03
  • @KCd I think your hint helped me a great deal (see answer). Thank you! I'm still intrigued by your criticism of the definition of a tensor as a map to the reals (field). What would be your English definition? – Antoni Parellada Feb 24 '17 at 15:46
  • Saying a tensor is a map to the reals is like saying a vector is a map to the reals: it is confusing a vector space $V$ with its dual space (the set of linear maps from $V$ to the reals). An English definition is... hard. The tensor product is the first construction in math that cannot be understood well without using its universal mapping property. It's the universal mapping property that drives everything about tensor products, as far as mathematicians are concerned. – KCd Feb 24 '17 at 19:21
  • @KCd this idea of maps to reals (or fields) messed up my understanding for so long, and it proliferates the tensor-holor confusion by blindly showing a single use of the tensor product, which makes one think that nothing about tensors HAS to be basis invariant. – Pineapple Fish Jun 29 '19 at 15:29
  • @KCd: So, a tensor $c_i\, a \otimes b$ represents (under some choice of basis) a bilinear map $f$ with $f(a,b)=c_i$? How would we represent as a matrix, under a choice of basis, say the "standard" $e_i =\delta_i^j$, say the dot product? – MSIS Oct 30 '19 at 00:25
  • (Cont.) And what is the map that takes a $p$-linear map $P(v_1,v_2,\dots,v_k) : (V_1 \times V_2 \times\dots\times V_k) \rightarrow K$, $K$ the base field, into a linear map $L$ on $(V_1 \otimes V_2 \otimes \dots \otimes V_k) \rightarrow K$? – MSIS Oct 30 '19 at 00:44
  • @MSIS I think that leaving aside the basis vectors, and focusing on the coefficients, the dot product would be $\sum_1^{3}a_ib^i $ in 3D. – Antoni Parellada Oct 30 '19 at 00:47
  • @AntoniParellada Edit: Thanks, but I am interested in seeing a linear, say matrix, representation of a k-linear map as a linear map on the tensor product. Is your description linear? If so, linear in what elements? Would that be on $a_i \otimes b_i$? – MSIS Oct 30 '19 at 00:49

1 Answer


Thanks for the hint in the comments. It's clearer now what the answer should be:


Applied to the case in the QUESTION, the change-of-basis matrix is $\small\begin{bmatrix}3&4&-1\\0&3&7\\1&3&0.5\end{bmatrix}$, and its inverse (entries rounded to one decimal place here and below) is $\small\begin{bmatrix}0.7&0.2&-1.1\\-0.3&-0.1&0.8\\0.1&0.2&-0.3\end{bmatrix}$. The vectors $v$ and $w$ in the new coordinate system are

$v =\small\begin{bmatrix}0.7&0.2&-1.1\\-0.3&-0.1&0.8\\0.1&0.2&-0.3\end{bmatrix}\begin{bmatrix}1\\2\\3\end{bmatrix} =\begin{bmatrix}-2.3\\1.9\\-0.5\end{bmatrix}$ and $w=\small\begin{bmatrix}0.7&0.2&-1.1\\-0.3&-0.1&0.8\\0.1&0.2&-0.3\end{bmatrix}\begin{bmatrix}1\\0\\0\end{bmatrix}=\begin{bmatrix}0.7\\-0.3\\0.1\end{bmatrix}$.
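These coordinate computations can be reproduced with numpy (a sketch; the full-precision inverse is used and the results are rounded afterwards, which is why recomputing with the rounded inverse above gives slightly different numbers):

```python
import numpy as np

A = np.array([[3, 4, -1],
              [0, 3, 7],
              [1, 3, 0.5]])
A_inv = np.linalg.inv(A)

v_new = A_inv @ np.array([1, 2, 3])
w_new = A_inv @ np.array([1, 0, 0])
print(np.round(v_new, 1))  # [-2.3  1.9 -0.5]
print(np.round(w_new, 1))  # [ 0.7 -0.3  0.1]
```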

Therefore,

$$\begin{align}\large v\otimes w&=\left(-2.3\tilde x + 1.9\tilde y -0.5 \tilde z\right)\otimes \left(0.7\tilde x -0.3\tilde y + 0.1\tilde z\right)\\&=-1.61\;\tilde x\otimes \tilde x + 0.69\;\tilde x\otimes \tilde y -0.23 \;\tilde x\otimes \tilde z + 1.33\;\tilde y\otimes \tilde x -0.57\;\tilde y\otimes \tilde y+ 0.19\;\tilde y\otimes \tilde z -0.35\;\tilde z\otimes \tilde x +0.15 \;\tilde z\otimes \tilde y-0.05\;\tilde z\otimes \tilde z\end{align}$$

So what's the point?

Starting from the definition of the tensor product of two vector spaces ($V\otimes W$), in coordinates we end up calculating the outer product of the two coordinate vectors:

$$\large v\otimes_o w=\small \begin{bmatrix}-2.3\\1.9\\-0.5\end{bmatrix}\begin{bmatrix}0.7&-0.3&0.1\end{bmatrix}=\begin{bmatrix}-1.61&0.69&-0.23\\1.33&-0.57&0.19\\-0.35&0.15&-0.05\end{bmatrix}$$
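A numpy sketch reproducing this matrix from the rounded coordinates, plus a full-precision sanity check that the coefficients of $v\otimes w$ transform on both indices under the change of basis:

```python
import numpy as np

v_new = np.array([-2.3, 1.9, -0.5])   # rounded coordinates from above
w_new = np.array([0.7, -0.3, 0.1])

outer = np.outer(v_new, w_new)
print(np.round(outer, 2))
# [[-1.61  0.69 -0.23]
#  [ 1.33 -0.57  0.19]
#  [-0.35  0.15 -0.05]]

# Transformation law, checked at full precision:
# (A⁻¹v)(A⁻¹w)ᵀ = A⁻¹ (v wᵀ) (A⁻¹)ᵀ
A = np.array([[3, 4, -1], [0, 3, 7], [1, 3, 0.5]])
A_inv = np.linalg.inv(A)
v, w = np.array([1.0, 2, 3]), np.array([1.0, 0, 0])
assert np.allclose(np.outer(A_inv @ v, A_inv @ w),
                   A_inv @ np.outer(v, w) @ A_inv.T)
```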

This connects this post to this more general question.

Bowei Tang