
I have a question regarding the tensor product of vector spaces. In general it only captures the composite state in terms of the basis states of the corresponding vector spaces $V$ and $W$. In order to get rid of the tensor symbol, we have to consider an isomorphism which maps $V \otimes W$ to some space where a concrete multiplication is carried out.

Consider this example:

$$ \left( \begin{array}{c} a \\ b \\ \end{array} \right) \otimes \left( \begin{array}{c} c \\ d \\ \end{array} \right) = \left( \begin{array}{c} ac \\ ad \\ bc \\ bd \\ \end{array} \right)$$
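In NumPy this stacked coefficient vector is exactly what `np.kron` produces on 1-D arrays (a small check with made-up values):

```python
import numpy as np

a, b, c, d = 2.0, 3.0, 5.0, 7.0

# Coordinate vectors in R^2
v = np.array([a, b])
w = np.array([c, d])

# np.kron on 1-D arrays yields the stacked coefficient vector (ac, ad, bc, bd)
result = np.kron(v, w)
print(result)  # [10. 14. 15. 21.]
```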

So my intuition is that the tensor product is some kind of abstract space which captures the composition of the individual basis states, and computations are carried out once you consider appropriate isomorphisms?

According to this post, the outer product between two vectors stacks the resulting basis coefficients as a matrix rather than as a vector. Which representation is better suited?

Sarah
  • Tensor products are useful when you consider not only vector spaces but modules over general rings. Things like $M \otimes \Bbb Z / p\Bbb Z \simeq M / pM$ are not directly interpretable via basis. – WhatsUp Jun 08 '23 at 13:29
  • Thank you, but is my intuition heading in the right direction? – Sarah Jun 08 '23 at 13:48
  • "So my intuition is that the tensor product is some kind of abstract space which captures the composition of the individual basis states." Yes, but this description isn't unique to tensor products. For example, the same can be said about the direct sum $V \oplus W$ of vector spaces. – Ben Grossmann Jun 08 '23 at 19:57
  • But the isomorphism is then about addition rather than multiplication? In applications the tensor product is often computed as the outer product for vectors and the Kronecker product for matrices. How is this justified? – Sarah Jun 09 '23 at 06:43
  • @BenGrossmann: But the direct sum of vector spaces does scalar multiplication in a different way, so the representation in the above example can never occur. – Sarah Jun 09 '23 at 08:58
  • @Sarah Right. I was just saying that your intuition, as you described it, fails to differentiate between the two concepts – Ben Grossmann Jun 09 '23 at 13:42
  • @Sarah One justification for the Kronecker product is the one that I describe here. I'm not sure if that's what you're asking about – Ben Grossmann Jun 09 '23 at 13:53

1 Answer


I hope I understand the question correctly. You have stated that:

So my intuition is that the tensor product is some kind of abstract space which captures the composition of the individual basis states.

As Ben Grossmann pointed out in the comments, this property is not unique to the tensor product. In fact, the defining property of the tensor product is the universal property, which determines the tensor product up to natural isomorphism. Of course, when we want to do specific calculations we have to choose one of the many concrete realizations of the tensor product. This answers the question:

Computations are carried out, if you consider appropriate isomorphisms?

Now which realization is best suited depends on the context. There is a special realization for the tensor product of finite dimensional vector spaces when one of them is an inner product space:

Let $V,W$ be real vector spaces of dimension $n,m$. Consider the map \begin{align*} J: V \otimes W &\longrightarrow L(V^\prime , W) \\ \sum_{i=1}^{N} v_i \otimes w_i &\longmapsto \big( \varphi \mapsto \sum_{i=1}^N \varphi(v_i) w_i \big) \end{align*} which can be shown to be an isomorphism. Here $V^\prime$ is the dual space of $V$ and $L(V^\prime,W)$ denotes the vector space of linear maps from $V^\prime$ to $W$. If $V$ is an inner product space, then we have a natural isomorphism $ V\to V^\prime$, and we can compose it with $J$ to obtain a natural isomorphism \begin{align*} J_2: V \otimes W &\longrightarrow L(V , W) \\ \sum_{i=1}^{N} v_i \otimes w_i &\longmapsto \big( x \mapsto \sum_{i=1}^N \langle x, v_i \rangle w_i \big), \end{align*} where $\langle \cdot , \cdot \rangle $ is the inner product.

Now suppose $V = \mathbb{R}^n$ and $W = \mathbb{R}^m$ with the natural inner product. Fix $v = \sum_{i=1}^n \alpha_i e_i$ and $w = \sum_{j=1}^m \beta_j e_j$ (where $\alpha_i , \beta_j \in \mathbb{R}$ and the $e_i$ are the natural basis vectors). Then we have $$J_2(v \otimes w)(e_k) = \langle v, e_k \rangle w = \sum_{j=1}^m \alpha_k \beta_j e_j,$$ so the matrix $(M_{jk}) \in \mathbb{R}^{m \times n}$ of $J_2(v\otimes w)$ with respect to the natural basis is given by $M_{jk}= \alpha_k \beta_j$. Applying the tensor to a vector is then just plain old matrix multiplication (perfect for writing computer programs). Note also that $J_2(e_i \otimes e_j)$ is the matrix unit with a single $1$ in entry $(j,i)$.

Of course we could also "flatten" the matrix $(M_{jk})$ to a vector as in your example, but this is less natural, which answers the following question (at least if we want to view the tensor as a linear map):
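To make this concrete, here is a small NumPy sketch (variable names are mine): the matrix of $J_2(v \otimes w)$ is $M = w v^T$, and applying the tensor to a vector $x$ is just the matrix-vector product $Mx = \langle x, v\rangle w$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 4
v = rng.standard_normal(n)   # v in R^n with coefficients alpha_k
w = rng.standard_normal(m)   # w in R^m with coefficients beta_j

# Matrix of J_2(v ⊗ w): M[j, k] = alpha_k * beta_j, i.e. M = w v^T
M = np.outer(w, v)

# Applying the tensor to x is plain matrix multiplication: M x = <x, v> w
x = rng.standard_normal(n)
assert np.allclose(M @ x, np.dot(x, v) * w)
```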

According to this post, the outer product between two vectors stacks the resulting basis coefficients as a matrix rather than as a vector. Which representation is better suited?

I will supply below a case where we do want to flatten the matrix associated to a tensor. This realization of the tensor product is also related to the outer product acting on vectors via $$\otimes_{\mathrm{out}}: W \times V \overset{\otimes}{\to} W \otimes V \overset{tr}{\to} V \otimes W \overset{J_2}{\to} L(V,W) \cong \mathbb{R}^{m \times n},$$ where $tr$ is the transpose (swap) map and "$\cong$" denotes a natural isomorphism. This is exactly the definition of the outer product of two vectors. Note that the (linearization of the) outer product also just sends a tensor $\sum_{i,j}\alpha_{ij} e_i \otimes e_j$ to its coefficient matrix $(\alpha_{ij})$, which is perhaps the most obvious way to associate a matrix to a tensor.
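The two representations are connected by flattening: reading the coefficient matrix of the outer product row by row recovers the stacked vector from the question (ordering conventions vary, but with this ordering the check is):

```python
import numpy as np

v = np.array([1.0, 2.0])
w = np.array([3.0, 4.0])

# Outer product: the coefficient matrix of the tensor v ⊗ w
M = np.outer(v, w)

# Row-major flattening of the coefficient matrix recovers the
# stacked coefficient vector, i.e. the Kronecker product of v and w
assert np.allclose(M.ravel(), np.kron(v, w))
```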

The Kronecker product

One case where we do want to flatten the matrix associated to a tensor is the following case:

Let $V,W, X,Y$ be vector spaces and $T : V \to X$, $S: W\to Y$ linear maps. Then there exists a unique linear map $T \otimes S : V \otimes W \to X \otimes Y$ that satisfies $(T\otimes S) ( v \otimes w) = (Tv) \otimes (Sw)$ for all $v \in V, w \in W$. Note that $T \otimes S$ is just notation here and (for the moment) not directly related to the actual tensor product map.

Now let $V= \mathbb{R}^n, W = \mathbb{R}^q, X = \mathbb{R}^m, Y = \mathbb{R}^p$ and let $(T_{ij})$, $(S_{ij})$ be the matrices of $T,S$ with respect to the natural basis. Our goal is to find a matrix multiplication that is equivalent to applying $T\otimes S$ to a tensor. We know that the tensors of the form $e_i \otimes e_j$ are a basis of both $V \otimes W$ and $X \otimes Y$ (with $i,j$ ranging appropriately and $e_j$ the $j$-th natural basis vector as before). We can compute \begin{equation} \tag{1} T \otimes S (e_i \otimes e_j) = (T e_i ) \otimes (S e_j) = \sum_{k} \sum_{l}T_{ki} S_{lj} e_k \otimes e_l, \end{equation} which uniquely determines a matrix $(T_{ij}) \otimes_{\mathrm{Kron}} (S_{ij}) \in \mathbb{R}^{pm \times qn}$ so that the following diagram commutes, where $T \otimes_{\mathrm{Kron}} S$ denotes multiplication by the matrix $(T_{ij}) \otimes_{\mathrm{Kron}} (S_{ij})$ and $\mathrm{vec}$ is the "flatten" map: $$\require{AMScd} \begin{CD} V \otimes W @>{T \otimes S}>> X \otimes Y\\ @VV \cong V @VV \cong V \\ \mathbb{R}^{q\times n} @. \mathbb{R}^{p\times m}\\ @VV \mathrm{vec} V @VV \mathrm{vec} V \\ \mathbb{R}^{q n} @>{T \otimes_{\mathrm{Kron}} S}>> \mathbb{R}^{p m}\\ \end{CD}$$ It follows from the definition and equation (1) that $(T_{ij}) \otimes_{\mathrm{Kron}} (S_{ij})$ is in fact the Kronecker product of $(T_{ij})$ and $(S_{ij})$.
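The commuting diagram can be checked numerically. Under the identifications above, acting with $T \otimes S$ on the coefficient matrix $M \in \mathbb{R}^{q \times n}$ is $M \mapsto S M T^T$; assuming a column-major "vec" (NumPy's `order="F"`), this agrees with multiplication by the Kronecker product:

```python
import numpy as np

rng = np.random.default_rng(1)
n, q, m, p = 3, 2, 4, 5
T = rng.standard_normal((m, n))  # T : R^n -> R^m
S = rng.standard_normal((p, q))  # S : R^q -> R^p

# A tensor in V ⊗ W, identified with its coefficient matrix in R^{q x n}
M = rng.standard_normal((q, n))

# Acting with T ⊗ S on the coefficient matrix is M ↦ S M T^T; with
# column-major flattening ("vec"), this is multiplication by kron(T, S)
lhs = np.kron(T, S) @ M.flatten(order="F")
rhs = (S @ M @ T.T).flatten(order="F")
assert np.allclose(lhs, rhs)
```

The Kronecker matrix `np.kron(T, S)` indeed lives in $\mathbb{R}^{pm \times qn}$, matching the diagram.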

Edit to respond to comments

It can be shown that for any tensor $t \in V \otimes W$ there exist $N \in \mathbb{N}$ and $v_1, \dots , v_N \in V$ as well as $w_1, \dots , w_N \in W$ so that $t = \sum_{i=1}^{N} v_i \otimes w_i$. This follows from the uniqueness assertion in the universal property and the fact that $\otimes$ is bilinear. Note that here $N$ is some natural number and in general $N \neq \dim V \dim W$. Furthermore the $v_i$ and $w_i$ are in general not a basis of $V$ resp. $W$. Of course given $t \in V \otimes W$ there do not in general exist $v \in V, w \in W$ so that $t = v\otimes w$. So in general the sum over different simple tensors is necessary.
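This can be illustrated numerically (a sketch of mine, not part of the original answer): identifying a tensor with its coefficient matrix as above, the SVD produces a representation $t = \sum_{i=1}^{N} v_i \otimes w_i$ with $N = \operatorname{rank}$ terms, generally far fewer than $\dim V \dim W$, while a coefficient matrix of rank $> 1$ cannot come from a single simple tensor.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 4, 3

# A generic tensor in R^n ⊗ R^m, identified with its coefficient matrix
A = rng.standard_normal((n, m))

# The SVD writes A as a sum of rank(A) <= min(n, m) outer products,
# i.e. t = sum_i v_i ⊗ w_i with v_i = s_i * (i-th left singular vector)
# and w_i = (i-th right singular vector)
U, s, Vt = np.linalg.svd(A, full_matrices=False)
recon = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
assert np.allclose(A, recon)

# A rank-2 coefficient matrix is not a single simple tensor v ⊗ w
# (a simple tensor always has a rank-1 coefficient matrix)
B = np.outer([1.0, 0.0], [1.0, 0.0]) + np.outer([0.0, 1.0], [0.0, 1.0])
assert np.linalg.matrix_rank(B) == 2
```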

Since any tensor can be written like this, it makes sense to define a map on this representation of a tensor (as done for $J$). Note that it is not obvious that a map defined in this way is a well defined function. To show that $J$ is well defined we use the universal property: Define \begin{align} h: V \times W &\longrightarrow L(V^\prime, W )\\ (v,w) &\longmapsto ( \varphi \mapsto \varphi(v) w ). \end{align} Then $h$ is bilinear, so the universal property gives us a unique linear map $\tilde{h}: V \otimes W \to L(V^\prime, W)$ such that $ h = \tilde{h} \circ \otimes$. Then, as you already discovered, $\tilde{h}=J$, which shows that $J$ is well defined (but proves neither injectivity nor surjectivity).

Now to answer the question about the definition of $J_2$: Let $g: V \to V^\prime$ be the natural isomorphism. Then we can define an isomorphism $F$ by \begin{align} F: L(V^\prime , W) &\longrightarrow L(V,W) \\ T &\longmapsto T\circ g \end{align} And then we define $J_2 = F \circ J$.

jd27
  • Thank you very much. I have some questions. The upper summation index $N$ is meant to be $N = n m$. Therefore it considers all combinations of pure tensor products between the respective basis vectors? – Sarah Jun 10 '23 at 12:09
  • Why do you sum these tensors in $J$ and $J_2$? – Sarah Jun 10 '23 at 12:11
  • How do you compose $J$ and $h: V \rightarrow V'$ to get $J_2$? Do you consider something like this: $h^{-1} \circ J$? – Sarah Jun 10 '23 at 12:47
  • "As Ben Grossmann pointed out in the comments, this property is not unique to the tensor product. [...] Which answers the question:" So the isomorphism is $J$, which corresponds to $\widetilde{h}$ in the link? – Sarah Jun 10 '23 at 12:58
  • 1
    @Sarah i edited my answer to respond to these questions – jd27 Jun 10 '23 at 13:40
  • So to get it right. The isomorphism which helps to "compute" the tensor product is given by $\widetilde{h}$ and justified by the universal property? – Sarah Jun 10 '23 at 13:53
  • Thank you very much for this long answer. I will need a bit of time to get a full understanding. Regarding the last example about the Kronecker product, it justifies that the matrix obtained by the Kronecker product is the natural linear operator that acts on the tensor product space? – Sarah Jun 10 '23 at 14:13