
XylyXylyX has a great series of videos on manifolds and tensors. I would like to confirm a couple of points that are probably implied in them (or misunderstood by me). My question refers to this point in his definition of the tensor product:


[Slide from the video: the tensor product $e^1 \otimes e^2$ and its action on pairs of basis vectors, $(e^i \otimes e^j)(e_\mu, e_\nu) = \delta^i_{\;\mu}\,\delta^j_{\;\nu}$]

... in the lower left-hand corner of the slide, he is illustrating the product tensor $e^1 \otimes e^2$ operating on a pair of vectors $A^\mu\,e_\mu, B^\nu\,e_\nu \in V,$ resulting in the real number $[e^1 \otimes e^2](A^\mu e_\mu, B^\nu e_\nu)=A^1 B^2.$

I would like to ask you to confirm two things:

  1. Even though $A^1 B^2$ is a real number, if we were to carry out the complete tensor product $[*]$, we'd have to express it as

$$A^0 B^0 e_0 \otimes e_0 + A^0 B^1 e_0 \otimes e_1 + \cdots + A^3 B^3 e_3 \otimes e_3.$$

In other words, it wouldn't be just a single real number, but rather more like a matrix.

This seems consistent with the result in the Wikipedia example of the tensor product of $v = \begin{bmatrix}1& 2& 3 \end{bmatrix}$ and $w = \begin{bmatrix}1 & 0 & 0 \end{bmatrix}:$

$$v\otimes w=\hat x \otimes \hat x + 2 \hat y\otimes \hat x + 3 \hat z \otimes\hat x $$

although there are no apparent covectors in this Wikipedia example, possibly explaining the difference.
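This expansion is easy to verify component by component; here is a small Python sketch (vectors hard-coded from the Wikipedia example, mirroring the R code used later in this thread):

```python
# Outer-product components of v = [1, 2, 3] and w = [1, 0, 0]:
# the coefficient of e_i (x) e_j in v (x) w is simply v[i] * w[j].
v = [1, 2, 3]
w = [1, 0, 0]

components = {(i, j): v[i] * w[j] for i in range(3) for j in range(3)}

# Only the first column survives, reproducing
# v (x) w = 1 x(x)x + 2 y(x)x + 3 z(x)x.
nonzero = {k: c for k, c in components.items() if c != 0}
print(nonzero)  # {(0, 0): 1, (1, 0): 2, (2, 0): 3}
```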


$[*]$ NOTE that this may be the main source of my misunderstanding: In the example on the slide posted, he picks out $e^1\otimes e^2$, but I don't know if one has to continue operating on the $15$ additional $e^i\otimes e^j$ pairs.


  2. When the covectors (linear functionals) are not just the basis of $V^*,$ and they have coefficients, the tensor product $\beta \otimes \gamma$ operating on these same two vectors would look like

$$\beta_0 \gamma_0 A^0 B^0 e_0 \otimes e_0 + \beta_1 \gamma_0 A^1 B^0 e_1 \otimes e_0 + \cdots + \beta_3 \gamma_3 A^3 B^3 e_3 \otimes e_3$$

with $\beta_i, \gamma_i$ corresponding to the coefficients of the covectors $\beta, \gamma \in V^*.$

2 Answers


ATTEMPT TO ORGANIZE THIS MATERIAL (now validated by Prof. Shifrin, at least in part; the errors are mine):


In the Wikipedia example of the tensor product of vector spaces, included in my OP, as well as in my previous post here, the tensor product is of the form $V\otimes V,$ a $(0,2)$ tensor, and results in a form akin to $(1)$ in the OP:

$$A^0 B^0 e_0 \otimes e_0 + A^0 B^1 e_0 \otimes e_1 + \cdots + A^3 B^3 e_3 \otimes e_3$$

equivalent to an outer product, as illustrated in this post:

The tensor product of two vectors $v\in V$ and $w \in W$, i.e. an element of $V\otimes W$, is akin to calculating the outer product of two vectors:

$$\large v\otimes_o w=\small \begin{bmatrix}-2.3\;e_1\\+1.9\;e_2\\-0.5\;e_3\end{bmatrix}\begin{bmatrix}0.7\;e_1&-0.3\;e_2&0.1\;e_3\end{bmatrix}= \begin{bmatrix}-1.61\;e_1\otimes e_1&+0.69\;e_1\otimes e_2&-0.23\;e_1\otimes e_3\\+1.33\;e_2 \otimes e_1&-0.57\;e_2 \otimes e_2&+0.19\;e_2 \otimes e_3\\-0.35\;e_3 \otimes e_1&+0.15\;e_3 \otimes e_2&-0.05\;e_3 \otimes e_3\end{bmatrix}$$
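Those nine coefficients can be reproduced with a few lines of Python (a sketch; the decimals match the matrix above):

```python
# Reconstruct the 3x3 coefficient matrix of v (x)_o w from its factors:
# entry (i, j) is v[i] * w[j], rounded to two decimals for display.
v = [-2.3, 1.9, -0.5]
w = [0.7, -0.3, 0.1]

outer = [[round(vi * wj, 2) for wj in w] for vi in v]
for row in outer:
    print(row)
# [-1.61, 0.69, -0.23]
# [1.33, -0.57, 0.19]
# [-0.35, 0.15, -0.05]
```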


This is equivalent to the tensor product space $V^*\otimes V^*$ (the space of all $(2,0)$ tensors) on the slide in the OP. The presenter is tensor-multiplying two covectors in the basis of $V^*$, without coefficients, yielding the $16$ pairs of basis covectors of $V^*\otimes V^*$: $$\{e^0\otimes e^0, \; e^0\otimes e^1, \; e^0\otimes e^2, \;\cdots,\; e^3\otimes e^3\}.$$


The key is to distinguish these forms of the tensor product of vector spaces from their application to vectors (or covectors), i.e. from carrying out the duality pairings $$\langle\beta_\mu\,e^\mu\;,\;A^\nu\,e_\nu \rangle\;=\beta_\mu\,A^\nu\,\langle e^\mu\;,\;e_\nu\rangle \;=\beta_\mu\,A^\nu\,\delta^\mu_{\;\nu}\;=\beta_\mu\,A^\mu\;\in \mathbb R,$$ which yield a real number - the operation explained in the video.
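In coordinates this pairing is just the sum of products of the covector coefficients with the vector components. A minimal Python sketch (the numeric values of $\beta$ and $A$ are made up for illustration):

```python
# Pairing <beta, A> = beta_mu A^mu: covector coefficients against
# vector components, summed over the shared index mu.
beta = [2.0, -1.0, 0.5, 3.0]   # coefficients beta_mu of beta = beta_mu e^mu
A    = [1.0,  4.0, 2.0, 0.0]   # components  A^mu   of A    = A^mu  e_mu

pairing = sum(b * a for b, a in zip(beta, A))
print(pairing)  # -1.0
```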

These linear mappings $\beta\otimes\gamma:V\times V \to \mathbb R$, properly interpreted as $[\beta\otimes\gamma](v,w)=\langle \beta,v\rangle\langle\gamma,w\rangle$ (i.e. the tensor $\beta\otimes\gamma$ acting on two vectors, $v$ and $w$), correct part $(2)$ of the OP (after Professor Shifrin's answer) as:

$\begin{align} &(\beta\otimes\gamma)\left(\sum A^\mu e_\mu,\sum B^\nu e_\nu\right)= \\[2ex] &=\left [ \beta_0\gamma_0\;e^0\otimes e^0+ \; \beta_0\gamma_1\;e^0\otimes e^1+ \;\beta_0\gamma_2\; e^0\otimes e^2+\cdots+ \;\beta_3\gamma_3\; e^3\otimes e^3 \right]\,\small{\left(\sum A^\mu e_\mu,\sum B^\nu e_\nu\right) } \\[2ex] &= \beta_0\gamma_0 A^\mu B^\nu \langle e^0,e_\mu \rangle \; \langle e^0,e_\nu \rangle \; + \; \beta_0\gamma_1 A^\mu B^\nu \langle e^0,e_\mu \rangle \; \langle e^1,e_\nu \rangle +\cdots +\beta_3\gamma_3 A^\mu B^\nu \langle e^3,e_\mu \rangle \; \langle e^3,e_\nu \rangle \\[2ex] &=\beta_0\gamma_0 A^\mu B^\nu \; \delta^0_{\;\mu}\; \delta^0_{\;\nu} \; + \; \beta_0\gamma_1 A^\mu B^\nu \; \delta^0_{\;\mu}\; \delta^1_{\;\nu} +\cdots +\beta_3\gamma_3 A^\mu B^\nu \; \delta^3_{\;\mu}\; \delta^3_{\;\nu} \\[2ex]&= \sum \beta_\mu\gamma_\nu A^\mu B^\nu \end{align}$

indeed a real number, exemplifying the mapping $V\times V \to \mathbb R.$ The tensor is defined as

$$\begin{align}\beta\otimes \gamma&:= \beta_0\gamma_0\, e^0\otimes e^0+\beta_0\gamma_1\, e^0\otimes e^1 + \beta_0\gamma_2\, e^0\otimes e^2+\cdots+\beta_3\gamma_3\, e^3\otimes e^3\\[2ex] &=T_{00}\, e^0\otimes e^0+T_{01}\, e^0\otimes e^1 + T_{02}\, e^0\otimes e^2+\cdots+T_{33}\, e^3\otimes e^3\\[2ex] &= T_{\mu\nu}\,e^\mu\otimes\,e^\nu \end{align}$$


As an example, I believe we could illustrate this as follows:

$\beta \in V^*$ is $\beta=\color{blue}{\begin{bmatrix}\sqrt{\pi} & \sqrt[3]{\pi} &\sqrt[5]{\pi} \end{bmatrix}}$ and $\gamma\in V^*$ is $\gamma=\color{red}{\begin{bmatrix}\frac{1}{3} &\frac{1}{5} &\frac{1}{7} \end{bmatrix}}$. The $(2,0)$-tensor $\beta\otimes \gamma$ is the outer product:

$$\begin{align}\beta\otimes_o \gamma&= \begin{bmatrix}\color{blue}{\sqrt\pi}\times \color{red}{\frac{1}{3}}\quad e^1\otimes e^1 &\color{blue}{\sqrt\pi}\times\color{red}{\frac{1}{5}}\quad e^1\otimes e^2 &\color{blue}{\sqrt\pi}\times\color{red}{\frac{1}{7}}\quad e^1\otimes e^3\\ \color{blue}{\sqrt[3]{\pi}}\times\color{red}{\frac{1}{3}}\quad e^2\otimes e^1 &\color{blue}{\sqrt[3]{\pi}}\times\color{red}{\frac{1}{5}}\quad e^2\otimes e^2 &\color{blue}{\sqrt[3]{\pi}}\times\color{red}{\frac{1}{7}}\quad e^2\otimes e^3 \\\color{blue}{\sqrt[5]{\pi}}\times\color{red}{\frac{1}{3}}\quad e^3\otimes e^1 &\color{blue}{\sqrt[5]{\pi}}\times\color{red}{\frac{1}{5}}\quad e^3\otimes e^2 &\color{blue}{\sqrt[5]{\pi}}\times \color{red}{\frac{1}{7}}\quad e^3\otimes e^3\end{bmatrix}\\[2ex] &=\begin{bmatrix}\color{red}{\frac{1}{3}}\color{blue}{\sqrt\pi}\quad e^1\otimes e^1&\color{red}{\frac{1}{5}}\color{blue}{\sqrt\pi}\quad e^1\otimes e^2&\color{red}{\frac{1}{7}}\color{blue}{\sqrt\pi}\quad e^1\otimes e^3\\\color{red}{\frac{1}{3}}\color{blue}{\sqrt[3]{\pi}}\quad e^2\otimes e^1&\color{red}{\frac{1}{5}}\color{blue}{\sqrt[3]{\pi}}\quad e^2\otimes e^2&\color{red}{\frac{1}{7}}\color{blue}{\sqrt[3]{\pi}}\quad e^2\otimes e^3\\\color{red}{\frac{1}{3}}\color{blue}{\sqrt[5]{\pi}}\quad e^3\otimes e^1&\color{red}{\frac{1}{5}}\color{blue}{\sqrt[5]{\pi}}\quad e^3\otimes e^2&\color{red}{\frac{1}{7}} \color{blue}{\sqrt[5]{\pi}}\quad e^3\otimes e^3\end{bmatrix} \end{align}$$

This is not commutative:

The $(2,0)$-tensor $\gamma \otimes \beta$ would instead result in:

$$\begin{align}\gamma\otimes_o \beta&= \begin{bmatrix} \color{red}{\frac{1}{3}} \times \color{blue}{\sqrt\pi}\quad e^1\otimes e^1 &\color{red}{\frac{1}{3}} \times \color{blue}{\sqrt[3]\pi}\quad e^1\otimes e^2 &\color{red}{\frac{1}{3}} \times \color{blue}{\sqrt[5]\pi}\quad e^1\otimes e^3\\ \color{red}{\frac{1}{5}} \times \color{blue}{\sqrt\pi}\quad e^2\otimes e^1 &\color{red}{\frac{1}{5}} \times \color{blue}{\sqrt[3]\pi}\quad e^2\otimes e^2 &\color{red}{\frac{1}{5}} \times \color{blue}{\sqrt[5]\pi}\quad e^2\otimes e^3 \\\color{red}{\frac{1}{7}} \times \color{blue}{\sqrt\pi}\quad e^3\otimes e^1 &\color{red}{\frac{1}{7}} \times \color{blue}{\sqrt[3]\pi}\quad e^3\otimes e^2 &\color{red}{\frac{1}{7}} \times \color{blue}{\sqrt[5]\pi}\quad e^3\otimes e^3\end{bmatrix} \end{align}$$
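Since $\gamma\otimes_o\beta$ simply exchanges the roles of rows and columns, its coefficient matrix is the transpose of that of $\beta\otimes_o\gamma$; a Python sketch confirming this:

```python
import math

beta  = [math.pi ** (1/2), math.pi ** (1/3), math.pi ** (1/5)]
gamma = [1/3, 1/5, 1/7]

def outer(a, b):
    """Coefficient matrix of a (x)_o b: entry (i, j) is a[i] * b[j]."""
    return [[ai * bj for bj in b] for ai in a]

bg = outer(beta, gamma)
gb = outer(gamma, beta)

# gamma (x) beta is the transpose of beta (x) gamma -- the same numbers
# with rows and columns exchanged, so the two tensors are not equal.
transpose_ok = all(bg[i][j] == gb[j][i] for i in range(3) for j in range(3))
print(transpose_ok)  # True
```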

Now if we apply the initial tensor product $\beta\otimes \gamma$ on the vectors

$$v=\color{magenta}{\begin{bmatrix}1\\7\\5\end{bmatrix}}, w = \color{orange}{\begin{bmatrix}2\\0\\3\end{bmatrix}}$$

\begin{align} (\beta \otimes \gamma)[v,w]=&\\[2ex] & \;\color{blue}{\sqrt\pi}\times \color{red}{\frac{1}{3}} \times \color{magenta} 1 \times \color{orange}2 \quad+\quad \color{blue}{\sqrt\pi}\times\color{red}{\frac{1}{5}} \times \color{magenta}1 \times \color{orange} 0 \quad+\quad \color{blue}{\sqrt\pi}\times\,\color{red}{\frac{1}{7}} \times \color{magenta}1 \times \color{orange}3 \\ + &\;\color{blue}{\sqrt[3]{\pi}}\times\color{red}{\frac{1}{3}} \times \color{magenta}{7} \times \color{orange}2 \quad+\quad \color{blue}{\sqrt[3]{\pi}}\times\color{red}{\frac{1}{5}} \times \color{magenta}{7} \times \color{orange}0 \quad+\quad \color{blue}{\sqrt[3]{\pi}}\times\color{red}{\frac{1}{7}} \times \color{magenta}{7} \times \color{orange}3 \\ \;+ &\;\color{blue}{\sqrt[5]{\pi}}\times\color{red}{\frac{1}{3}} \times \color{magenta} 5 \times \color{orange}2 \quad+\quad \color{blue}{\sqrt[5]{\pi}}\times\color{red}{\frac{1}{5}} \times \color{magenta} 5 \times \color{orange}0 \quad+\quad \color{blue}{\sqrt[5]{\pi}}\times \color{red}{\frac{1}{7}} \times \color{magenta}5 \times \color{orange}3 \\[2ex] =&\\ & \color{blue}{\sqrt{\pi}}\;\times\color{magenta} 1 \quad\left(\color{red}{\frac{1}{3}} \times \color{orange}2 \quad+\quad \color{red}{\frac{1}{5}} \times \color{orange} 0 \quad+\quad \color{red}{\frac{1}{7}} \times \color{orange}3\right) \\ + &\,\color{blue}{\sqrt[3]\pi} \times \color{magenta}{7}\quad\left(\color{red}{\frac{1}{3}} \times \color{orange}2 \quad+\quad \color{red}{\frac{1}{5}} \times \color{orange}0 \quad+\quad \color{red}{\frac{1}{7}} \times \color{orange}3\right) \\ \;+ &\,\color{blue}{\sqrt[5]{\pi}}\times \color{magenta} 5\quad\left(\color{red}{\frac{1}{3}} \times \color{orange}2 \quad+\quad \color{red}{\frac{1}{5}} \times \color{orange}0 \quad+\quad \color{red}{\frac{1}{7}} \times \color{orange}3 \right)\\[2ex] =&\\&\small \left(\color{blue}{\sqrt\pi} \times \color{magenta} 1 \quad+\quad \color{blue}{\sqrt[3]\pi} \times \color{magenta}{7} \quad +\quad 
\color{blue}{\sqrt[5]\pi} \times \color{magenta}5 \right) \times \left(\color{red}{\frac{1}{3}} \times \color{orange}2 \quad+\quad \color{red}{\frac{1}{5}} \times \color{orange} 0 \quad +\quad \color{red}{\frac{1}{7}} \times \color{orange} 3 \right)\\[2ex] =&\\[2ex]&\langle \color{blue}\beta,\color{magenta}v \rangle \times \langle \color{red}\gamma,\color{orange}w \rangle\\[2ex] =& 20.05487\end{align}

The elements of the first vector, $v,$ multiply separate rows of the outer product $\beta \otimes_o \gamma,$ while the elements of the second vector $w$ multiply separate columns. Hence, the operation is not commutative.

Here is the idea with R code:

> v = c(1,7,5); w = c(2,0,3); beta=c(pi^(1/2),pi^(1/3),pi^(1/5)); gamma = c(1/3,1/5,1/7)
> sum(((beta %o% gamma) * v) %*% w) 
> # same as sum((beta %*% t(gamma) * v) %*% w)
> # same as (t(beta) %*% v) * (t(gamma) %*% w)
[1] 20.05487
> sum(((beta %o% gamma) * w) %*% v) # not a commutative operation:
[1] 17.90857

Or more simply, $(\vec \beta \cdot \vec v)\,(\vec \gamma \cdot \vec w) = 20.05487$:

$$\begin{align} (\beta \otimes \gamma)[v,w]&=\langle \beta,v \rangle \times \langle \gamma,w \rangle\\[2ex] & =\small \left(\color{blue}{\sqrt\pi} \times \color{magenta} 1 \quad+\quad \color{blue}{\sqrt[3]\pi} \times \color{magenta}{7} \quad +\quad \color{blue}{\sqrt[5]\pi} \times \color{magenta}5 \right) \times \left(\color{red}{\frac{1}{3}} \times \color{orange}2 \quad+\quad \color{red}{\frac{1}{5}} \times \color{orange} 0 \quad +\quad \color{red}{\frac{1}{7}} \times \color{orange} 3 \right) \\[2ex] &=18.31097\times 1.095238\\[2ex] &= 20.05487\end{align}$$

> v = c(1,7,5); w = c(2,0,3); beta=c(pi^(1/2),pi^(1/3),pi^(1/5)); gamma = c(1/3,1/5,1/7)
> beta %*% v * gamma %*% w
         [,1]
[1,] 20.05487

Does it obey bilinearity?

$$(\beta\otimes \gamma)[v,w]\overset{?}=(\beta\otimes \gamma)\Bigg[\left(\frac{1}{5}v\right),\left(5\,w\right)\Bigg] $$

> v_prime = 1/5 * v
> w_prime = 5 * w
> beta %*% v_prime * gamma %*% w_prime
         [,1]
[1,] 20.05487   #Check!

$$(\beta\otimes \gamma)[v, u + w]\overset{?}=(\beta\otimes \gamma)[v,u] + (\beta\otimes \gamma)[v,w] $$

> u = c(-2, 5, 9)    # Introducing a new vector...
> beta %*% v * gamma %*% (u + w)
        [,1]
[1,] 49.7012
> (beta %*% v * gamma %*% u) + (beta %*% v * gamma %*% w)
        [,1]
[1,] 49.7012 #... And check!
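For completeness, the same checks can be written as a short Python sketch (pure standard library, mirroring the R session above):

```python
import math

beta  = [math.pi ** (1/2), math.pi ** (1/3), math.pi ** (1/5)]
gamma = [1/3, 1/5, 1/7]
v, w, u = [1, 7, 5], [2, 0, 3], [-2, 5, 9]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def T(x, y):
    """(beta (x) gamma)[x, y] = <beta, x> <gamma, y>."""
    return dot(beta, x) * dot(gamma, y)

print(round(T(v, w), 5))  # 20.05487
# Scaling one slot down and the other up leaves the value unchanged:
print(math.isclose(T([x / 5 for x in v], [5 * y for y in w]), T(v, w)))  # True
# Additivity in the second slot:
lhs = T(v, [ui + wi for ui, wi in zip(u, w)])
rhs = T(v, u) + T(v, w)
print(math.isclose(lhs, rhs))  # True
```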

I don't know what you mean by "carry out the complete tensor product" in 1. But the output should be a real number, regardless, if you apply a tensor of type $(2,0)$ to a pair of vectors. Similarly, in 2, the output should be a real number when you evaluate $(\beta\otimes\gamma)(\sum A^\mu e_\mu,\sum B^\nu e_\nu) = \sum \beta_\mu\gamma_\nu A^\mu B^\nu$.

(If you like, $(e^i\otimes e^j)(e_\mu,e_\nu) = \delta^i_\mu\delta^j_\nu$, as your slide says.)

EDIT: One thing to keep in mind is this. We use upper and lower indices precisely so that things that make sense will have compensating upper and lower indices. I.e., $\sum A^\mu e_\mu$ is a vector or $(0,1)$ tensor (one upper, one lower), $\sum A^\mu B^\nu e_\mu\otimes e_\nu$ is the $(0,2)$ tensor $v\otimes w$ (one $\mu$ up and one $\mu$ down, same for $\nu$), and $\sum \beta_\mu A^\mu$ is a real number (one $\mu$ up, one $\mu$ down), as is $\sum \beta_\mu\gamma_\nu A^\mu B^\nu$ (note compensating upper and lower indices for both $\mu$ and $\nu$). In 2. you have a surplus of lower indices, so it can't make sense. When we apply a $(2,0)$ tensor to a pair of vectors, we get a $(0,0)$ tensor, or scalar.

Ted Shifrin
  • Thanks for your answer. Please note the result of the example in Wikipedia, i.e. $v\otimes w=\hat x \otimes \hat x + 2 \hat y\otimes \hat x + 3 \hat z \otimes\hat x,$ which would be akin to my expression $(1)$ in the OP. As for $(2),$ can I interpret that it is correct? – Antoni Parellada Sep 19 '17 at 17:17
  • No, things are not correct. What "complete tensor product" do you mean in 1? If $v = \sum A^\mu e_\mu$ and $w=\sum B^\nu e_\nu$ are vectors, then in 1. you've written out the $(0,2)$ tensor $v\otimes w$. Is that what you intended? – Ted Shifrin Sep 19 '17 at 17:20
  • I meant not just applying $e^1 \otimes e^2,$ as in the example, to the pair of vectors, but the entire complement of the $16$ different $e^i \otimes e^j$ permutations listed on the slide. – Antoni Parellada Sep 19 '17 at 17:23
  • No, again, if you apply a $(2,0)$ tensor to a pair of vectors, you will get a real number, not a $(0,2)$ tensor, as the answer. – Ted Shifrin Sep 19 '17 at 17:24
  • I think I got it after your answer. I posted a pseudo-answer collecting my updated thoughts, and if you don't mind taking a look at it, and it reflects the thoughts in your answer accurately, I'll consider that I understood it properly, and move to accept your post. Thanks again. – Antoni Parellada Sep 19 '17 at 20:15
  • Yup, I think it's straightened out now. :) – Ted Shifrin Sep 19 '17 at 21:41
  • I wonder if you want to retouch the final $\sum$ expression in your post, re: the $\LaTeX$ for the $\nu$ for clarity... – Antoni Parellada Sep 19 '17 at 23:33
  • Sure, Antoni. Of course, you could have edited that yourself :) – Ted Shifrin Sep 19 '17 at 23:34