
We say a function is $k$-linear if it takes $k$ vectors as input and is linear with respect to each of them. For example, the determinant of an $n \times n$ matrix, viewed as a function of its $n$ columns, is an $n$-linear function.

A $k$-tensor is a function $T:V \times V\times V\times \dots\times V\to \mathbb R$ ($k$ vectors taken as input, where $V$ is a vector space) that is $k$-linear, i.e. linear with respect to each of its $k$ inputs.

A symmetric tensor is a tensor that is invariant under permutations of its vector arguments, meaning
$T(v_1,v_2,\dots,v_r)=T(v_{\sigma(1)},v_{\sigma(2)},\dots,v_{\sigma(r)})$ for each permutation $\sigma$ of the symbols $\{1,2,\dots,r\}$.
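For instance, the standard inner product $T(u,v) = u \cdot v = \sum_{i=1}^n u_i v_i$ on $\mathbb R^n$ is a symmetric $2$-tensor: it is linear in each argument, and $T(u,v) = T(v,u)$.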

We denote by $Sym^k(V)$ the vector space of all symmetric $k$-tensors on the vector space $V$.

If $T$ is an $m$-tensor and $S$ is an $n$-tensor, then $T \otimes S$ is the $(m+n)$-tensor defined by $$T \otimes S(v_1,\dots,v_m,v_{m+1},\dots,v_{m+n})=T(v_1,\dots,v_m)\,S(v_{m+1},\dots,v_{m+n})$$ for each tuple $(v_1,\dots,v_m,v_{m+1},\dots,v_{m+n})$.
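For example, if $e^1, e^2$ are the coordinate functionals on $\mathbb R^2$ (so $e^i(v) = v_i$), then $$ (e^1 \otimes e^2)(u, v) = e^1(u)\,e^2(v) = u_1 v_2, $$ which is a $2$-tensor that is not symmetric, since $(e^2 \otimes e^1)(u,v) = u_2 v_1 \neq u_1 v_2$ in general.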

Now we want to find a basis for this vector space.

I know that the basis should consist of something related to sums of tensor products of the elements of the basis of the dual space of $V$ (denoted $V^*$), but I can't see how. The complete question is written below:

Let $V$ be an $n$-dimensional vector space. Compute the dimension of $Sym^k(V)$. (It can also be found here, page 33.)

I know that every $k$-tensor can be written as a linear combination of $\{e^{i_1}\otimes e^{i_2}\otimes\dots\otimes e^{i_k}\}$ with $1 \le i_1,\dots,i_k\le n$, but I don't know which members should be eliminated to form a basis for just the symmetric $k$-tensors (not all $k$-tensors).

Note: for example, I know that if $k=3$, a member of the basis of $Sym^3(V)$ is $e^1\otimes e^2 \otimes e^1+e^2\otimes e^1 \otimes e^1+e^1\otimes e^1 \otimes e^2$, but I don't know why, and I want an explanation. I have the answer: the dimension of $Sym^k(V)$ is $\binom{n+k-1}{k}$. My problem is that I don't know why the basis elements are such sums, and why the coefficient should be the same for the elements in each sum.

  • Hint: by symmetry you only need to know a $k$-linear function's behavior on the elementary tensors where $1\leq i_1 \leq i_2 \leq \cdots \leq i_k \leq n$. (By the way, different books have other conventions on what a $k$-tensor is. What you defined as a $k$-tensor lies in the $k$th tensor power of the dual space of $V$; for $k=1$, your 1-tensor is an element of the dual space of $V$, not an element of $V$. Often people want a $k$-tensor to be an element of the $k$th tensor power of $V$ itself.) – KCd Mar 11 '17 at 14:20
  • @KCd Can you please explain how the basis of $Sym^3(V)$ is made? – Arman Malekzadeh Mar 11 '17 at 17:05
  • Use the hint KCd gave you. – anon Mar 11 '17 at 17:54
  • What exactly is $e^i$ supposed to mean? – Ben Grossmann Mar 11 '17 at 18:00
  • @arctictern Excuse me if I'm stupid, but I can't understand what he's saying. – Arman Malekzadeh Mar 11 '17 at 18:13
  • @Omnomnomnom $e^i$ is the linear functional with $e^i(e_j)=\delta^i_j$ – anon Mar 11 '17 at 18:15
  • @Omnomnomnom If $\{e_1,\dots,e_n\}$ is a basis for $V$, then $\{e^i : i=1,\dots,n\}$ is defined by $e^i(e_i)=1$ for each $i$, and $e^i(e_j)=0$ for every other $j$. – Arman Malekzadeh Mar 11 '17 at 18:15
  • @arctictern Can you please explain? It's been 10 hours since I started to think about this question... – Arman Malekzadeh Mar 11 '17 at 18:16

2 Answers


I assume that $e_1,\dots,e_n$ is a basis of $V$ and that $e^1,\dots,e^n$ is the associated dual basis of $V^*$.

First, let's consider the case of arbitrary (not necessarily symmetric) tensors. We note that, by linearity, $$ T(v^{(1)}, \dots, v^{(k)}) = T\left( \sum_{i=1}^n v^{(1)}_i e_i, \dots, \sum_{i=1}^n v^{(k)}_i e_i \right) = T\left( \sum_{i_1=1}^n v^{(1)}_{i_1} e_{i_1}, \dots, \sum_{i_k=1}^n v^{(k)}_{i_k} e_{i_k} \right) = \\ \sum_{i_1=1}^n \cdots \sum_{i_k=1}^n v^{(1)}_{i_1} \cdots v^{(k)}_{i_k} T\left(e_{i_1}, \dots, e_{i_k} \right) $$ Now, define the tensor $\tilde T$ by $$ \tilde T = \sum_{i_1=1}^n \cdots \sum_{i_k=1}^n T\left(e_{i_1}, \dots, e_{i_k} \right) e^{i_1} \otimes \cdots \otimes e^{i_k} $$ Prove that $\tilde T(v^{(1)},\dots,v^{(k)}) = T(v^{(1)},\dots,v^{(k)})$ for any $v^{(1)},\dots,v^{(k)}$; that is, $\tilde T = T$. We've thus shown that any (not necessarily symmetric) $k$-tensor can be written as a linear combination of the tensors $e^{i_1} \otimes \cdots \otimes e^{i_k}$.
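To sketch that verification: since $e^i(e_j) = \delta^i_j$, evaluating $\tilde T$ on a tuple of basis vectors collapses the sum to a single term, $$ \tilde T(e_{j_1}, \dots, e_{j_k}) = \sum_{i_1=1}^n \cdots \sum_{i_k=1}^n T(e_{i_1},\dots,e_{i_k})\, e^{i_1}(e_{j_1}) \cdots e^{i_k}(e_{j_k}) = T(e_{j_1},\dots,e_{j_k}), $$ and two $k$-linear maps that agree on all tuples of basis vectors agree everywhere, by the expansion in the first display.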

The same applies to symmetric tensors. However, if $T$ is symmetric, then $$ T\left(e_{i_1}, \dots, e_{i_k} \right) = T\left(e_{i_{\sigma(1)}}, \dots, e_{i_{\sigma(k)}} \right) $$ for any permutation $\sigma$. Thus, we may regroup the above sum as $$ T = \tilde T = \sum_{i_1=1}^n \cdots \sum_{i_k=1}^n T\left(e_{i_1}, \dots, e_{i_k} \right) e^{i_1} \otimes \cdots \otimes e^{i_k} = \\ \sum_{1 \leq i_1 \leq \cdots \leq i_k \leq n} \; \frac 1{\alpha(i_1,\dots,i_k)}\sum_{\sigma \in S_k} T\left(e_{i_{\sigma(1)}}, \dots, e_{i_{\sigma(k)}} \right) e^{i_{\sigma(1)}} \otimes \cdots \otimes e^{i_{\sigma(k)}} = \\ \sum_{1 \leq i_1 \leq \cdots \leq i_k \leq n} \; \frac 1{\alpha(i_1,\dots,i_k)}\sum_{\sigma \in S_k} T\left(e_{i_1}, \dots, e_{i_k} \right) e^{i_{\sigma(1)}} \otimes \cdots \otimes e^{i_{\sigma(k)}} = \\ \sum_{1 \leq i_1 \leq \cdots \leq i_k \leq n} \frac 1{\alpha(i_1,\dots,i_k)} T\left(e_{i_1}, \dots, e_{i_k} \right) \underbrace{\sum_{\sigma \in S_k} e^{i_{\sigma(1)}} \otimes \cdots \otimes e^{i_{\sigma(k)}}}_{\text{basis element for } Sym^k(V)} $$ Thus, we have expressed $T$ as a linear combination of the desired basis elements.
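To make the regrouping concrete, take $n = k = 2$. The four-term sum over $(i_1,i_2)$ collapses to three groups: $$ T = T(e_1,e_1)\, e^1 \otimes e^1 + T(e_1,e_2)\left(e^1 \otimes e^2 + e^2 \otimes e^1\right) + T(e_2,e_2)\, e^2 \otimes e^2, $$ where the middle group collects the two elementary tensors whose coefficients $T(e_1,e_2) = T(e_2,e_1)$ coincide by symmetry.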


The factor $\alpha(i_1,\dots,i_k)$ counts the number of times each distinct elementary tensor $e^{i_{\sigma(1)}}\otimes \dots\otimes e^{i_{\sigma(k)}}$ is repeated in the summation over $\sigma \in S_k$. As the comment below points out, we have $$ \alpha(i_1,\dots,i_k) = m_1! \cdots m_n! $$ where $m_j$ is the multiplicity of $j \in \{1,\dots,n\}$ in the tuple $(i_1,\dots,i_k)$.
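For example, with $k = 3$ and $(i_1,i_2,i_3) = (1,1,2)$ we get $m_1 = 2$ and $m_2 = 1$, so $\alpha(1,1,2) = 2!\,1! = 2$: the sum over the $3! = 6$ permutations in $S_3$ produces the three distinct tensors $e^1\otimes e^1\otimes e^2$, $e^1\otimes e^2\otimes e^1$, $e^2\otimes e^1\otimes e^1$, each appearing twice. Dividing by $\alpha(1,1,2) = 2$ makes the underbraced sum exactly the three-term basis element from the question.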

Ben Grossmann
  • I'm really sorry... I'm even ashamed... Thank you for your answer... The last part, regrouping: which sum are you regrouping? The sum for $\tilde T$? – Arman Malekzadeh Mar 11 '17 at 18:23
  • Yes, it's the sum for $\tilde T$ (which we showed is also just a sum for $T$). – Ben Grossmann Mar 11 '17 at 18:23
  • Note: $S_k$ denotes the set of all permutations on $k$ elements. – Ben Grossmann Mar 11 '17 at 18:27
  • Omnom, $\sum_{\sigma\in S_k} e^{i_{\sigma(1)}}\otimes\cdots\otimes e^{i_{\sigma(k)}}$ will have each distinct summand occur with multiplicity $m_1!\cdots m_n!$, where $m_j$ is how many $j$s appear in the multiset $\{i_1,\cdots,i_k\}$, so this should be normalized. – anon Mar 11 '17 at 18:36
  • @arctictern Well spotted. – Ben Grossmann Mar 11 '17 at 18:37
  • Why is the second equation true? (the second part of the first equation) You wrote $i_1$ instead of $i$... for example $v_{i_1}^{(1)}$ instead of $v_{i}^{(1)}$. – Arman Malekzadeh Mar 11 '17 at 18:40
  • @ArmanMalekzade because $i$ is simply an index of a sum. For example, it should be clear that $\sum_{i = 1}^{5} 2^i$ means exactly the same thing as $\sum_{i_1 = 1}^5 2^{i_1}$. The motivation for this step is to clarify the next step, where I note the linearity of $T$ over each of its $k$ arguments. – Ben Grossmann Mar 11 '17 at 18:43
  • @GK thanks for the suggestion – Ben Grossmann Sep 09 '18 at 02:39
  • @BenGrossmann Hi! I am having a really hard time proving $\tilde T = T$. Can you please write a proof of that? I shall be really thankful. –  Apr 29 '22 at 07:54
  • @Avenger I'm not sure when I'd have the time for that. I recommend that you post a new question that includes a link to my answer and explains your difficulty. If you leave a link to that question as a comment here, I might be able to give it a try later if it hasn't already been answered. – Ben Grossmann Apr 29 '22 at 11:41
  • @BenGrossmann Thanks for letting me know. I understand. –  Apr 29 '22 at 11:44
  • @Kadmos Thank you for the correction – Ben Grossmann Oct 02 '24 at 16:27

Let's consider $\mathbb{R}^2$ with standard basis $e_1,e_2$.

If $T$ is a symmetric tensor $T:\mathbb{R}^2\times\mathbb{R}^2\times\mathbb{R}^2\to\mathbb{R}$, then we can group the basis vectors $e_{i_1}\otimes e_{i_2}\otimes e_{i_3}$ of the tensor power $(\mathbb{R}^2)^{\otimes 3}$ according to whether or not $T$ must send them to the same value:

$$ \begin{array}{rrrr} e_1\otimes e_1\otimes e_1 \\ \hline e_1\otimes e_1\otimes e_2 & e_1\otimes e_2\otimes e_1 & e_2\otimes e_1\otimes e_1 \\ \hline e_1\otimes e_2\otimes e_2 & e_2\otimes e_1\otimes e_2 & e_2\otimes e_2\otimes e_1 \\ \hline e_2\otimes e_2\otimes e_2 \end{array} $$

In other words, the value $T$ takes on any tensor is determined as long as we know the values $T$ takes on

  1. $e_1\otimes e_1\otimes e_1$,
  2. $e_1\otimes e_1\otimes e_2$,
  3. $e_1\otimes e_2\otimes e_2$,
  4. $e_2\otimes e_2\otimes e_2$.

These are precisely the basis elements $e_{i_1}\otimes e_{i_2}\otimes e_{i_3}$ with $i_1\le i_2\le i_3$.

Conversely, given any four values $a,b,c,d$ we can arrange for $T$ to take these values on the above basis vectors by writing out

$$ \begin{array}{lll} T & = & a(e^1\otimes e^1\otimes e^1) \\ &+ & b(e^1\otimes e^1\otimes e^2+e^1\otimes e^2\otimes e^1+e^2\otimes e^1\otimes e^1) \\ & + & c(e^1\otimes e^2\otimes e^2+e^2\otimes e^1\otimes e^2+e^2\otimes e^2\otimes e^1) \\ & + & d(e^2\otimes e^2\otimes e^2). \end{array} $$

Generalize.
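To count these basis elements in general: they are indexed by the non-decreasing tuples $1 \le i_1 \le i_2 \le \cdots \le i_k \le n$, i.e. by multisets of size $k$ drawn from $\{1,\dots,n\}$, and by stars and bars there are $\binom{n+k-1}{k}$ of these. Here is a minimal Python sketch checking that count against direct enumeration (the helper name `sym_dim` is mine, for illustration):

```python
from itertools import combinations_with_replacement
from math import comb

def sym_dim(n, k):
    """Count the non-decreasing k-tuples with entries in {1, ..., n},
    which index the basis elements of Sym^k(V) when dim V = n."""
    return sum(1 for _ in combinations_with_replacement(range(1, n + 1), k))

# The enumeration matches the closed form C(n + k - 1, k) for small n, k.
for n in range(1, 7):
    for k in range(1, 7):
        assert sym_dim(n, k) == comb(n + k - 1, k)

print(sym_dim(2, 3))  # prints 4: the four groups in the table above
```

For $n = 2$, $k = 3$ this gives $4 = \binom{4}{3}$, matching the four groups in the table above.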

anon