We say a function is $k$-linear if it takes $k$ vectors as input and is linear with respect to each of them. For example, the determinant of an $n \times n$ matrix is an $n$-linear function (of its columns).
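To make this concrete, here is the $2\times 2$ case worked out for the first column (the second column is analogous):
$$\det\begin{pmatrix} a u_1 + b w_1 & v_1 \\ a u_2 + b w_2 & v_2 \end{pmatrix} = a\det\begin{pmatrix} u_1 & v_1 \\ u_2 & v_2 \end{pmatrix} + b\det\begin{pmatrix} w_1 & v_1 \\ w_2 & v_2 \end{pmatrix},$$
so $\det$ is linear in each column separately (though of course not linear as a function of the whole matrix).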
A tensor is a function $T:V \times V\times V\times \dots\times V\to \mathbb R$ (taking $k$ vectors as input) such that $T$ is $k$-linear, i.e. linear with respect to each of its $k$ inputs. (Here $V$ is a vector space.)
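As I understand it, a typical example of a $2$-tensor on $V=\mathbb R^n$ is a bilinear form built from a matrix $A$:
$$T(v,w) = v^{\top} A w = \sum_{i,j=1}^{n} A_{ij}\, v_i w_j,$$
which is linear in $v$ for fixed $w$ and linear in $w$ for fixed $v$.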
A symmetric tensor is a tensor that is invariant under any permutation of its vector arguments, meaning:
$T(v_1,v_2,\dots,v_r)=T(v_{\sigma(1)},v_{\sigma(2)},\dots,v_{\sigma(r)})$ for every permutation $\sigma$ of the symbols $\{1,2,\dots,r\}$.
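For instance, the standard dot product on $\mathbb R^n$,
$$T(v,w) = \langle v, w\rangle = \sum_{i=1}^n v_i w_i,$$
is a symmetric $2$-tensor, since $T(v,w)=T(w,v)$ for all $v,w$.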
We write $Sym^k(V)$ for the vector space of all symmetric $k$-tensors on the vector space $V$.
If $T$ is an $m$-tensor and $S$ is an $n$-tensor, then $T \otimes S$ is the $(m+n)$-tensor defined by $(T \otimes S)(v_1,\dots,v_m,v_{m+1},\dots,v_{m+n})=T(v_1,\dots,v_m)\,S(v_{m+1},\dots,v_{m+n})$ for all $(v_1,\dots,v_m,v_{m+1},\dots,v_{m+n})$.
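To check my understanding of this definition: if $e^1,e^2$ denote the coordinate functions on $\mathbb R^2$ (so $e^i(v)=v_i$), then
$$(e^1\otimes e^2)(v,w) = e^1(v)\,e^2(w) = v_1 w_2,$$
which is a $2$-tensor but not a symmetric one, because $(e^1\otimes e^2)(w,v) = w_1 v_2 \ne v_1 w_2$ in general.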
Now we want to find a basis for the vector space $Sym^k(V)$.
I know that the basis should consist of something related to sums of tensor products of elements of the basis of the dual space of $V$ (denoted $V^*$), but I can't see how. The complete question is written below:
Let $V$ be an $n$-dimensional vector space. Compute the dimension of $Sym^k(V)$. (It can also be found here, page 33.)
I know that every $k$-tensor can be written as a linear combination of the products $e^{i_1}\otimes e^{i_2}\otimes\dots\otimes e^{i_k}$ with $1 \le i_1,\dots,i_k\le n$, where $(e^1,\dots,e^n)$ is the basis of $V^*$ dual to a basis $(e_1,\dots,e_n)$ of $V$. But I don't know which members should be eliminated (or combined) to form a basis for just the symmetric $k$-tensors, not all $k$-tensors.
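The expansion I am using here is, if I have the conventions right,
$$T = \sum_{1\le i_1,\dots,i_k\le n} T(e_{i_1},\dots,e_{i_k})\; e^{i_1}\otimes e^{i_2}\otimes\dots\otimes e^{i_k},$$
so the coefficients are just the values of $T$ on tuples of basis vectors.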
Note: For example, I know that if $k=3$, a member of the basis of $Sym^3(V)$ is $e^1\otimes e^2 \otimes e^1+e^2\otimes e^1 \otimes e^1+e^1\otimes e^1 \otimes e^2$, but I don't know why! I want an explanation. I do have the answer: the dimension of $Sym^k(V)$ is $\binom{n+k-1}{k}$. My problem is that I don't know why we take this sum and why the coefficients of those terms in the sum should all be the same.
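What I can do is check a small case: for $n=2$ and $k=2$, the symmetric $2$-tensors
$$e^1\otimes e^1,\qquad e^2\otimes e^2,\qquad e^1\otimes e^2+e^2\otimes e^1$$
appear to form a basis of $Sym^2(V)$, and indeed $\binom{2+2-1}{2}=3$, so the formula checks out here; I just don't see how to organize the argument for general $n$ and $k$.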