I want to prove that
any polynomial in $n$ variables over $\mathbb{R}$ is a linear combination of powers of linear polynomials in the $x_i$'s.
I've tried to do this by induction on $n$.
Also note that it is enough to show that every monomial of a given degree (say $k$) can be written in the above-mentioned way.
Base case: $n=2$.
For degree $k$, consider the $k+1$ polynomials $(x_1+ix_2)^k$, where $1\leq i\leq k+1$. Since the Vandermonde determinant of their coefficient matrix is non-zero, every monomial $x_1^{k-m}x_2^m$, where $0\leq m\leq k$, can be written in the prescribed way. Hence we are done in this case.
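The base case can be checked mechanically: expanding $(x_1+ix_2)^k$ in the monomial basis gives a Vandermonde-type linear system, which we can solve exactly over $\mathbb{Q}$. A minimal sketch (the function name `power_of_linear_coeffs` is my own, not from the question):

```python
from fractions import Fraction
from math import comb

def power_of_linear_coeffs(k, target_m):
    """Return rationals c_1, ..., c_{k+1} such that
       sum_j c_j * (x + j*y)^k == x^(k - target_m) * y^target_m,
    by solving the Vandermonde-type system from the base case."""
    n = k + 1
    # Expand (x + j*y)^k = sum_m C(k, m) * j^m * x^(k-m) * y^m.
    # Row m of the system matches the coefficient of x^(k-m) y^m.
    A = [[Fraction(comb(k, m)) * j ** m for j in range(1, n + 1)]
         for m in range(n)]
    b = [Fraction(1 if m == target_m else 0) for m in range(n)]
    # Gauss-Jordan elimination with exact rational arithmetic.
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        inv = A[col][col]
        A[col] = [a / inv for a in A[col]]
        b[col] /= inv
        for r in range(n):
            if r != col and A[r][col] != 0:
                f = A[r][col]
                A[r] = [a - f * p for a, p in zip(A[r], A[col])]
                b[r] -= f * b[col]
    return b

# Sanity check: recover x*y from squares of linear forms (k=2, m=1),
# i.e. x*y = -5/4*(x+y)^2 + 2*(x+2y)^2 - 3/4*(x+3y)^2.
c = power_of_linear_coeffs(2, 1)
x, y = Fraction(3), Fraction(5)
assert sum(cj * (x + j * y) ** 2 for j, cj in enumerate(c, start=1)) == x * y
```

Because the Vandermonde determinant on the nodes $1,\dots,k+1$ is non-zero, the system is always uniquely solvable, which is exactly the base-case argument.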
Now, assuming the statement for $n=l$, I have to prove that it holds for $n=l+1$. I'm stuck here. Give me a hint.
P.S. PLEASE DO NOT POST A FULL SOLUTION. A HINT WILL SERVE THE PURPOSE!
EDIT: As people are not getting the question, here is an example.
Take $n=2$;
then $xy=\frac{1}{2}\left((x+y)^2-x^2-y^2\right)$. A similar identity for $xy^2$ was mentioned in the comments. So the question is how to write a monomial in the form $x_1^{k_1}x_2^{k_2}\dots x_n^{k_n}=\sum_j \left(\sum_i a_{i(j)}x_{i(j)}\right)^{l_j}$.
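For more variables there are well-known identities in the same spirit; for instance, the classical polarization identity $xyz=\frac{1}{24}\sum_{\varepsilon_1,\varepsilon_2\in\{\pm1\}}\varepsilon_1\varepsilon_2(x+\varepsilon_1 y+\varepsilon_2 z)^3$ expresses $xyz$ through cubes of linear forms. A minimal numeric check (the helper name `polarization_xyz` is my own, and this identity is an illustration, not the asked-for induction step):

```python
from itertools import product
from fractions import Fraction

def polarization_xyz(x, y, z):
    # Classical polarization identity:
    # xyz = (1/24) * sum over signs e1, e2 in {+1, -1} of
    #       e1 * e2 * (x + e1*y + e2*z)^3.
    total = Fraction(0)
    for e1, e2 in product((1, -1), repeat=2):
        total += e1 * e2 * Fraction(x + e1 * y + e2 * z) ** 3
    return total / 24

# Spot-check at a few points: the value agrees with x*y*z.
assert polarization_xyz(2, 3, 5) == 2 * 3 * 5
```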
You can take the coefficient vectors as vectors in the space $\mathbb{R}^{k+1}$; then $x_1^{k-m}x_2^m$ corresponds to the vector of $0$'s and $1$'s with a $1$ in the $(k-m+1)$-th position and $0$ elsewhere, and then we use linear independence.
– user300 Mar 23 '17 at 07:09
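The linear-independence claim in the comment can also be verified directly: the coefficient vectors of $(x+iy)^k$, $1\le i\le k+1$, form a $(k+1)\times(k+1)$ matrix whose determinant is $\prod_{m=0}^{k}\binom{k}{m}$ times the Vandermonde determinant on the nodes $1,\dots,k+1$, hence non-zero. A minimal sketch (the names `coeff_matrix` and `det` are my own):

```python
from fractions import Fraction
from math import comb

def coeff_matrix(k):
    # Row j: coefficient vector of (x + (j+1)*y)^k in the basis
    # x^k, x^(k-1)*y, ..., y^k, viewed as a vector in R^(k+1).
    return [[Fraction(comb(k, m)) * (j + 1) ** m for m in range(k + 1)]
            for j in range(k + 1)]

def det(M):
    # Determinant by exact Gaussian elimination over the rationals.
    M = [row[:] for row in M]
    n, d = len(M), Fraction(1)
    for c in range(n):
        p = next((r for r in range(c, n) if M[r][c] != 0), None)
        if p is None:
            return Fraction(0)
        if p != c:
            M[c], M[p] = M[p], M[c]
            d = -d
        d *= M[c][c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return d

# For k = 2 the matrix is [[1,2,1],[1,4,4],[1,6,9]] with determinant
# (1*2*1) * Vandermonde(1,2,3) = 2 * 2 = 4, so the rows are independent.
assert det(coeff_matrix(2)) == 4
```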