
Let $V$ be a finite-dimensional vector space over a field $k \supseteq \mathbb Q$ (characteristic $0$). In Helgason's Groups and Geometric Analysis it is mentioned that the symmetric algebra $S(V)$ is linearly generated by the powers $v^m$ for $v \in V$ and $m \in \mathbb N$.

In the case $\dim V = 1$ this is trivial.

If $\dim V = 2$ and $(x, y)$ is a basis of $V$, then binomial expansion and the invertibility of a Vandermonde matrix show that the $(x+ty)^m$, for $m+1$ distinct values of $t$, span the homogeneous elements of degree $m$.
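(For concreteness, here is a quick sympy sanity check of this two-variable argument for $m = 3$; it is an illustration, not part of the argument, and the values $t = 0, 1, \ldots, m$ are just one convenient choice of $m+1$ distinct scalars.)

```python
# Sanity check of the dim-2 claim (illustration, not a proof).
# The coefficient of x^(m-j) y^j in (x + t y)^m is binomial(m, j) * t^j,
# so the change-of-basis matrix is a Vandermonde matrix scaled column-wise
# by binomial coefficients; it is invertible when the t's are distinct.
from sympy import Matrix, binomial

m = 3
ts = range(m + 1)  # any m+1 distinct values of t would do
M = Matrix([[binomial(m, j) * t**j for j in range(m + 1)] for t in ts])
assert M.det() != 0  # invertible, so the (x + t*y)^m span the degree-m part
```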

How does one prove this for $\dim V > 2$?

There may be a generalization of a Vandermonde matrix that I don't know about.

Note: Helgason states this for $k = \mathbb R$, but I suspect it is true more generally. By extension of scalars it suffices to do the case $k = \mathbb Q$.

Bart Michels

3 Answers


Let $n$ be a nonnegative integer. Let $\left[n\right]$ denote the set $\left\{1,2,\ldots,n\right\}$.

A well-known identity (sometimes called the "polarization identity", but that name has many claimants) says the following:

Theorem 1. Let $v_1, v_2, \ldots, v_n$ be $n$ elements of a commutative ring $\mathbb{L}$. Then, \begin{equation} \sum_{I \subseteq \left[n\right]} \left(-1\right)^{n-\left|I\right|} \left(\sum_{i\in I} v_i\right)^n = n! \cdot v_1 v_2 \cdots v_n . \end{equation}
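For example, for $n = 2$ the addend for $I = \emptyset$ vanishes, and the identity becomes the classical polarization of a quadratic form: \begin{equation} \left(v_1 + v_2\right)^2 - v_1^2 - v_2^2 = 2 \cdot v_1 v_2 . \end{equation}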

Theorem 1 is, e.g., Exercise 6.50 (d) in my Notes on the combinatorial fundamentals of algebra, version of 10 January 2019, where I also show a noncommutative version (Exercise 6.50 (c)). For the same proof in a slightly quicker writeup, see the solution to Exercise 6 (c) in UMN Fall 2018 Math 5705 homework #3 solutions (which shows the noncommutative version, from which Theorem 1 easily follows).

The main idea of the proof is to expand the sum on the left hand side of the claim of Theorem 1, and check how often and with which coefficients a given monomial $v_1^{i_1} v_2^{i_2} \cdots v_n^{i_n}$ occurs in the expansion. If at least one of the numbers $i_1, i_2, \ldots, i_n$ is $0$ (say $i_k = 0$), then this monomial occurs an even number of times, and the occurrences cancel each other out: every occurrence in the addend for some subset $I$ can be paired with an occurrence of opposite sign in the addend for the subset $I \cup \left\{k\right\}$ (if $k \notin I$) or the subset $I \setminus \left\{k\right\}$ (if $k \in I$). So the only surviving monomials are those in which all $n$ numbers $i_1, i_2, \ldots, i_n$ are positive. But the only such monomial is $v_1 v_2 \cdots v_n$ (because degree considerations force $i_1 + i_2 + \cdots + i_n = n$), and it is easy to see that it appears only once (namely, in the addend for $I = \left[n\right]$) with coefficient $n!$. Thus Theorem 1 follows. I'm quite sure that Theorem 1 has been known since the 19th century. $\blacksquare$
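If you want to convince yourself of Theorem 1 experimentally, here is a small sympy check for $n = 3$ in a polynomial ring (an illustration only, not part of the proof):

```python
# Experimental check of Theorem 1 for n = 3 (illustration, not a proof).
from itertools import combinations
from sympy import symbols, expand, factorial

n = 3
vs = symbols(f'v1:{n + 1}')  # the symbols v1, ..., vn

# Left-hand side: sum over all subsets I of [n] with sign (-1)^(n - |I|).
lhs = sum((-1) ** (n - k) * sum(vs[i] for i in I) ** n
          for k in range(n + 1)
          for I in combinations(range(n), k))

# Right-hand side: n! * v1 v2 ... vn.
rhs = factorial(n)
for v in vs:
    rhs = rhs * v

assert expand(lhs - rhs) == 0
```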

Corollary 2. Let $\mathbb{K}$ be any commutative ring. Let $V$ be a $\mathbb{K}$-module. Let $v_1, v_2, \ldots, v_n$ be $n$ vectors in $V$. Then, \begin{equation} \sum_{I \subseteq \left[n\right]} \left(-1\right)^{n-\left|I\right|} \left(\sum_{i\in I} v_i\right)^n = n! \cdot v_1 v_2 \cdots v_n \end{equation} in the symmetric power $\operatorname{Sym}^n V$.

Proof of Corollary 2. The symmetric algebra $\operatorname{Sym} V$ of $V$ is a commutative ring, and the vectors $v_1, v_2, \ldots, v_n$ belong to this ring (if we identify $V$ with $\operatorname{Sym}^1 V \subseteq \operatorname{Sym} V$ in the obvious way). Hence, Theorem 1 (applied to $\mathbb{L} = \operatorname{Sym} V$) yields \begin{equation} \sum_{I \subseteq \left[n\right]} \left(-1\right)^{n-\left|I\right|} \left(\sum_{i\in I} v_i\right)^n = n! \cdot v_1 v_2 \cdots v_n . \end{equation} This is an equality in $\operatorname{Sym} V$, and thus an equality in $\operatorname{Sym}^n V$ (since both its sides belong to $\operatorname{Sym}^n V$). This proves Corollary 2. $\blacksquare$

Now, we get a generalization of your claim:

Corollary 3. Let $\mathbb{K}$ be any commutative ring in which $n!$ is invertible. Let $V$ be a $\mathbb{K}$-module. Then, the $\mathbb{K}$-module $\operatorname{Sym}^n V$ is spanned by the $n$-th powers of elements of $V$.

Proof of Corollary 3. We know that the $\mathbb{K}$-module $\operatorname{Sym}^n V$ is spanned by products of the form $v_1 v_2 \cdots v_n$ with $v_1, v_2, \ldots, v_n \in V$. Hence, it suffices to prove that every such product can be written as a $\mathbb{K}$-linear combination of $n$-th powers of elements of $V$. So let $v_1, v_2, \ldots, v_n \in V$ be arbitrary. Then, Corollary 2 yields \begin{equation} \sum_{I \subseteq \left[n\right]} \left(-1\right)^{n-\left|I\right|} \left(\sum_{i\in I} v_i\right)^n = n! \cdot v_1 v_2 \cdots v_n . \end{equation} Solving this equation for $v_1 v_2 \cdots v_n$, we find \begin{equation} v_1 v_2 \cdots v_n = \dfrac{1}{n!} \sum_{I \subseteq \left[n\right]} \left(-1\right)^{n-\left|I\right|} \left(\sum_{i\in I} v_i\right)^n . \end{equation} This is clearly a $\mathbb{K}$-linear combination of $n$-th powers of elements of $V$. Thus, Corollary 3 is proven. $\blacksquare$
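For example, for $n = 3$ (the addend for $I = \emptyset$ again vanishes), this formula reads \begin{equation} v_1 v_2 v_3 = \frac{1}{6} \left( \left(v_1+v_2+v_3\right)^3 - \left(v_1+v_2\right)^3 - \left(v_1+v_3\right)^3 - \left(v_2+v_3\right)^3 + v_1^3 + v_2^3 + v_3^3 \right) . \end{equation}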


Once we have the case $\dim V = 2$, we can proceed by induction on the number of factors in a monomial: by the $\dim V = 2$ case, a product of two powers $x^a y^b$ is a linear combination of "pure" powers $v^m$ (with $m = a+b$). Thus a product of three powers $x^a y^b z^c$ is a power $z^c$ times a linear combination of powers $v^m$, i.e. a linear combination of products of two powers $v^m z^c$, each of which the $\dim V = 2$ case handles again. In the same way, induction reduces a product of $n$ powers to products of $n-1$ powers.
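Concretely, $xy = \frac{1}{2}\left((x+y)^2 - x^2 - y^2\right)$ gives $$xyz = \frac{1}{2}\left((x+y)^2 z - x^2 z - y^2 z\right),$$ and each product $v^2 z$ lives in the (at most) two-dimensional subspace spanned by $v$ and $z$, so the $\dim V = 2$ case turns it into a linear combination of pure cubes.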

This also works for infinite-dimensional $V$ (the induction is not on the dimension).

Bart Michels

Another proof of this fact for a finite-dimensional vector space over $\mathbb K = \mathbb R$ can be found in Stein's Harmonic analysis: real-variable methods, orthogonality, and oscillatory integrals, §VIII: 2.2.1 at the bottom of page 343. He considers a positive definite inner product on homogeneous polynomials in $n$ variables: $$\langle P, Q \rangle = [Q(\partial/\partial x_1, \ldots, \partial/\partial x_n)](P)$$ (Note $\langle P, P \rangle = \sum_\alpha \alpha! a_\alpha^2$ when $P = \sum a_\alpha x^\alpha$.) If $P$ were orthogonal to all $n$-th powers, then $$(\xi \cdot \nabla)^n(P) = 0$$ for all $\xi \in \mathbb R^n$. That is, $$ \left( \frac{d}{dt} \right)^n P(t \xi) = 0$$ for all $\xi \in \mathbb R^n$. That is, the restriction of $P$ to every line through the origin is of degree less than $n$. But that restriction is homogeneous of degree $n$, so $P$ is zero on every line, i.e. $P = 0$. Since the inner product is positive definite, it follows that the $n$-th powers span the homogeneous polynomials of degree $n$.
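As a quick illustration of this pairing, one can check the formula $\langle P, P \rangle = \sum_\alpha \alpha! \, a_\alpha^2$ in sympy. (The helper `pairing` below is a hypothetical implementation written for this illustration, not code from Stein's book.)

```python
# A sketch of the pairing <P, Q> = [Q(d/dx1, ..., d/dxn)](P) in sympy.
# (Hypothetical helper for illustration; not code from Stein's book.)
from sympy import symbols, diff, expand, Poly

x1, x2 = symbols('x1 x2')

def pairing(P, Q, xs):
    # Replace each x_i in Q by d/dx_i and apply the resulting operator to P.
    result = 0
    for monom, coeff in Poly(Q, *xs).terms():
        term = P
        for v, e in zip(xs, monom):
            term = diff(term, v, e)
        result += coeff * term
    return expand(result)

# Check <P, P> = sum_alpha alpha! * a_alpha^2 for P = 3*x1^2 + 5*x1*x2:
P = 3*x1**2 + 5*x1*x2
assert pairing(P, P, (x1, x2)) == 2*3**2 + 5**2  # 2!*0!*9 + 1!*1!*25 = 43
```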

In retrospect, this proof does not use analysis and also works for $\mathbb K = \mathbb Q$.

Bart Michels