
I've got a seemingly simple question that I've become curious about as a result of supervising some undergraduate research. Let's suppose we have some sequence of polynomials $f_0, f_1, f_2, \ldots \in \mathbb{Z}[X]$, where $f_0=1$. Now, define the following sequence of symmetric polynomials in the variables $x_1, \ldots, x_n$:

$$P_m(x_1, \ldots, x_n)=\sum\limits_{m_1+\cdots+m_n=m}f_{m_1}(x_1) \cdots f_{m_n}(x_n)$$

If you want, you can think of this as the coefficient of $y^m$ in the two-variable generating function $\prod\limits_{i=1}^{n}\left(\sum\limits_{j \ge 0}f_j(x_i)y^j\right)$. Now my question is: if you know that $x_1, \ldots, x_n$ are positive integers, and you know the values of $P_m(x_1, \ldots, x_n)$ for $m=0, 1, \ldots$, what conditions on $f_1, f_2, \ldots$ guarantee that the values of $x_1, \ldots, x_n$ are completely determined (up to ordering)? I think it's key that we work with inputs in $\mathbb{Z}^+$ - if we work over $\mathbb{C}$, the answer may be different (see the edit below).
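As a sanity check that the two descriptions agree, here is a small sympy sketch; the particular family ($f_1(x)=x$, $f_2(x)=x^2+1$, all later $f_j=0$) and the values of $n$ and $m$ are my own illustrative choices, not from the question:

```python
from itertools import product
from math import prod
from sympy import symbols, expand

n, M = 2, 4
xs = symbols('x1:3')  # x1, x2
y = symbols('y')

# Illustrative family (my choice): f_0 = 1, f_1(x) = x, f_2(x) = x**2 + 1,
# and f_j = 0 for j >= 3.
def f(x, j):
    return (1, x, x**2 + 1)[j] if j <= 2 else 0

def P(m):
    """Direct definition: sum over compositions m_1 + ... + m_n = m."""
    total = 0
    for comp in product(range(m + 1), repeat=n):
        if sum(comp) == m:
            term = 1
            for x, mj in zip(xs, comp):
                term *= f(x, mj)
            total += term
    return expand(total)

# Generating-function side: coefficient of y^m in prod_i sum_j f_j(x_i) y^j.
gen = expand(prod(sum(f(x, j) * y**j for j in range(M + 1)) for x in xs))
for m in range(M + 1):
    assert expand(gen.coeff(y, m) - P(m)) == 0
print("P_m matches the y^m coefficient for m = 0..%d" % M)
```

The truncation of the inner sums at degree $M$ is harmless here, since only exponents with $j_1+\cdots+j_n=m\le M$ can contribute to the coefficient of $y^m$.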

For example, if $f_1(x)=x$, and $f_2=f_3=\cdots=0$, then $P_m$ is just the $m$-th elementary symmetric polynomial $\sigma_m$, and then obviously, $x_1, \ldots, x_n$ are determined by $\sigma_1, \sigma_2, \ldots$, being the roots of the monic polynomial whose coefficients are $(-1)^i\sigma_i$. If $f_i(x)=x^i$ for each $i$, then it's a little less straightforward, but still not hard: $P_m=\sum\limits_{j=1}^{m}(-1)^{j-1}\sigma_j P_{m-j}$, so by induction on $m$, the values $P_1, \ldots, P_m$ together determine $\sigma_1, \ldots, \sigma_m$, and hence $x_1, \ldots, x_n$ are again determined. For the general case, we can equivalently ask whether the values of $P_1, P_2, \ldots$ uniquely determine the values of $\sigma_1, \sigma_2, \ldots$ (and this is perhaps a more natural question).
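The recurrence in the case $f_i(x)=x^i$ can be checked symbolically. The following sympy sketch (with $n=3$ and $m \le 5$, my choice of sizes) verifies $P_m=\sum_{j=1}^{m}(-1)^{j-1}\sigma_j P_{m-j}$, using $\sigma_j=0$ for $j>n$:

```python
from itertools import combinations, product
from math import prod
from sympy import symbols, expand

n, M = 3, 5
xs = symbols('x1:4')  # x1, x2, x3

def P(m):
    """P_m for f_i(x) = x**i: sum of x1**m1 * ... * xn**mn over m1+...+mn = m."""
    return expand(sum(prod(x**mi for x, mi in zip(xs, comp))
                      for comp in product(range(m + 1), repeat=n)
                      if sum(comp) == m))

def sigma(j):
    """Elementary symmetric polynomial sigma_j in x1, ..., xn."""
    return sum(prod(c) for c in combinations(xs, j)) if j else 1

for m in range(1, M + 1):
    # sigma_j vanishes for j > n, so the sum stops at min(m, n)
    rhs = expand(sum((-1)**(j - 1) * sigma(j) * P(m - j)
                     for j in range(1, min(m, n) + 1)))
    assert expand(P(m) - rhs) == 0
print("recurrence holds for m = 1..%d" % M)
```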

As long as the polynomials $f_1, f_2, \ldots$ aren't all constant, I don't know, off the top of my head, any sequence of polynomials $f_1, f_2, \ldots$ for which the values of $P_1, P_2, \ldots$ don't determine $x_1, \ldots, x_n$. So I'm wondering whether determination always holds, given that at least one of the $f_i$ is nonconstant.

EDIT: I feel it's worth pointing out that what I'm asking isn't the same thing as asking that the $P_i$ generate the ring of symmetric polynomials. For example, if the values of the $P_i$ were to determine the values of $\sigma_1^2, \sigma_2^2, \ldots$, that would determine the values of $\sigma_1, \sigma_2, \ldots$, even if the $P_i$ don't generate the ring. This is why the condition that the $x_i$'s are positive integers is important. That said, I don't know the answer over $\mathbb{C}$ either.

  • Let $f_1(x)$ be a polynomial that maps two positive integers to the same value (say $f_1(x) = (2x - 3)^2$, which maps both $1$ and $2$ to $1$), and $f_2 = f_3 = \cdots = 0$. Then it seems to me that if $n = 1$, knowing $P_m(x_1) = f_1(x_1) 1_{m = 1}$ wouldn't be able to distinguish $x_1 \in \{1, 2\}$? The same example probably works for $n > 1$ as well. – John Jiang Dec 30 '18 at 23:50

1 Answer


I am not sure whether this will help you, but I would phrase the question in the following terms.

It seems to me that if $f_i(x) = x^i$, then your $P_m$ are the complete homogeneous symmetric polynomials $h_m$. I know that you have a proof in this case, but it is also worth noting the generating function $$H(X,t) = \prod_{x\in X} \frac{1}{1-x t} = \sum_m h_m(X) t^m$$ (equation 2.5 of the book "Symmetric Functions and Hall Polynomials" by I. G. Macdonald), where $X = \{x_1, \ldots, x_n\}$ is your alphabet. I will use $x_i$ for the formal variables and $a_i$ for the solutions that you are looking for. Note that there might also be a way to use the Cauchy identity, by introducing the roots of your functions $f_j$, to get your generating function.
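The generating function above can be checked directly with sympy; the three-letter alphabet and the truncation order are my own choices for the illustration:

```python
from itertools import combinations_with_replacement
from math import prod
from sympy import symbols, expand

xs = symbols('x1:4')  # a three-letter alphabet
t = symbols('t')
M = 4

# H(X, t) = prod over the alphabet of 1 / (1 - x*t)
H = 1
for x in xs:
    H *= 1 / (1 - x * t)
ser = expand(H.series(t, 0, M + 1).removeO())

for m in range(M + 1):
    # h_m = sum of all monomials of degree m in x1, x2, x3
    h_m = sum(prod(c) for c in combinations_with_replacement(xs, m))
    assert expand(ser.coeff(t, m) - h_m) == 0
print("series coefficients match h_0..h_%d" % M)
```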

The operation of assigning values $(a_1, \ldots, a_n)$ to your alphabet is a substitution, as defined by Richard P. Stanley in Chapter 7.8 of the second volume of his book Enumerative Combinatorics. This is an algebra morphism $\phi_{(a_1, \ldots, a_n)} : \Lambda_n \longrightarrow \mathbb{R}$ defined by $\phi_{(a_1, \ldots, a_n)}(\psi) = \psi(a_1, \ldots, a_n)$. Your values are then the $(\phi_{(a_1, \ldots, a_n)}(P_i))_{i\in\mathbb{N}}$.

Note that knowing the values of $a_1, \ldots, a_n$ allows you to compute this $\phi$ on every symmetric function.

I would decompose your substitution as the composition of two operations. Consider a subalgebra $A$ of $\Lambda_n$. We define $\Phi^A : \mathbb{C}^n \longrightarrow \mathrm{Hom}(A, \mathbb{C})$ by $\Phi^A(a_1, \ldots, a_n) = \phi_{(a_1, \ldots, a_n)}$. Your values are then the $(\Phi^A(a_1, \ldots, a_n)(P_i))_{i\in\mathbb{N}}$, provided the $P_i$ are in $A$.

First of all, for which subalgebras $A$ is the function $\Phi^A$ injective? You should think of your subalgebra $A$ as being generated by your $P_m$. I think that $A = \Lambda_n$ is sufficient, as you can then express the $h_m$ using your $P_m$, and you know how to retrieve the $a_i$ from the evaluations of the $h_m$.
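The last step - retrieving the $a_i$ from the evaluations of the $h_m$ - can be made concrete. A minimal sketch, assuming we are given the values $h_1, \ldots, h_n$ at the unknown point: it uses the Newton-type recurrence $e_m = \sum_{j=1}^{m}(-1)^{j-1} h_j\, e_{m-j}$ to get the elementary symmetric values, then takes roots (the helper name `recover` is mine):

```python
from sympy import symbols, Poly, roots

def recover(h_values):
    """Given [h_1(a), ..., h_n(a)], return the multiset {a_1, ..., a_n}.

    Uses e_m = sum_{j=1}^m (-1)^(j-1) h_j e_{m-j} to obtain the elementary
    symmetric values e_1, ..., e_n, then reads the a_i off as the roots of
    sum_j (-1)^j e_j t^(n-j).
    """
    n = len(h_values)
    h = [1] + list(h_values)
    e = [1]
    for m in range(1, n + 1):
        e.append(sum((-1)**(j - 1) * h[j] * e[m - j] for j in range(1, m + 1)))
    t = symbols('t')
    char = sum((-1)**j * e[j] * t**(n - j) for j in range(n + 1))
    return sorted(roots(Poly(char, t), multiple=True))

# For (a_1, a_2, a_3) = (1, 2, 3): h_1 = 6, h_2 = 25, h_3 = 90.
print(recover([6, 25, 90]))  # -> [1, 2, 3]
```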

You can also introduce the maps $\tilde{P_i} : \mathrm{Hom}(\Lambda_n, \mathbb{C}) \longrightarrow \mathbb{C}$ such that $\tilde{P_i}(\phi) = \phi(P_i)$. If you introduce the function $\tilde{P} : \mathrm{Hom}(\Lambda_n, \mathbb{C}) \longrightarrow \mathbb{C}^\mathbb{N}$ such that $\tilde{P}(\phi) = (\tilde{P_i}(\phi))_{i\in\mathbb{N}}$, your evaluation is then $\tilde{P} \circ \Phi^{\Lambda_n}$. You now want this composition to be injective. I don't know whether $\Phi^{\Lambda_n}$ is surjective - it probably is. If so, then it is necessary and sufficient that $\tilde{P}$ be injective. You are then probably asking for a basis of the dual of $\mathrm{Hom}(\Lambda_n, \mathbb{C})$, or something along those lines.

Once this is answered, you may ask: what kinds of polynomials $f_i$ generate the right $P_m$?

I don't think that the $a_i$ being integers is really relevant here.

I hope that I have not made any mistake and that these ideas will help you a little bit. The books that I have cited can also be useful.