
[A recent post reminded me of this.]

How can we fill in the blanks here:

For any _____ function $f(x,y,z)$ of three variables that is symmetric in the three variables, there is a _____ function $\varphi(u,v,w)$ of three variables such that $f(x,y,z) = \varphi(x+y+z, xy+yz+zx, xyz)$. [Of course we can ask this for some number of variables other than three.]

For example, the theorem (polynomial, polynomial) is classical, and the theorem (rational, rational) is linked above. What other pairs work? Say (algebraic, algebraic), (elementary, elementary), (continuous, continuous), or ($C^\infty$, $C^\infty$).

Is there an elementary function $\varphi(u,v,w)$ of three complex variables such that $e^x+e^y+e^z = \varphi(x+y+z, xy+yz+zx, xyz)$ for all $x,y,z \in \mathbb{C}$?

Even if $x, y, z$ separately are not differentiable functions of $(x+y+z, xy+yz+zx, xyz)$, could it be that $e^x+e^y+e^z$ is?

RobPratt
GEdgar

2 Answers


We'll consider first symmetric functions of the form $\sum_{i=1}^n \phi(x_i)$.

An important example is $\phi(x) = \log(1+ x t)$. Then $$\sum_i \phi(x_i) = \sum_i \log (1+ x_i t) = \log\Big(\prod_i (1 + x_i t)\Big) = \log (1 + s_1 t + s_2 t^2 + \cdots + s_n t^n),$$ where $s_k$ denotes the $k$-th elementary symmetric polynomial of $x_1, \ldots, x_n$.

Now consider both sides of the $\log$ equality as formal power series in $t$ and expand. Since $$\log (1 + u) = u - \frac{u^2}{2} + \frac{u^3}{3} - \cdots$$ we get on the LHS (writing $h_k = \sum_i x_i^k$ for the power sums) $$\sum_i \log (1+ x_i t) = \sum_{k\ge 1} (-1)^{k-1} \frac{h_k}{k} t^k,$$ while on the RHS we get $$\sum_{k\ge 1}(-1)^{k-1} \frac{(s_1 t+ s_2 t^2 + \cdots + s_n t^n)^k}{k}.$$

Matching the coefficients of $t^k$ we get $$(-1)^{k-1}h_k/k = \sum_{u_1 + 2 u_2 + \cdots + n u_n = k}(-1)^{u_1 + u_2 + \cdots + u_n -1}\, \frac{(u_1 + u_2 + \cdots + u_n-1)!}{\prod u_i !}\,s_1^{u_1} s_2^{u_2} \cdots s_n^{u_n},$$ so $$h_k = \sum_{u_1 + 2 u_2 + \cdots + n u_n = k}(-1)^{u_2 + u_4 + \cdots}\ \frac{k\,(u_1 + u_2 + \cdots + u_n-1)!}{\prod u_i !}\,s_1^{u_1} s_2^{u_2} \cdots s_n^{u_n}.$$

This is Waring's formula, as indicated by Jyrki.
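As a quick sanity check (my own sketch, not part of the classical derivation), Waring's formula can be evaluated numerically and compared against the power sums computed directly; the helper `waring_h` is ad hoc:

```python
# Numerical check of Waring's formula: compute the power sum
# h_k = sum_i x_i^k from the elementary symmetric values s_1, ..., s_n.
from itertools import product
from math import factorial

def waring_h(k, s):
    """Power sum h_k via Waring's formula, from s = (s_1, ..., s_n)."""
    n = len(s)
    total = 0.0
    # Sum over all (u_1, ..., u_n) with u_1 + 2 u_2 + ... + n u_n = k.
    for u in product(range(k + 1), repeat=n):
        if sum((i + 1) * ui for i, ui in enumerate(u)) != k:
            continue
        m = sum(u)                                  # u_1 + ... + u_n
        sign = (-1) ** sum(u[1::2])                 # (-1)^(u_2 + u_4 + ...)
        denom = 1
        for ui in u:
            denom *= factorial(ui)
        term = sign * k * factorial(m - 1) / denom
        for si, ui in zip(s, u):
            term *= si ** ui
        total += term
    return total

x, y, z = 1.3, -0.7, 2.1
s = (x + y + z, x*y + y*z + z*x, x*y*z)
for k in range(1, 6):
    print(k, waring_h(k, s), x**k + y**k + z**k)  # columns should agree
```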

Now consider the case $\phi(x) = \exp x$. We get $$\sum \phi(x_i) = n + \sum_{k\ge 1} \frac{h_k}{k!} = n + \sum_{u_1+ \cdots+u_n>0} (-1)^{u_2 + u_4 + \cdots} \frac{(u_1 + u_2 + \cdots + u_n-1)!}{(u_1 + 2 u_2 + \cdots + n u_n-1)!} \frac{s_1^{u_1}}{u_1 !} \cdots \frac{s_n^{u_n}}{u_n!}$$

The RHS is a function of $s_1, \ldots, s_n$. It is entire, but does not appear to be elementary. Perhaps it is some sort of hypergeometric function. We leave it at that.
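For concreteness, here is a quick numerical check of this expansion (a sketch; the truncation order `kmax` and the helper name `exp_from_s` are my own choices):

```python
# Truncate the double series above and compare with e^x + e^y + e^z.
from itertools import product
from math import exp, factorial

def exp_from_s(s, kmax=40):
    """Evaluate the series in s_1, ..., s_n, keeping terms with
    u_1 + 2 u_2 + ... + n u_n <= kmax."""
    n = len(s)
    total = float(n)
    for u in product(range(kmax + 1), repeat=n):
        k = sum((i + 1) * ui for i, ui in enumerate(u))
        m = sum(u)
        if m == 0 or k > kmax:
            continue
        sign = (-1) ** sum(u[1::2])              # (-1)^(u_2 + u_4 + ...)
        term = sign * factorial(m - 1) / factorial(k - 1)
        for si, ui in zip(s, u):
            term *= si ** ui / factorial(ui)
        total += term
    return total

x, y, z = 0.5, 1.0, -0.25
s = (x + y + z, x*y + y*z + z*x, x*y*z)
print(exp_from_s(s), exp(x) + exp(y) + exp(z))  # should agree closely
```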

We can tackle the general case of an entire function of $n$ variables that is symmetric, using some analysis.

Let's see what happens for a function of the form $\sum_{i=1}^n \phi(x_i)$, where $\phi$ is an entire function. We have $$\phi(x) = \sum_{k\ge 0} a_k x^k$$ for all $x\in \mathbb{C}$, where $(a_k)$ is a fast decreasing sequence, that is, for every $M>0$ the sequence $M^k a_k$ is bounded (equivalently, converges to $0$). As before, we get $$\sum_{i=1}^n \phi(x_i) = n a_0 + \sum_{k\ge 1} a_k h_k = n a_0 + \sum_{u_1+ \cdots + u_n>0} b_{(u_1, \ldots, u_n)}\, s_1^{u_1} s_2^{u_2} \cdots s_n^{u_n},$$ where $$b_{(u_1, \ldots, u_n)} = (-1)^{u_2 + u_4 + \cdots}\, \frac{(u_1 + 2 u_2 + \cdots + n u_n)\,(u_1 + u_2 + \cdots + u_n-1)!}{u_1 !\, u_2! \cdots u_n!}\, a_{u_1 + 2 u_2 + \cdots + n u_n}.$$

Now, to check that the resulting function of $s_1, \ldots, s_n$ is entire, we need to see that for every $M>0$ the family $$\big\{ M^{u_1 + \cdots + u_n}\, b_{(u_1, \ldots, u_n)} \big\}$$ is bounded, which is not hard to check.
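For the record, here is one way the estimate can go (my own completion of this step): writing $k = u_1 + 2 u_2 + \cdots + n u_n$ and $m = u_1 + \cdots + u_n \le k$, the multinomial inequality $\frac{m!}{u_1! \cdots u_n!} \le n^m$ gives $$\big|M^{m}\, b_{(u_1, \ldots, u_n)}\big| \le k\,(nM)^{m}\,|a_k| \le k\,(nM')^{k}\,|a_k|, \qquad M' = \max(M, 1),$$ and the right-hand side tends to $0$ as $k \to \infty$ because $(2nM')^k a_k$ is bounded.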

$\bf{Added:}$

Let $\phi(x_1, \ldots, x_n)$ be an entire function that is symmetric. Write $\phi(x) = \sum_{k\ge 0} p_k(x)$, where the $p_k(x)$ are homogeneous symmetric polynomials of degree $k$. By the fundamental theorem of symmetric polynomials we have $p_k(x) = q_k(s_1, \ldots, s_n)$, where $q_k$ is a weighted homogeneous polynomial of degree $k$ in $s_1$, $\ldots$, $s_n$. Hence we have $$\phi(x) = \sum_{k\ge 0} q_k(s).$$ Now the series $\sum_{k\ge 0} q_k(s)$, as a function of $s$, converges uniformly on compact subsets of $\mathbb{C}^n$. The reason is that the map $(x_1, \ldots, x_n) \mapsto (s_1, \ldots, s_n)$ is surjective and proper, so every compact set of $s$-values is the image of a compact set of $x$-values, on which $\sum_k p_k(x)$ converges uniformly. It follows that $\sum_{k} q_k$ converges uniformly on compacts to an entire function of $s$, call it $\psi$. Moreover, we also have uniform convergence on compacts of the partial derivatives. Therefore, the polynomials $q_k$ are parts of the Taylor series of $\psi$. Hence, we can open the brackets in the expansion $\sum_k q_k(s)$ to get the Taylor series of $\psi(s)$.
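As a concrete illustration of the step $p_k(x) = q_k(s_1, \ldots, s_n)$, a symmetric polynomial can be rewritten in the elementary symmetric polynomials with sympy (a minimal sketch; the example polynomial is my choice):

```python
# Express a symmetric polynomial in the elementary symmetric polynomials
# s_1, s_2, s_3 using sympy's symmetrize.
from sympy import symbols
from sympy.polys.polyfuncs import symmetrize

x, y, z = symbols('x y z')
p = x**3 + y**3 + z**3                           # the power sum h_3
sym_part, remainder, s_defs = symmetrize(p, formal=True)
print(sym_part)   # s1**3 - 3*s1*s2 + 3*s3
print(remainder)  # 0, since p is already symmetric
print(s_defs)     # [(s1, x + y + z), (s2, x*y + x*z + y*z), (s3, x*y*z)]
```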

orangeskid

  • This is great! I tried your formula for $\exp(x_1)+\exp(x_2)$ in a numerical example. It came out correct. – GEdgar Apr 24 '20 at 14:04
  • @GEdgar: I am glad that it works OK. It should also work with other $\sum \phi(x_i)$ per Waring's formula. The result is true in the continuous, holomorphic, real analytic, and smooth cases (and this seems to be the order of difficulty). I will add some notes about some of them. Great question! – orangeskid Apr 24 '20 at 23:18
  • +1 I see Waring's formula working for sums of the form $f(x)+f(y)+f(z)$ with an entire $f$. That is, provided there are no problems with the convergence of the multivariable power series? Does the termwise approach cover all the analytic symmetric functions? – Jyrki Lahtonen Apr 26 '20 at 17:26
  • @Jyrki Lahtonen: Thank you! The convergence of the series in the case of $\sum f(x_k)$ is easy. In the case of a general symmetric entire function I cannot handle it with power series, since I don't have explicit expressions for the symmetric polynomials, but I have some alternate ways. To be continued. – orangeskid Apr 27 '20 at 02:01

Let $f$ be a symmetric continuous function in $n$ variables. These variables can be thought of as the roots of a monic polynomial in $\mathbb{C}[x]$ whose non-leading coefficients are, up to sign, the elementary symmetric polynomials of the roots. From complex analysis, we know that the (unordered) set of roots of a monic polynomial depends continuously on its coefficients; since $f$ is symmetric, it is insensitive to the ordering of the roots, so $f$ becomes a continuous function of the coefficients. So (continuous, continuous) holds.

So, in the example $f(x,y,z)=e^x + e^y + e^z$ above, simply replace $x,y,z$ by the three roots of the monic cubic $t^3 - u t^2 + v t - w$, as given by the cubic formula, where $(u,v,w) = (x+y+z,\ xy+yz+zx,\ xyz)$.
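A numerical illustration of this recipe (my own sketch; the test function $e^x+e^y+e^z$ is from the question, and `numpy.roots` plays the role of the cubic formula):

```python
# Recover {x, y, z} as the roots of the monic cubic t^3 - u t^2 + v t - w,
# then evaluate the symmetric function on the (unordered) roots.
import numpy as np

def phi(u, v, w):
    roots = np.roots([1.0, -u, v, -w])    # roots of t^3 - u t^2 + v t - w
    return np.sum(np.exp(roots))

x, y, z = 0.3 + 1.0j, -1.2, 2.0 - 0.5j
u, v, w = x + y + z, x*y + y*z + z*x, x*y*z
print(phi(u, v, w), np.exp(x) + np.exp(y) + np.exp(z))  # agree up to roundoff
```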

However, since the roots of a polynomial do not depend differentiably on the coefficients, this argument does not show that the claim holds for $C^1$ symmetric functions.

Anz