We first consider symmetric functions of the form $\sum_{i=1}^n \phi(x_i)$.
An important example is $\phi(x) = \log(1+ x t)$. Then
$$\sum \phi(x_i) = \sum \log (1+ x_i t) = \log\Big(\prod_i (1 + x_i t)\Big) = \log (1 + s_1 t + s_2 t^2 + \cdots + s_n t^n)$$
Now consider both sides of the $\log $ equality as formal power series in $t$ and expand. Since
$$\log (1 + u) = u - \frac{u^2}{2} + \frac{u^3}{3} - \cdots $$ we get
on the LHS (where $h_k = \sum_i x_i^k$ denotes the $k$-th power sum)
$$\sum \log (1+ x_i t) = \sum_{k\ge 1} (-1)^{k-1} \frac{h_k}{k} t^k$$
while on the RHS we get
$$\sum_{k\ge 1}(-1)^{k-1} \frac{(s_1 t+ s_2 t^2 + \cdots + s_n t^n)^k}{k}$$
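To compare coefficients, it helps to expand each power by the multinomial theorem (the index $m$ below is introduced here just for this intermediate step):
$$(s_1 t+ s_2 t^2 + \cdots + s_n t^n)^m = \sum_{u_1 + u_2 + \cdots + u_n = m} \frac{m!}{u_1!\, u_2! \cdots u_n!}\, s_1^{u_1} s_2^{u_2} \cdots s_n^{u_n}\, t^{u_1 + 2 u_2 + \cdots + n u_n}$$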
Matching the coefficients of $t^k$ we get
$$(-1)^{k-1}\frac{h_k}{k} = \sum_{u_1 + 2 u_2 + \cdots + n u_n = k}(-1)^{u_1 + u_2 + \cdots + u_n -1}\, \frac{(u_1 + u_2 + \cdots + u_n-1)!}{\prod u_i !}\, s_1^{u_1} s_2^{u_2} \cdots s_n^{u_n} $$
so
$$h_k = \sum_{u_1 + 2 u_2 + \cdots + n u_n = k}(-1)^{u_2 + u_4 + \cdots }\ \frac{k\,(u_1 + u_2 + \cdots + u_n-1)!}{\prod u_i !}\, s_1^{u_1} s_2^{u_2} \cdots s_n^{u_n} $$
This is Waring's formula, as indicated by Jyrki.
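As a sanity check (the small cases below are worked out here, not quoted from a reference), for $k=2$ the admissible indices are $(u_1,u_2)=(2,0)$ and $(0,1)$, and for $k=3$ they are $(3,0,0)$, $(1,1,0)$ and $(0,0,1)$, giving the familiar Newton-type identities
$$h_2 = s_1^2 - 2 s_2, \qquad h_3 = s_1^3 - 3 s_1 s_2 + 3 s_3 .$$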
Now consider the case $\phi(x) = \exp x$. We get
$$\sum \phi(x_i) = n + \sum_{k\ge 1} \frac{h_k}{k!} = n + \sum_{u_1+ \cdots+u_n>0} (-1)^{u_2 + u_4 + \cdots} \frac{(u_1 + u_2 + \cdots + u_n-1)!}{(u_1 + 2 u_2 + \cdots + n u_n-1)!} \frac{s_1^{u_1}}{u_1 !} \cdots \frac{s_n^{u_n}}{u_n!}$$
(The second equality comes from substituting Waring's formula for $h_k$ and simplifying $k/k! = 1/(k-1)!$, with $k = u_1 + 2u_2 + \cdots + n u_n$.) On the RHS we have a function of $s_1$, $\ldots$, $s_n$. It is entire, but does not appear to be elementary. Perhaps it is some sort of hypergeometric function. We leave it at that.
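For a concrete illustration (a check for $n=2$ added here, writing the roots as $x_{1,2} = s_1/2 \pm \sqrt{s_1^2/4 - s_2}$), one finds
$$ e^{x_1} + e^{x_2} = 2\, e^{s_1/2} \cosh\!\sqrt{\tfrac{s_1^2}{4} - s_2}, $$
which is indeed entire in $(s_1, s_2)$, since $\cosh\sqrt{z} = \sum_{k\ge 0} z^k/(2k)!$ is an entire function of $z$.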
We can tackle the general case of an entire function of $n$ variables that is symmetric, using some analysis.
Let's see what happens for a function of the form $\sum_{i=1}^n \phi(x_i)$, where $\phi$ is an entire function. We have
$$\phi(x) = \sum_{k\ge 0} a_k x^k$$
for all $x\in \mathbb{C}$, where $(a_k)$ is a rapidly decreasing sequence, that is, for every $M>0$ the sequence $M^k a_k$ is bounded (in fact converges to $0$). As before, we get
$$\sum_{i=1}^n \phi(x_i) = n a_0 + \sum_{k\ge 1} a_k h_k = n a_0 + \sum_{u_1+ \cdots + u_n>0 } b_{(u_1, \ldots, u_n)}\, s_1^{u_1} s_2^{u_2} \cdots s_n^{u_n}$$
where
$$b_{(u_1, \ldots, u_n)}=(-1)^{u_2 + u_4 + \cdots}\, \frac{(u_1 + 2 u_2 + \cdots + n u_n)\,(u_1 + u_2 + \cdots + u_n-1)!}{u_1 !\, u_2! \cdots u_n!}\, a_{u_1 + 2 u_2 + \cdots + n u_n}$$
Now, to check that the resulting function of $s_1$, $\ldots$, $s_n$ is entire, we need to see that for every $M>0$ the set of numbers
$$M^{u_1 + \cdots + u_n}\, b_{(u_1, \ldots, u_n)}$$
is bounded, which is not hard to check.
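Here is a quick sketch of that verification (the shorthand $m = u_1 + \cdots + u_n$ and $k = u_1 + 2 u_2 + \cdots + n u_n$ is introduced only for this estimate). Since $m \le k$, for $M \ge 1$ we have
$$ M^{m}\, \big|b_{(u_1,\ldots,u_n)}\big| \le M^{k}\, k\, \frac{m!}{u_1! \cdots u_n!}\, |a_k| \le k\, (Mn)^{k}\, |a_k|, $$
because the multinomial coefficient $\frac{m!}{u_1!\cdots u_n!}$ is at most $n^m \le n^k$; the right-hand side is bounded since $(a_k)$ is rapidly decreasing.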
$\bf{Added:}$
Let $\phi(x_1, \ldots, x_n)$ be an entire function that is symmetric. Write $\phi(x) = \sum_{k\ge 0} p_k(x)$, where the $p_k(x)$ are homogeneous symmetric
polynomials of degree $k$. By the fundamental theorem of symmetric polynomials we have $p_k(x) = q_k(s_1, \ldots, s_n)$, where $q_k$ is a weighted
homogeneous polynomial of degree $k$ in $s_1$, $\ldots$, $s_n$ (with $s_i$ assigned weight $i$). Hence we have
$$\phi(x) = \sum_{k\ge 0} q_k(s)$$
Now the series $\sum_{k\ge 0} q_k(s)$, as a function of $s$, converges uniformly on compact subsets of $\mathbb{C}^n$. The reason is that the
map $(x_1, \ldots, x_n) \mapsto (s_1, \ldots, s_n)$ is surjective and proper, so the preimage of a compact set in the $s$-variables is a compact set in the $x$-variables, on which $\sum_k p_k(x)$ converges uniformly. It follows that
$\sum_{k} q_k$ converges uniformly on compacts to an entire function of $s$, call it $\psi$. Moreover, we also have uniform convergence on compacts
of the partial derivatives. Therefore, the polynomials $q_k$ are parts of the Taylor series of $\psi$. Hence we can open the brackets
in the expansion $\sum q_k(s)$ to get the Taylor series of $\psi(s)$.
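For instance (a small illustration added here, not part of the argument above), take $n=2$ and the symmetric entire function $\phi(x_1, x_2) = \cos(x_1 - x_2)$. Since $(x_1 - x_2)^2 = s_1^2 - 4 s_2$, we get
$$ \cos(x_1 - x_2) = \cos\!\sqrt{s_1^2 - 4 s_2} = \sum_{k\ge 0} \frac{(-1)^k (s_1^2 - 4 s_2)^k}{(2k)!}, $$
which is visibly an entire function of $(s_1, s_2)$, because cosine is even and hence $\cos\sqrt{z}$ is entire in $z$.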