This is the same question as this one: UMVUE of $E[X^2]$ where $X_i$ is Poisson $(\lambda)$. Here, I restate the problem for completeness:
Let $X_1, \ldots, X_n \overset{\text{i.i.d.}}{\sim} \mathrm{Pois}(\lambda)$, find the UMVUE of $\lambda^2+\lambda$, which is $\mathbb{E}[X_1^2]$.
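For concreteness, here is a quick Monte Carlo sanity check of the target quantity; the value $\lambda = 2.5$ and the sample size are arbitrary choices for illustration:

```python
# Sanity check: for X ~ Poisson(lambda), E[X^2] = Var(X) + E[X]^2 = lambda + lambda^2.
import numpy as np

rng = np.random.default_rng(0)
lam = 2.5                              # arbitrary illustrative value
x = rng.poisson(lam, size=1_000_000)
print(np.mean(x**2))                   # Monte Carlo estimate of E[X^2], approx 8.75
print(lam**2 + lam)                    # exact target: 8.75
```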
While I think Chlarinetist's answer is correct, when I tried it myself, I got a different answer and I'm not sure what went wrong. Consider $\phi(T) = \mathbb{E}[X_1^2 \mid T]$ where $T = \sum_{i=1}^{n} X_i$, a complete and sufficient statistic for $\lambda$. By the Lehmann–Scheffé theorem, $\phi(T)$ will be the UMVUE of $\mathbb{E}[\phi(T)] = \mathbb{E}[X_1^2] = \lambda^2 + \lambda$. It remains to calculate $\phi(T)$.
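As a rough check of this conditional expectation, one can estimate $\phi(t)$ by rejection sampling: simulate i.i.d. Poisson samples and keep only those whose sum equals $t$. This is only a sketch; $n$, $\lambda$, and $t$ below are arbitrary illustrative choices (the value of $\lambda$ should not matter since $T$ is sufficient; it only affects how many draws are retained):

```python
# Monte Carlo estimate of phi(t) = E[X_1^2 | sum_i X_i = t] by rejection sampling.
import numpy as np

rng = np.random.default_rng(1)
n, lam, t = 5, 2.0, 12                 # arbitrary illustrative values
x = rng.poisson(lam, size=(2_000_000, n))
keep = x[x.sum(axis=1) == t]           # retain only samples whose sum is exactly t
print(len(keep))                       # number of retained samples
print(np.mean(keep[:, 0] ** 2))        # Monte Carlo estimate of phi(t)
```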
By symmetry, $n\phi(t)$ satisfies $$ n\phi(t) = \mathbb{E}\left[\sum_{i=1}^n X_i^2 \mid \sum_{i=1}^n X_i = t\right] = \mathbb{E}\left[\left(\sum_{i=1}^n X_i\right)^2 - 2 \sum_{i<j} X_i X_j \mid \sum_{i=1}^n X_i = t\right] = t^2 - 2\sum_{i<j} \mathbb{E}\left[ X_i X_j \mid \sum_{i=1}^n X_i = t\right]. $$ With independence, we can further break down the second term as $$ - 2\sum_{i<j} \mathbb{E}\left[ X_i X_j \mid \sum_{i=1}^n X_i = t\right] = - 2\sum_{i<j} \mathbb{E}\left[ X_i \mid \sum_{i=1}^n X_i = t\right]\mathbb{E}\left[ X_j \mid \sum_{i=1}^n X_i = t\right] = - 2 \binom{n}{2}\mathbb{E}\left[ X_1 \mid \sum_{i=1}^n X_i = t\right]^2. $$ Since $\mathbb{E}[ X_1 \mid \sum_{i=1}^n X_i = t]=t/n$ by a simple calculation, we finally have $$ n\phi(t) = t^2 - 2\binom{n}{2} \left(\frac{t}{n}\right)^2 = t^2 - \frac{n(n-1)t^2}{n^2} = \frac{t^2}{n}, $$ implying $\phi(t)=t^2/n^2$. That is, $\overline{X}^2$ is a UMVUE. It feels correct to me, but I don't know where I went wrong, since I know the UMVUE should be unique.
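For what it's worth, a quick simulation of $\mathbb{E}[\overline{X}^2]$ against the target $\lambda^2+\lambda$ (again with arbitrary illustrative values of $n$ and $\lambda$) also suggests that something is off with the result above, though it doesn't tell me which step fails:

```python
# Check whether Xbar^2 is unbiased for lambda^2 + lambda by brute-force simulation.
import numpy as np

rng = np.random.default_rng(2)
n, lam = 5, 2.0                        # arbitrary illustrative values
x = rng.poisson(lam, size=(1_000_000, n))
xbar = x.mean(axis=1)
print(np.mean(xbar**2))                # Monte Carlo estimate of E[Xbar^2]
print(lam**2 + lam)                    # target value: 6.0
```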