
This is the same question as this one: UMVUE of $E[X^2]$ where $X_i$ is Poisson $(\lambda)$. I restate the problem here for completeness:

Let $X_1, \ldots, X_n \overset{\text{i.i.d.}}{\sim} \mathrm{Pois}(\lambda)$. Find the UMVUE of $\lambda^2+\lambda$, which is $\mathbb{E}[X_1^2]$.
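
(As a quick check of the target quantity: the Poisson mean and variance are both $\lambda$, so $$\mathbb{E}[X_1^2] = \mathrm{Var}(X_1) + \mathbb{E}[X_1]^2 = \lambda + \lambda^2.)$$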

While I think Chlarinetist's answer is correct, when I tried it myself I got a different answer, and I'm not sure what went wrong. Consider $\phi(T) = \mathbb{E}[X_1^2 \mid T]$ where $T = \sum_{i=1}^{n} X_i$, a complete and sufficient statistic for $\lambda$. By the Lehmann–Scheffé theorem, $\phi(T)$ is the UMVUE of $\mathbb{E}[\phi(T)] = \mathbb{E}[X_1^2] = \lambda^2 + \lambda$. It remains to calculate $\phi(T)$.
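
(The unbiasedness claim above is just the tower property: $$\mathbb{E}[\phi(T)] = \mathbb{E}\bigl[\mathbb{E}[X_1^2 \mid T]\bigr] = \mathbb{E}[X_1^2] = \lambda^2 + \lambda.)$$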

We see that $n\phi(t)$ is $$ n\phi(t) = \mathbb{E}\left[\sum_{i=1}^n X_i^2 \mid \sum_{i=1}^n X_i = t\right] = \mathbb{E}\left[\left(\sum_{i=1}^n X_i\right)^2 - 2 \sum_{i<j} X_i X_j \mid \sum_{i=1}^n X_i = t\right] = t^2 - 2\sum_{i<j} \mathbb{E}\left[ X_i X_j \mid \sum_{i=1}^n X_i = t\right]. $$ With independence, we can further break down the second term as $$ - 2\sum_{i<j} \mathbb{E}\left[ X_i X_j \mid \sum_{i=1}^n X_i = t\right] = - 2\sum_{i<j} \mathbb{E}\left[ X_i \mid \sum_{i=1}^n X_i = t\right]\mathbb{E}\left[ X_j \mid \sum_{i=1}^n X_i = t\right] = - 2 \binom{n}{2}\mathbb{E}\left[ X_1 \mid \sum_{i=1}^n X_i = t\right]^2. $$ Since $\mathbb{E}[ X_1 \mid \sum_{i=1}^n X_i = t]=t/n$ by a simple calculation, we finally have $$ n\phi(t) = t^2 - 2\binom{n}{2} \left(\frac{t}{n}\right)^2 = t^2 - \frac{n(n-1)t^2}{n^2} = \frac{t^2}{n}, $$ implying $\phi(t)=t^2/n^2$. That is, $\overline{X}^2$ would be the UMVUE. This feels correct to me, but I don't know where I went wrong, since I know the UMVUE should be unique.
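
(For the record, the "simple calculation" giving $\mathbb{E}[X_1 \mid \sum_{i=1}^n X_i = t] = t/n$ is the usual symmetry argument: by exchangeability the conditional expectations $\mathbb{E}[X_k \mid \sum_{i=1}^n X_i = t]$ are the same for every $k$, and they sum to $$\sum_{k=1}^n \mathbb{E}\Bigl[X_k \Bigm| \sum_{i=1}^n X_i = t\Bigr] = \mathbb{E}\Bigl[\sum_{k=1}^n X_k \Bigm| \sum_{i=1}^n X_i = t\Bigr] = t,$$ so each one equals $t/n$.)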

pbb
  • The error is that while the unconditional variables $X_i, X_j$ are independent, the conditioned variables $X_i|\sum_{k=1}^n X_k=t$, $X_j|\sum_{k=1}^n X_k=t$ are no longer independent. For example, in the case $n=2$, if $X_1+X_2=t$, then $X_2=t-X_1$, and the variables are perfectly negatively correlated. – user469053 Apr 12 '24 at 16:34
  • @user469053 I see. In this case, is there an easy fix to this? Like how to calculate $\phi(t)$ following my derivation? – pbb Apr 12 '24 at 16:40

1 Answer


To calculate the value of $$\textrm{E}\left[X_{i}X_{j}\mid\sum_{k=1}^{n}X_{k}=t\right],$$ first note that $$X_{1},X_{2},\ldots,X_{n}\mid\sum_{k=1}^{n}X_{k}=t{\displaystyle \sim\textrm{Multinomial}\left(t,\dfrac{1}{n},\dfrac{1}{n},\ldots,\dfrac{1}{n}\right)}.$$ It then follows that $$\textrm{E}\left[X_{i}X_{j}\mid\sum_{k=1}^{n}X_{k}=t\right]$$ $$=\textrm{Cov}\left[X_{i},X_{j}\mid\sum_{k=1}^{n}X_{k}=t\right]+\textrm{E}\left[X_{i}\mid\sum_{k=1}^{n}X_{k}=t\right]\textrm{E}\left[X_{j}\mid\sum_{k=1}^{n}X_{k}=t\right]$$ $$=-\dfrac{t}{n^{2}}+\left(\dfrac{t}{n}\right)^{2}$$ $$=\dfrac{t\left(t-1\right)}{n^{2}}.$$
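
Substituting this back into the expression for $n\phi(t)$ in the question completes the calculation: $$n\phi(t) = t^{2} - 2\binom{n}{2}\,\frac{t(t-1)}{n^{2}} = t^{2} - \frac{(n-1)\,t(t-1)}{n},$$ so $$\phi(t) = \frac{t^{2}}{n} - \frac{(n-1)\,t(t-1)}{n^{2}} = \frac{t(t-1)}{n^{2}} + \frac{t}{n},$$ i.e. the UMVUE of $\lambda^{2}+\lambda$ is $\dfrac{T(T-1)}{n^{2}} + \dfrac{T}{n}$, the sum of the standard unbiased estimators of $\lambda^{2}$ and $\lambda$ based on $T \sim \mathrm{Pois}(n\lambda)$.

If a numerical sanity check is helpful, here is a minimal Monte Carlo sketch; the sample size $n=5$, the rate $\lambda=2$, and the number of replications are arbitrary illustrative choices. It compares the Monte Carlo means of $\overline{X}^{2}$ and of $T(T-1)/n^{2} + T/n$ with the target $\lambda^{2}+\lambda$.

```python
import numpy as np

# Monte Carlo sketch: compare the candidate estimator x_bar^2 with
# phi(T) = T*(T-1)/n^2 + T/n as estimators of lambda^2 + lambda.
rng = np.random.default_rng(0)
n, lam, reps = 5, 2.0, 200_000          # arbitrary illustrative values

x = rng.poisson(lam, size=(reps, n))    # reps independent samples of size n
T = x.sum(axis=1)                       # sufficient statistic for each sample

naive = (T / n) ** 2                    # x_bar^2; its mean is lambda^2 + lambda/n
umvue = T * (T - 1) / n**2 + T / n      # phi(T); its mean is lambda^2 + lambda

print("target lambda^2 + lambda :", lam**2 + lam)
print("mean of x_bar^2          :", naive.mean())
print("mean of phi(T)           :", umvue.mean())
```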

AOS