2

Let $X_1, \ldots,X_n$ be iid from an exponential distribution with density

$$ f(x)=\begin{cases}\theta e^{-\theta x},&x>0, \\ 0,&\text{otherwise}.\end{cases}$$ Find UMVU estimators for $\theta$ and $\theta^2$.

$$f(\mathbf x\mid\theta)=\prod_{i=1}^n\theta e^{-\theta x_i}1\{x_i>0\} = \underbrace{\theta^n e^{-\theta\sum_i x_i}}_{g\left(\sum_i x_i,\,\theta\right)}\,\underbrace{1\{x_{(1)}>0\}}_{h(\mathbf x)}$$

so $\sum x_i$ is sufficient by the factorization theorem. But I do not know how to show it is complete. If this is the right way to find UMVU estimators, could someone show me how to prove that $\sum x_i$ is complete? And if it isn't complete, how do you find a UMVU estimator?

Vons
  • 11,285
  • 2
    Completeness follows from the fact that $f$ is a member of a regular exponential family. https://math.stackexchange.com/q/2034206/321264 – StubbornAtom Jan 25 '21 at 06:12
  • I read this. So, Binomial with $T_1=\sum_i X_i$, negative binomial with $T_1$, geometric with $T_1$, Poisson with $T_1$, exponential with $T_1$, and normal distributions with $(\overline{X},\sum(X_i-\overline{X})^2)$ are all sufficient and complete? @StubbornAtom – DSR Jan 27 '21 at 05:47
  • @Daman In the usual case, yes. – StubbornAtom Jan 27 '21 at 06:25

1 Answer

2

As you surely know, the Negative Exponential distribution belongs to the exponential family, thus $S=\sum_i X_i$ is a CSS (complete, minimal sufficient statistic).
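The completeness claim can be made explicit. A sketch of the standard argument (not spelled out in the original): write the density in one-parameter exponential-family form,

$$f(x\mid\theta)=\theta e^{-\theta x}\,1\{x>0\}=\exp\bigl(\eta\,T(x)+\log\theta\bigr)1\{x>0\},\qquad \eta=-\theta,\quad T(x)=x.$$

The natural parameter $\eta=-\theta$ ranges over the open interval $(-\infty,0)$, which contains a nonempty open set, so the family is full rank and $\sum_i T(X_i)=\sum_i X_i$ is complete as well as sufficient.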

The UMVUE of $\theta$ has been discussed many times (one instance is linked in the comment above), so let's see how to find the UMVUE of $g(\theta)=\theta^2$.

To find the UMVUE of $g(\theta)$ you can use the Lehmann–Scheffé theorem, by which it is enough to find an unbiased estimator that is a function of a CSS.

Let's set

$$T=\widehat{\theta^2}=\frac{1}{(\sum_i X_i)^2}$$

and set $Y=\sum_i X_i$. As is well known,

$$Y\sim \operatorname{Gamma}(n,\theta)\quad\text{(shape $n$, rate $\theta$)}$$

Thus

$$\mathbb{E}[T]=\int_0^\infty \frac{1}{y^2}\frac{\theta^n}{\Gamma(n)} y^{n-1}e^{-\theta y} \, dy = \frac{\theta^2}{(n-1)(n-2)} \underbrace{ \int_0^\infty \frac{\theta^{n-2}}{\Gamma(n-2)}y^{(n-2)-1}e^{-\theta y} \, dy}_{=1}=\frac{\theta^2}{(n-1)(n-2)}$$
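This expectation is easy to check numerically. A quick Monte Carlo sketch (the parameter values are arbitrary choices, not from the answer):

```python
import numpy as np

# Monte Carlo check of E[1/Y^2] = theta^2 / ((n-1)(n-2)) for
# Y ~ Gamma(shape=n, rate=theta).
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000

Y = rng.gamma(shape=n, scale=1.0 / theta, size=reps)  # NumPy uses scale = 1/rate
mc = (1.0 / Y**2).mean()
exact = theta**2 / ((n - 1) * (n - 2))

print(mc, exact)  # both ≈ 0.0556
```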

The estimator, a function of $S$, is biased, but the bias is easily corrected...

I think you can proceed and easily conclude...
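Spelling out the conclusion the answer leaves to the reader: since $\mathbb{E}[T]=\theta^2/\big((n-1)(n-2)\big)$, multiplying by $(n-1)(n-2)$ removes the bias. A simulation sketch of the resulting estimators (the setup below is my own illustration, not part of the answer):

```python
import numpy as np

# Bias-corrected (UMVU) estimators implied by the computation above:
#   E[1/S^2] = theta^2 / ((n-1)(n-2))  =>  UMVUE of theta^2 is (n-1)(n-2)/S^2
#   (analogously, E[1/S] = theta/(n-1) =>  UMVUE of theta   is (n-1)/S)
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 10, 100_000

samples = rng.exponential(scale=1.0 / theta, size=(reps, n))
S = samples.sum(axis=1)

est_theta = (n - 1) / S                   # unbiased for theta
est_theta_sq = (n - 1) * (n - 2) / S**2   # unbiased for theta^2

print(est_theta.mean(), est_theta_sq.mean())  # close to theta = 2 and theta^2 = 4
```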

tommik
  • 33,201
  • 4
  • 17
  • 35
  • Thank you sir for providing a complete answer. I notice the same technique is used here as in the linked post. – Vons Jan 25 '21 at 19:31