
Let $X_1,\ldots,X_n$ be a random sample from a pdf $f_{\theta}(x) = \begin{cases} \theta e^{-\theta x}, & x>0 \\ 0, & \text{otherwise} \end{cases}$, where $\theta>0$ is an unknown parameter.

Then, the uniformly minimum variance unbiased estimator (UMVUE) of $\dfrac{1}{\theta}$ is

(A) $\dfrac{1}{\bar{X_n}}$

(B) $\displaystyle\sum_{i=1}^{n}X_i$

(C) $\bar{X_n}$

(D) $\dfrac{1}{\displaystyle\sum_{i=1}^{n}X_i}$

MY STEPS:

Taking the expectation, $E_\theta(X)=\displaystyle\int_{0}^{\infty}x f_{\theta}(x)\;dx$

$$E_\theta(X)=\theta\int_0^\infty x e^{-\theta x}\;dx=\dfrac{1}{\theta}=\bar{X_n}$$
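To fill in the intermediate step, integration by parts gives

$$\int_0^\infty x e^{-\theta x}\,dx = \left[-\frac{x}{\theta}e^{-\theta x}\right]_0^\infty + \frac{1}{\theta}\int_0^\infty e^{-\theta x}\,dx = 0 + \frac{1}{\theta^2},$$

so $\theta\displaystyle\int_0^\infty x e^{-\theta x}\,dx = \dfrac{1}{\theta}$.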

Hence, option (C) should be correct.

Did I solve this correctly? Please help me confirm my solution.

  • If you had written $\dfrac 1 \theta = \operatorname{E}(\bar X_n)$ rather than $\dfrac 1 \theta = \bar X_n$, then your last equality would be correct. – Michael Hardy Mar 25 '15 at 02:09
  • $\ldots$ and you should have $\operatorname{E}_\theta(X)$, not $\operatorname{E}_\theta(x)$, and I changed it. – Michael Hardy Mar 25 '15 at 02:37
  • @Michael, could you please help me on https://stats.stackexchange.com/questions/143141/most-powerful-test-and-rejection-region-of-gamma-distribution – Stuck in a JAM Mar 25 '15 at 02:39
  • A correct procedure is shown in this answer where you need to replace $\theta$ by $1/\theta$. – StubbornAtom May 25 '20 at 19:47

1 Answer


Yes, correct procedure and answer.

Some background: This is the exponential distribution, parameterized here by its rate (written $\theta$; more often $\lambda$). It can also be parameterized by its mean $\mu$ as $f_\mu(x) = (1/\mu) e^{-x/\mu}$, for $x > 0$.

In parameter estimation, the choice of parameterization determines which estimators are unbiased for which parameters. You have just shown that $\bar X$ is unbiased for the parameter $\mu = 1/\theta$. It is also the UMVUE for $\mu$ because it is not only unbiased, but also based on the sufficient statistic $\sum_{i=1}^n X_i$.
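To spell that out (one way to fill in the remaining steps, using linearity of expectation and the Lehmann–Scheffé theorem; see also the comments below):

$$E_\theta(\bar X_n) = \frac{1}{n}\sum_{i=1}^{n} E_\theta(X_i) = \frac{1}{n}\cdot n\cdot\frac{1}{\theta} = \frac{1}{\theta},$$

so $\bar X_n$ is unbiased for $1/\theta$; and since $\bar X_n$ is a function of the complete sufficient statistic $T = \sum_{i=1}^n X_i$, the Lehmann–Scheffé theorem makes it the UMVUE of $1/\theta$.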

However, $1/\bar X$ is not UMVUE for $\theta$. It is based on the sufficient statistic, but you can show that it is biased; that is, $E(1/\bar X) \ne \theta$. Expectation is a linear operator, but unbiasedness does not survive nonlinear transformations (such as taking the reciprocal).
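To see the bias explicitly (a standard computation, using the fact that $T = \sum_{i=1}^n X_i \sim \mathrm{Gamma}(n,\theta)$ with shape $n$ and rate $\theta$): for $n \ge 2$,

$$E\!\left(\frac{1}{\bar X_n}\right) = n\,E\!\left(\frac{1}{T}\right) = n\cdot\frac{\theta}{n-1} = \frac{n}{n-1}\,\theta > \theta.$$

(The bias-corrected version $\dfrac{n-1}{T}$ is unbiased for $\theta$ and, being a function of the complete sufficient statistic, is the UMVUE of $\theta$.)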

See comments by @Michael Hardy and @heropup for important clarifications.

BruceET
  • Not only sufficiency, but completeness also matters here. It is based on a complete sufficient statistic, i.e. a sufficient statistic admitting no nonzero unbiased estimator of zero. – Michael Hardy Mar 25 '15 at 02:07
  • Actually, although the answer is correct, the solution does not show that $\bar X_n$ is unbiased for $1/\theta$, nor does it show that it is UMVUE. It simply shows that the expectation of a single observation is $1/\theta$. From here, we could show that such a choice of estimator attains the Cramér–Rao lower bound, thereby proving that it has minimum variance among all unbiased estimators. – heropup Mar 25 '15 at 02:10
  • Yes, I did see that one has to start with a complete sufficient statistic. I need to practice more on this. Thanks. – Stuck in a JAM Mar 25 '15 at 02:13
  • Thanks to Michael Hardy and @heropup for finishing my answer! I was stuck in "multiple-choice mode." – BruceET Mar 25 '15 at 02:18
  • could you please help me on another problem .. https://stats.stackexchange.com/questions/143141/most-powerful-test-and-rejection-region-of-gamma-distribution – Stuck in a JAM Mar 25 '15 at 02:20
  • @StuckinaJAM Do you know how to show complete sufficiency here? – dsaxton Jun 29 '15 at 16:30