
I am having some difficulty finding the expected value of the MLE for the $\operatorname{Beta}(\theta,1)$ distribution.

So far, I have found that the MLE for $\theta$ is: $$\hat{\theta} = -\frac{n}{\sum_{i=1}^n \ln X_i}$$

And I know that the expected value of $\hat{\theta}$ is: $$E[\hat{\theta}]=\frac{n}{n-1}\theta$$

But I really don't know how to calculate this expected value. Do I need to find the distribution of $\hat{\theta}$ to be able to calculate it, or is there another approach to this problem?

Furthermore, if I know that $T(X_1,\dots,X_n)=\sum_{i=1}^n\ln X_i$ is a sufficient statistic for $\theta$, how can I find an unbiased estimator that is a function of this sufficient statistic? Is there a method for that?

Thank you very much!

Jackaba

1 Answer


But I really don't know how to calculate this expected value

Observe that

$$Y_i=-\log X_i \sim\text{Exp}(\theta),$$

an exponential with rate $\theta$, since $P(-\log X_i>t)=P(X_i<e^{-t})=e^{-\theta t}$ for $t>0$.

Thus $\sum_{i=1}^n Y_i\sim \text{Gamma}(n,\theta)$ (shape $n$, rate $\theta$).

Is this enough for you to derive your UMVU Estimator?
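As a sanity check of these two distributional claims, here is a minimal simulation sketch (numpy/scipy assumed available; the values of $\theta$, $n$, and the number of replications are arbitrary illustrative choices, not taken from the question):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
theta, n, reps = 2.5, 10, 200_000   # illustrative values only

# X ~ Beta(theta, 1), so Y = -log(X) should be Exponential with rate theta.
x = rng.beta(theta, 1.0, size=reps)
y = -np.log(x)
print(stats.kstest(y, stats.expon(scale=1.0 / theta).cdf))      # large p-value expected

# The sum of n such Y's should be Gamma(shape = n, rate = theta).
u = -np.log(rng.beta(theta, 1.0, size=(reps, n))).sum(axis=1)
print(stats.kstest(u, stats.gamma(a=n, scale=1.0 / theta).cdf))  # large p-value expected
```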


Here is how to calculate the expectation using the gamma distribution:

$$\mathbb{E}[\hat{\theta}]=n\int_0^{\infty}\frac{1}{y}\,\frac{\theta^n}{\Gamma(n)}y^{n-1}e^{-\theta y}\,dy=n\theta\,\frac{\Gamma(n-1)}{\Gamma(n)}\underbrace{\int_0^{\infty}\frac{\theta^{n-1}}{\Gamma(n-1)}y^{(n-1)-1}e^{-\theta y}\,dy}_{=1}=$$

$$=\frac{n}{n-1}\theta$$
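If it helps, this result can also be checked numerically with a short Monte Carlo sketch (again assuming numpy; $\theta=2.5$ and $n=10$ are arbitrary illustrative values, for which $\tfrac{n}{n-1}\theta\approx 2.78$):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 2.5, 10, 200_000   # illustrative values only

# Simulate many samples of size n from Beta(theta, 1) and compute the MLE for each.
x = rng.beta(theta, 1.0, size=(reps, n))
theta_hat = -n / np.log(x).sum(axis=1)

print(theta_hat.mean())        # Monte Carlo estimate of E[theta_hat]
print(n / (n - 1) * theta)     # theoretical value n/(n-1) * theta
```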

tommik
  • I understand that. But then, if $U=\sum_{i=1}^n Y_i \sim \text{Gamma}(n,\theta)$, will I need to find the distribution of $V=\frac{1}{U}$ in order to calculate the expected value of $\hat{\theta}$? – Jackaba Mar 11 '21 at 11:57
  • @PedroCrispim: you have two ways: the first is to use the gamma distribution directly, and the second is to observe that $V$ follows an inverse gamma... which do you prefer? With the inverse gamma you get the known result immediately, but with the gamma-function integral it is also easy... – tommik Mar 11 '21 at 11:58
  • I think I would prefer to use the gamma distribution, but it's up to you to decide the best way. – Jackaba Mar 11 '21 at 12:02
  • @PedroCrispim I added all the calculations using the Gamma distribution – tommik Mar 11 '21 at 12:05
  • I really appreciate that!! And I don't want to bother you, but if you could just give me a hint for the second part of my question it would be great. – Jackaba Mar 11 '21 at 12:08
  • @PedroCrispim: Correcting the bias of your estimator, you get that
    $$T^*=\frac{n-1}{-\sum_i \log X_i}$$
    is unbiased for $\theta$ and a function of $T=\sum_i \log X_i$, which is complete and sufficient. Now you are done, because you can apply Lehmann–Scheffé (see the numerical sketch after these comments). – tommik Mar 11 '21 at 12:32
  • Thank you very much!!! – Jackaba Mar 11 '21 at 12:35
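A quick numerical check of the bias-corrected estimator from the exchange above (a minimal sketch assuming numpy; $\theta$ and $n$ are the same arbitrary illustrative values as before):

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 2.5, 10, 200_000   # illustrative values only

# T* = (n - 1) / (-sum log X_i) should have mean theta, i.e. be unbiased.
x = rng.beta(theta, 1.0, size=(reps, n))
t_star = (n - 1) / (-np.log(x).sum(axis=1))

print(t_star.mean(), theta)    # the two numbers should be close
```

Since $T^*$ is an unbiased function of the complete sufficient statistic $T$, Lehmann–Scheffé then gives that it is the UMVU estimator.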