
Let $\theta >0$ be a parameter and let $X_1,X_2,\ldots,X_n$ be a random sample with pdf $f(x\mid\theta)=\frac{1}{3\theta}$ if $-\theta \leq x\leq 2\theta$ and $0$ otherwise.

a) Find the MLE of $\theta$

b) Is the MLE a sufficient statistic for $\theta$?

c) Is the MLE a complete statistic for $\theta$?

d) Is $\frac{n+1}{n}\cdot \text{MLE}$ the UMVUE of $\theta$?

I've been able to solve a). The MLE of $\theta$ is $\max(-X_{(1)},\frac{X_{(n)}}{2}).$ Also, you can show that it is sufficient using the Factorization Theorem.

However, I cannot solve the remaining parts, I think because of the $\max$ in the MLE. Is there another way to express $\max\left(-X_{(1)},\frac{X_{(n)}}{2}\right)$? Can I express the MLE as $\frac{|X|_{(n)}}{2}$?
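As a quick sanity check on that last guess, here is a small numerical example (plain Python; the variable names are just illustrative) comparing the two expressions on a sample where they disagree:

```python
# A sample consistent with, e.g., theta = 1: all points lie in [-theta, 2*theta]
x = [-1.0, 0.5]

mle = max(-min(x), max(x) / 2)             # max(-X_(1), X_(n)/2)
half_max_abs = max(abs(v) for v in x) / 2  # |X|_(n) / 2

print(mle, half_max_abs)  # 1.0 0.5 -- the two expressions differ
```

So the two statistics do not coincide in general: pointwise, $\max(-x,x/2)$ equals $|x|$ for $x<0$ but $|x|/2$ for $x>0$.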


1 Answer


Find the distribution of $T=\max\left(-X_{(1)},\frac{X_{(n)}}{2}\right)$.

For $0<t<\theta$, we have \begin{align} P_{\theta}(T\le t)&=P_{\theta}\left(-t\le X_{(1)},\ X_{(n)}\le 2t\right) \\&=P_{\theta}\left(-t\le X_1,X_2,\ldots,X_n\le 2t\right) \\&=\left\{P_{\theta}\left(-t\le X_1\le 2t\right)\right\}^n \\&=\left(\frac{t}{\theta}\right)^n, \end{align} since the interval $[-t,2t]$ has length $3t$ and the density equals $\frac{1}{3\theta}$ on it.

So $T$ has density

$$f_T(t)=\frac{nt^{n-1}}{\theta^n}\mathbf1_{0<t<\theta}$$

In other words, $T$ is distributed exactly as $Y_{(n)}$, where $Y_1,\ldots,Y_n$ are i.i.d. $U(0,\theta)$ variables.
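This distributional claim is easy to verify by simulation; here is a rough Monte Carlo sketch (assuming NumPy; the parameter values and sample sizes are arbitrary) comparing the empirical CDF of $T$ with $(t/\theta)^n$:

```python
import numpy as np

rng = np.random.default_rng(42)
theta, n, reps = 1.0, 5, 200_000

# reps samples of size n from U(-theta, 2*theta)
x = rng.uniform(-theta, 2 * theta, size=(reps, n))

# T = max(-X_(1), X_(n)/2) for each sample
t = np.maximum(-x.min(axis=1), x.max(axis=1) / 2)

for q in (0.3, 0.6, 0.9):
    # empirical P(T <= q*theta) vs. the claimed CDF value q^n
    print(q, (t <= q * theta).mean(), q ** n)
```

The empirical probabilities agree with $(t/\theta)^n$ up to Monte Carlo error, matching the CDF of $Y_{(n)}$ for $U(0,\theta)$ samples.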

So studying the properties of $T$ as an estimator of $\theta$ reduces to studying the properties of $Y_{(n)}$.

That $T$ is a (minimal sufficient and) complete statistic is proved in detail here. Moreover, since $E_\theta[T]=\frac{n}{n+1}\theta$, the estimator $\left(\frac{n+1}{n}\right)T$ is unbiased, and by the Lehmann–Scheffé theorem it is indeed the UMVUE of $\theta$.
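As a quick numerical check (a sketch assuming NumPy; $\theta=2$, $n=5$ are arbitrary choices) that $\frac{n+1}{n}T$ is unbiased for $\theta$:

```python
import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 2.0, 5, 200_000

x = rng.uniform(-theta, 2 * theta, size=(reps, n))
t = np.maximum(-x.min(axis=1), x.max(axis=1) / 2)  # the MLE T

umvue = (n + 1) / n * t
print(umvue.mean())  # should be close to theta = 2.0
```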

StubbornAtom
    Thank you. Is the MLE minimally sufficient too? How do you know if it is? – beginnermath Apr 15 '20 at 23:28
  • Yes it is minimal sufficient. All the properties of $T$ follow by studying the $U(0,\theta)$ case. – StubbornAtom Apr 16 '20 at 05:56
  • I have a follow-up question, hopefully you can answer it: why can't $T$ take the values $0, \theta$ (your indicator function excludes these values)? Why can't $Y_1, \ldots, Y_n$ be i.i.d. $U[0,\theta]$? – beginnermath May 07 '20 at 18:16
  • This is a continuous distribution, so $T$ takes the end-points $0,\theta$ with probability zero. You can of course include these values in the indicator and write the distribution as $U[0,\theta]$; from a measure-theoretic perspective, it makes no difference. – StubbornAtom May 07 '20 at 18:23
  • There is something I am confused about. The MLE does not exist for $Y_1, \ldots, Y_n$ i.i.d. $U(0,\theta)$ (the likelihood function does not attain its supremum on $(0,\theta)$). You said that studying the properties of $T$ as an estimator of $\theta$ reduces to studying the properties of $Y_{(n)}$, but $Y_{(n)}$ is not the MLE for $Y_1, \ldots, Y_n$ i.i.d. $U(0,\theta)$, whereas $T$ is the MLE for $X_1, \ldots, X_n$ i.i.d. $U[-\theta, 2\theta]$. – beginnermath May 07 '20 at 18:29
  • For the MLE, we choose the $U[0,\theta]$ version to circumvent that issue. Even if we take the $U(0,\theta)$ version, it makes no difference in practice, as I mentioned in the previous comment. – StubbornAtom May 07 '20 at 18:34
  • OK, I see. But still, if you have $X_1, \ldots, X_n$ i.i.d. $U(0,\theta)$, so that each $X_i$ cannot take the values $0$ or $\theta$, then there is no MLE for $\theta$? This would then be an example where an MLE does not exist. I was just studying and thinking about this. – beginnermath May 07 '20 at 18:40
  • Yes; strictly speaking, that is correct. – StubbornAtom May 07 '20 at 19:47