
Suppose that $Y_1, \dots, Y_n$ are i.i.d. observations from the density $f(y; \theta, \beta) = \beta e^{-\beta(y - \theta)}I_{[y \geq\theta]}$,

where $\beta > 0$ and $\theta \in \mathbb{R}$ are unknown parameters.

Let $(T_1, T_2) = (\min(Y_1, \dots, Y_n), \bar{Y})$. I want to show that $T_2 - T_1$ is independent of $T_1$ for all values of $(\theta, \beta)$. I have already proved that $T_1$ is complete sufficient for $\theta$ when $\beta$ is fixed and known, and that $T_2$ is complete sufficient for $\beta$ when $\theta$ is fixed and known. Clearly, $T_2 - T_1$ is ancillary for $\theta$ when $\beta$ is fixed, since its distribution does not depend on $\theta$.

Since $T_1$ is complete sufficient for $\theta$ when $\beta$ is fixed and known, can we just use Basu's theorem to conclude that $T_2 - T_1$ is independent of $T_1$ for all values of $(\theta, \beta)$?
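As a sanity check (not a proof), the claimed independence can be probed by simulation. The sketch below, with arbitrarily chosen values of $\theta$, $\beta$, and $n$, draws many samples from the shifted exponential, computes $T_1 = \min Y_i$ and $T_2 = \bar{Y}$, and verifies that the sample correlation between $T_1$ and $T_2 - T_1$ is near zero (a necessary, though not sufficient, consequence of independence):

```python
import numpy as np

# Illustrative Monte Carlo check; theta, beta, n, reps are arbitrary choices.
rng = np.random.default_rng(0)
theta, beta, n, reps = 2.0, 1.5, 10, 200_000

# Shifted exponential: Y = theta + Exp(rate = beta)
Y = theta + rng.exponential(scale=1 / beta, size=(reps, n))
T1 = Y.min(axis=1)   # minimum: complete sufficient for theta when beta is known
T2 = Y.mean(axis=1)  # sample mean
D = T2 - T1          # ancillary for theta when beta is fixed

corr = np.corrcoef(T1, D)[0, 1]
print(f"sample correlation of T1 and T2 - T1: {corr:.4f}")  # near 0
```

Repeating this for several $(\theta, \beta)$ pairs gives correlations statistically indistinguishable from zero, consistent with the Basu's-theorem argument.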

  • 1
    I don't see any problem in applying Basu's theorem. – zhoraster Jul 25 '23 at 06:19
  • 1
    That is correct. You have $T_2-T_1$ independent of $T_1$ for all $\theta$ when $\beta$ is fixed and known. But since $\beta$ is also arbitrary, the independence holds for every $(\theta,\beta)$. – StubbornAtom Jul 25 '23 at 12:58
  • 1
  • 1
    An alternative is to say here each $Y_{(k)}-Y_{(1)}$ is independent of $Y_{(1)}$ for given $\beta$ due to the memoryless property of the exponential distribution and $T_2-T_1=\frac1n \sum \left(Y_{(k)}-Y_{(1)}\right)$ – Henry Jul 25 '23 at 18:14
  • Thanks for the comments. – Oscar24680 Jul 25 '23 at 21:27
