
Although the setting of this question is statistics, the question actually asks for a real analysis fact (monotone functions).

Karlin-Rubin's theorem states conditions under which we can find a uniformly most powerful test (UMPT) for a statistical hypothesis:

Suppose we have a family of density or mass functions $\{f(\vec{x}|\theta):\,\theta\in\Theta\}$ and want to test $$\begin{cases} H_0:\,\theta\leq\theta_0 \\ H_A:\,\theta>\theta_0.\end{cases}$$If the family has a monotone likelihood ratio in a statistic $T(\vec{x})$ (that is, for every fixed $\theta_1<\theta_2$ in $\Theta$, the ratio $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}$ is a nondecreasing function of $T(\vec{x})$ on $\{\vec{x}:\,f(\vec{x}|\theta_1)>0\text{ or }f(\vec{x}|\theta_2)>0\}$), then the test with critical region $\text{CR}=\{\vec{x}:\,T(\vec{x})\geq k\}$, where $k$ is chosen so that $\alpha=P(\text{CR}\mid\theta=\theta_0)$, is the UMPT of size $\alpha$.
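To make the monotone-likelihood-ratio condition concrete, here is a small numeric sketch with an assumed example (exponential distribution with mean $\theta$, and arbitrary values $n=5$, $\theta_1=1<\theta_2=2$; none of these come from the question):

```python
import numpy as np

# Assumed example: X_i ~ Exp(mean theta), f(x|theta) = theta^(-n) exp(-sum(x)/theta).
# Then f(x|theta2)/f(x|theta1) = (theta1/theta2)^n * exp(T * (1/theta1 - 1/theta2))
# with T(x) = sum(x), which is increasing in T whenever theta1 < theta2.
def likelihood_ratio(T, n, theta1, theta2):
    return (theta1 / theta2)**n * np.exp(T * (1.0 / theta1 - 1.0 / theta2))

T_grid = np.linspace(0.1, 20.0, 200)
ratios = likelihood_ratio(T_grid, n=5, theta1=1.0, theta2=2.0)
assert np.all(np.diff(ratios) > 0)  # ratio is (here strictly) increasing in T
```

In this family the ratio happens to be *strictly* increasing in $T$; the question below is precisely about what happens when it is merely nondecreasing.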

In all the proofs I have read (for instance, in page 22 here or in "Statistical Inference" by Casella-Berger, 2nd edition, page 391), it is (more or less) said: "we can find $k_1$ such that, if $T(\vec{x})\geq k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}\geq k_1$, and if $T(\vec{x})<k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}< k_1$". I would understand that statement if the likelihood ratio were strictly increasing, but what about the case in which it is constant? For example, if $X\sim U(0,\theta)$, the likelihood ratio is monotone in $T(\vec{x})=\max_{1\leq i\leq n}x_i$ ($n$ is the length of the sample $\vec{x}$), but not strictly increasing.
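The flat (non-strict) behaviour in the uniform case can be seen numerically. A sketch with assumed values $n=5$, $\theta_1=1<\theta_2=2$:

```python
import numpy as np

# Sketch with assumed values: for U(0, theta) the likelihood is
# theta^(-n) * 1{max x_i <= theta}, so wherever both densities are positive
# the ratio equals the constant (theta1/theta2)^n -- nondecreasing in
# T(x) = max x_i, but not strictly increasing.
rng = np.random.default_rng(0)
n, theta1, theta2 = 5, 1.0, 2.0

def likelihood(x, theta):
    return theta**(-len(x)) if x.max() <= theta else 0.0

x = rng.uniform(0, theta1, size=(1000, n))        # samples with max x_i <= theta1
ratios = [likelihood(xi, theta2) / likelihood(xi, theta1) for xi in x]
assert np.allclose(ratios, (theta1 / theta2)**n)  # constant: (1/2)^5 = 0.03125
```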

EDIT: My questions are:

  1. Is the assertion between quotation marks true for every density or mass function with (not strictly) monotone likelihood ratio on $T$?

  2. And what about in the case of the uniform distribution?

The second question has an answer below. I would like an answer for the first question, with claims based on real-analysis.

user39756

2 Answers


Karlin-Rubin assumes a monotone likelihood ratio, and one of the "regularity" conditions for it is that the support does not depend on the parameter. Let's examine the uniform case with $X\sim U[0,\theta]$. If $H_0: \theta \le \theta_0$ and $H_A: \theta > \theta_0$, then $$ \frac{f_1 }{ f_0} = \frac{\theta_1^{-n}\, I\{0\le X_{(n)} \le \theta_1\}}{\theta_0^{-n}\, I\{0\le X_{(n)} \le \theta_0\}}. $$ So, for $X_{(n)} \le \theta_0 <\theta_1,\,\, \forall \theta_1 \in \Theta_A$, you have that $$ \frac{f_1}{f_0} = (\theta_0/\theta_1)^{n}. $$ But if $\theta_0 < X_{(n)} $, then the ratio $f_1/ f_0$ is undefined, since $f_0 = 0$ while $f_1 = 1/\theta_1^n > 0$.

As such, for a good test it is useful to base it on the sufficient statistic $X_{(n)}$ and build it more intuitively, i.e., $$ E_{H_0}\, I\{c \le X_{(n)} \le \theta_0\} = \alpha, $$ thus $$ E_{H_0}\, I\{ (c/\theta_0)^n \le (X_{(n)}/\theta_0)^{n} \le 1\} = 1 - (c/\theta_0)^n = \alpha, $$ hence $$ c = \theta_0 (1- \alpha)^{1/n}. $$ As such, the UMPT will be $$ \Psi = I\{ \theta_0(1-\alpha)^{1/n} \le X_{(n)} \}, $$ whose Type I error equals $\alpha$ at $\theta = \theta_0$ and is at most $\alpha$ for every $\theta \le \theta_0$ (it is $0$ once $\theta \le \theta_0(1-\alpha)^{1/n}$).
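A quick Monte Carlo sanity check of the critical value $c = \theta_0(1-\alpha)^{1/n}$ derived above, with assumed values $n=10$, $\theta_0=3$, $\alpha=0.05$:

```python
import numpy as np

# Monte Carlo check (assumed values n = 10, theta0 = 3, alpha = 0.05):
# the test rejecting when X_(n) >= theta0 * (1 - alpha)^(1/n) should reject
# with probability alpha when theta = theta0.
rng = np.random.default_rng(1)
n, theta0, alpha, reps = 10, 3.0, 0.05, 200_000

c = theta0 * (1 - alpha)**(1.0 / n)                 # critical value from the derivation
samples = rng.uniform(0, theta0, size=(reps, n))    # data generated under theta = theta0
rejection_rate = (samples.max(axis=1) >= c).mean()
assert abs(rejection_rate - alpha) < 0.005          # close to the nominal alpha
```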

V. Vancak
  • Thank you for your answer! I have two doubts: when you write the critical region as $\{c\leq x_{(n)}\leq\theta_0\}$, why do you assume $x_{(n)}\leq\theta_0$? And how do you know that this test is the UMPT? – user39756 Jan 09 '17 at 20:11
  • And concerning this assertion: "we can find $k_1$ such that, if $T(\vec{x})\geq k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}\geq k_1$, and if $T(\vec{x})<k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}< k_1$", what do you think? – user39756 Jan 10 '17 at 09:06
  • I'm not. I'm dividing it into two possible cases: in the first one $x_{(n)} \le \theta_0$ and in the second one $x_{(n)} > \theta_0$. Unlike in the regular cases, $P_{\theta_0}(X_{(n)} > \theta_0)=0$; however, if $H_1$ is true, then $P(X_{(n)} > \theta_0) = 1 - (\theta_0/\theta)^n >0$.
  • UMPT - will check later in Casella's "Statistical Inference"; don't remember the exact argument.
  • This ratio is not well defined. And for the case it is ($x_{(n)} \le \theta_0$), the dependence on the sufficient statistic is only through the indicator function. – V. Vancak Jan 10 '17 at 14:57
  • For the last comment: "we can find $k_1$ such that, if $T(\vec{x})\geq k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}\geq k_1$, and if $T(\vec{x})<k$, then $\frac{f(\vec{x}|\theta_2)}{f(\vec{x}|\theta_1)}< k_1$" — I am interested in whether it is true or not in general (for any density or mass function, not just the uniform), because that assertion appears in all the proofs of Karlin-Rubin I have read. – user39756 Jan 11 '17 at 09:09