
Convergence in probability of the $p^{\text{th}}$ quantile estimator for an iid sample $X_1, \ldots, X_n$ from the Exponential distribution with density $f(x, \lambda) = \lambda e^{-\lambda x}$

The $p^{\text{th}}$ quantile is given by $\theta = F^{-1}(p)$, where $p\in(0,1)$ and $F$ is the CDF.

I found that for the exponential distribution, $\theta = -\frac{\ln(1-p)}{\lambda}$, and that the MLE is $\hat{\theta} = -\ln(1-p)\,\bar{x}$.
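(For reference, here is a short sketch of where that MLE comes from, assuming the usual invariance property of maximum likelihood estimators:

$$\ell(\lambda) = n\ln\lambda - \lambda\sum_{i=1}^{n} x_i, \qquad \frac{d\ell}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i = 0 \;\Rightarrow\; \hat{\lambda} = \frac{1}{\bar{x}},$$

$$\hat{\theta} = -\frac{\ln(1-p)}{\hat{\lambda}} = -\ln(1-p)\,\bar{x}.)$$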

How can I use this to show that $\hat{\theta} \to^{P} \theta$?

Is the following method correct?

$$\bar{x} \to^P E(X_1) = \frac{1}{\lambda} \;\;\;\;\text{(by the Weak Law of Large Numbers)}$$

$$\Rightarrow \bar{x} \to^{d} \frac{1}{\lambda} \;\;\;\;\text{(Convergence in probability implies convergence in distribution)}$$

$$\Rightarrow -\ln(1-p)\bar{x} \to^{d} -\frac{\ln(1-p)}{\lambda} \;\;\;\;\text{(Using Slutsky's theorem)}$$

$$\Rightarrow -\ln(1-p)\bar{x} \to^{P} -\frac{\ln(1-p)}{\lambda} \;\;\;\;\text{(Convergence in distribution to a constant implies convergence in probability)}$$

I'm not sure about the last step. Is it correct that $-\frac{\ln(1-p)}{\lambda}$ is a constant?
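As a quick numerical sanity check, here is a minimal simulation sketch (using numpy, with arbitrary illustrative values $\lambda = 2$ and $p = 0.5$ that are not part of the question) showing $\hat{\theta}$ concentrating around $\theta$ as $n$ grows:

```python
import numpy as np

rng = np.random.default_rng(0)

lam, p = 2.0, 0.5                # arbitrary illustrative values
theta = -np.log(1 - p) / lam     # true p-th quantile of Exponential(lambda)

for n in [10, 100, 1_000, 10_000, 100_000]:
    x = rng.exponential(scale=1 / lam, size=n)  # iid Exponential(lambda) sample
    theta_hat = -np.log(1 - p) * x.mean()       # quantile MLE: -ln(1-p) * sample mean
    print(f"n={n:>7}  |theta_hat - theta| = {abs(theta_hat - theta):.5f}")
```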

chesslad

1 Answer


It is correct that $-\frac{\ln(1-p)}{\lambda}$ is a constant. However, I do not see why you go back to convergence in distribution. All you need is that if $Y_n\to Y$ in probability and $c$ is a constant, then $cY_n\to cY$ in probability.
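Applied to this problem with $Y_n = \bar{x}$ and $c = -\ln(1-p)$, this gives the result in one step:

$$\bar{x} \to^{P} \frac{1}{\lambda} \;\Rightarrow\; \hat{\theta} = -\ln(1-p)\,\bar{x} \to^{P} -\frac{\ln(1-p)}{\lambda} = \theta.$$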

Davide Giraudo