
I am currently trying to prove that $\hat{\beta}$ is not an unbiased estimator of $\beta$. After proving this, I need to construct an unbiased estimator for $\beta$. I know that $$\hat{\beta} = \frac{n}{\sum_{i=1}^n\ln(x_i+1)}, \hspace{1.4cm} \beta > 1.$$ I was also given the r.v. $W = \ln(X+1)$. Using $$f_X(x) = \frac{\beta}{(1+x)^{\beta+1}},$$ $$f_W(w) = f_X(v(w)) \cdot |v'(w)|,$$ $$W = \ln(1+X) \;\rightarrow\; x = e^w - 1 = v(w),$$ $$v'(w) = e^w,$$ I obtained $$f_W(w) = \frac{\beta}{\left(1+\left(e^w-1\right)\right)^{\beta+1}}\cdot\left|e^w\right| = \frac{\beta}{e^{\beta w}} = \beta e^{-\beta w},$$ i.e. $W \sim \text{Exponential}(\beta)$.
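As a sanity check on this density, here is a short simulation sketch (assuming NumPy; $\beta = 2.5$ is an arbitrary test value, not part of the problem) that samples $X$ by inverse-transform sampling and compares the moments of $W$ with those of an $\text{Exponential}(\beta)$ distribution:

```python
import numpy as np

rng = np.random.default_rng(0)
beta = 2.5            # arbitrary "true" parameter for the check
n_draws = 1_000_000

# F_X(x) = 1 - (1+x)^(-beta) for x > 0, so X = (1-U)^(-1/beta) - 1, U ~ Uniform(0,1)
u = rng.uniform(size=n_draws)
x = (1.0 - u) ** (-1.0 / beta) - 1.0

w = np.log(1.0 + x)   # W = ln(1 + X)

# If W ~ Exponential(rate = beta): E[W] = 1/beta and Var(W) = 1/beta^2
print(w.mean(), 1.0 / beta)      # both ~0.400
print(w.var(),  1.0 / beta**2)   # both ~0.160
```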

I know I can use Jensen's inequality here to show that it is not an unbiased estimator, by writing $$\hat{\beta} = \frac{n}{\sum_{i=1}^n\ln(x_i+1)} = \frac{n}{\sum_{i=1}^n W_i} = \frac{1}{\bar{W}}.$$ However, I do not know how to proceed from here, either in applying Jensen's inequality or in constructing an unbiased estimator.
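To at least see the bias numerically before proving it, here is a quick Monte Carlo sketch (again assuming NumPy, with arbitrary $\beta = 2.5$ and $n = 10$) that averages $\hat{\beta}$ over many simulated samples:

```python
import numpy as np

rng = np.random.default_rng(1)
beta, n, reps = 2.5, 10, 200_000   # arbitrary test values; the bias shrinks as n grows

# Each row is one sample of size n, drawn via the inverse CDF X = (1-U)^(-1/beta) - 1
u = rng.uniform(size=(reps, n))
x = (1.0 - u) ** (-1.0 / beta) - 1.0

beta_hat = n / np.log(1.0 + x).sum(axis=1)

# Jensen's inequality suggests E[beta_hat] > beta: the average lands well above 2.5
print(beta_hat.mean())
```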

air bmx
  • Why don't you (i) construct the likelihood function using $f_W$ and $f_X$ and find an unbiased estimator, which you have to do anyway, and (ii) show that the estimators have different plims, so that your first estimator cannot be unbiased? –  Apr 02 '20 at 04:43
  • I do not know how to do what you suggested, and I also do not know what "plims" means. – air bmx Apr 02 '20 at 04:53
  • Very similar question: https://math.stackexchange.com/q/3543552?lq=1, from which you would find that an unbiased estimator of $\beta$ is $(n-1)/\sum \ln(X_i+1)$. – StubbornAtom Apr 02 '20 at 08:12

1 Answer


Hint:

$W=\sum_{i=1}^{n} \ln (1+X_i)\sim \text{Gamma}(n,\beta)$ (shape $n$, rate $\beta$), since each $\ln(1+X_i)\sim\text{Exponential}(\beta)$.

$f_W(w)=\frac{\beta^n}{\Gamma(n)}w^{n-1} e^{-\beta w}, \quad w>0$

$E(W^r)=\frac{\Gamma(n+r)}{\Gamma(n)}\frac{1}{\beta^r}$ for $r>-n$

Set $r=-1$
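Spelling the hint out: for $n \ge 2$, $$E\!\left(\frac{1}{W}\right)=\frac{\Gamma(n-1)}{\Gamma(n)}\,\beta=\frac{\beta}{n-1},$$ so $$E(\hat{\beta})=E\!\left(\frac{n}{W}\right)=\frac{n}{n-1}\,\beta \neq \beta,$$ which confirms the bias, and rescaling gives the unbiased estimator $\frac{n-1}{W}=\frac{n-1}{\sum_{i=1}^n \ln(X_i+1)}$, matching the comment above.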

Masoud