I am currently trying to prove that $\hat{\beta}$ is not an unbiased estimator of $\beta$. After proving this I need to construct an unbiased estimator for $\beta$. I know that $$\hat{\beta} = \frac{n}{\sum_{i=1}^n\ln(x_i+1)}, \hspace{1.4cm} \beta > 1.$$ I was also given the random variable $W = \ln(X+1)$. Using $$f_X(x) = \frac{\beta}{(1+x)^{\beta+1}},$$ $$f_W(w) = f_X(v(w)) \cdot |v'(w)|,$$ $$W = \ln(1+X) \rightarrow x = e^w - 1 = v(w),$$ $$v'(w) = e^w,$$ I obtained $$f_W(w) = \frac{\beta}{\left(1+\left(e^w-1\right)\right)^{\beta +1}} \cdot \left|e^w\right| = \frac{\beta}{e^{\beta w}} = \beta e^{-\beta w},$$ i.e. $W \sim \text{Exponential}(\beta)$.
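The transformation result can be sanity-checked numerically: sampling $X$ by inverse-CDF (using $F_X(x) = 1 - (1+x)^{-\beta}$, which follows from the given density) and transforming to $W = \ln(1+X)$ should give a sample mean near $1/\beta$, the mean of an $\text{Exponential}(\beta)$ variable. A minimal sketch, with $\beta = 3$ chosen only for illustration:

```python
import math
import random

random.seed(0)

beta = 3.0     # hypothetical true parameter (any beta > 1 works for this check)
n = 200_000

# Inverse-CDF sampling from f_X(x) = beta / (1 + x)^(beta + 1), x > 0:
# F_X(x) = 1 - (1 + x)^(-beta), so X = (1 - U)^(-1/beta) - 1 for U ~ Uniform(0, 1).
xs = [(1.0 - random.random()) ** (-1.0 / beta) - 1.0 for _ in range(n)]

# Transform to W = ln(1 + X); the derivation says W ~ Exponential(beta),
# whose mean is 1/beta.
ws = [math.log(1.0 + x) for x in xs]
mean_w = sum(ws) / n

print(mean_w)  # should be close to 1/beta
```

If the derived density is right, `mean_w` lands within a few thousandths of $1/\beta \approx 0.333$.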
I know I can use Jensen's inequality here to show that $\hat{\beta}$ is not an unbiased estimator, since $$\hat{\beta} = \frac{n}{\sum_{i=1}^n\ln(x_i+1)} = \frac{n}{\sum_{i=1}^n W_i} = \frac{1}{\bar{W}}.$$ However, I do not know how to proceed from here, either in applying Jensen's inequality or in constructing an unbiased estimator.
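A quick Monte Carlo sketch (with $\beta = 2$ and $n = 5$ chosen only for illustration) does suggest that $\hat{\beta} = 1/\bar{W}$ overshoots $\beta$ on average, which is the direction Jensen's inequality predicts, since $t \mapsto 1/t$ is strictly convex on $(0, \infty)$:

```python
import math
import random

random.seed(1)

beta = 2.0       # hypothetical true parameter (my choice for the check)
n = 5            # small sample size, so the bias is easy to see
reps = 200_000

total = 0.0
for _ in range(reps):
    # Sample W_1, ..., W_n ~ Exponential(beta) by inverse CDF,
    # using the result W = ln(1 + X) ~ Exponential(beta) from above.
    ws = [-math.log(1.0 - random.random()) / beta for _ in range(n)]
    total += n / sum(ws)          # hat_beta = 1 / W-bar for this sample

mean_hat = total / reps
print(mean_hat)  # noticeably larger than beta = 2.0, so hat_beta is biased upward
```

The gap between `mean_hat` and `beta` is far too large to be simulation noise, which is consistent with $E[\hat{\beta}] > \beta$.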