Let $\underline{Y} = (Y_1, \dots, Y_n)$ be an i.i.d. random sample from a Weibull distribution with probability density function $f(y; \lambda) = \frac{k}{\lambda}\left(\frac{y}{\lambda}\right)^{k-1}\exp\left\{-\left(\frac{y}{\lambda}\right)^{k}\right\}$ for $y > 0$, where $k > 0$ is a known shape parameter and $\lambda$ is an unknown scale parameter taking values in $\mathbb{R}^{+}$.
Consider the parametrisation $\theta = \lambda^k$.
Derive the likelihood function $L(\theta; \underline{Y})$ and hence the maximum likelihood estimator $\hat{\theta}(\underline{Y})$ of $\theta$. Show that the MLE is unbiased.
What I know so far
Take the product of the pdfs over the $n$ observations to obtain the likelihood function, take the log, differentiate, set the derivative equal to $0$, and solve for the MLE. The estimator is unbiased if its expectation equals $\theta$ (i.e. the bias is $0$). I know the method but I am unsure how to actually put it into practice. Any help would be greatly appreciated.
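For concreteness, here is a sketch of the first steps described above, assuming the substitution $\theta = \lambda^k$ is made directly in the density before forming the likelihood:

```latex
% Under \theta = \lambda^k the density can be rewritten as
% f(y; \theta) = (k/\theta)\, y^{k-1} \exp(-y^k/\theta),
% so the likelihood of the i.i.d. sample is the product
L(\theta; \underline{y})
  = \prod_{i=1}^{n} \frac{k}{\theta}\, y_i^{k-1}
    \exp\!\left(-\frac{y_i^k}{\theta}\right)
  = \frac{k^n}{\theta^n}
    \left(\prod_{i=1}^{n} y_i\right)^{\!k-1}
    \exp\!\left(-\frac{1}{\theta}\sum_{i=1}^{n} y_i^k\right),
% and the log-likelihood is
\ell(\theta)
  = n\log k - n\log\theta
    + (k-1)\sum_{i=1}^{n}\log y_i
    - \frac{1}{\theta}\sum_{i=1}^{n} y_i^k .
```

Setting $\ell'(\theta) = -n/\theta + \theta^{-2}\sum_i y_i^k = 0$ then gives a candidate $\hat{\theta} = \frac{1}{n}\sum_{i=1}^{n} Y_i^k$; for the unbiasedness part, it may help to note that $Y_i^k$ has an exponential distribution with mean $\theta$.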