Let $n \in \mathbb{N}, \, X_i: (0,\infty)^n \rightarrow (0,\infty)$ so that $ X_i(x_1,...,x_n) = x_i \, \forall i \in \{ 1,...,n \}.$ Let $ ((0,\infty)^n, B((0,\infty)^n), (P_\theta)_{\theta \in (0,\infty)}) $ be a statistical model such that $ (X_1,...,X_n) $ is under $P_\theta$ an i.i.d. family of $ \exp(\frac{1}{\theta}) $-distributed random variables.
Note that in this case $\forall i \in\{ 1,...,n \}$ and $ \theta \in (0,\infty) $ it holds that $ Var(X_i) = \theta^2 $.
a) Compute $ \forall x\in (0,\infty)^n $ the MLE $ \hat{\theta}_{ML} \in (0,\infty)$ for $ \tau(\theta) = \theta. $
b) Is $ \hat{\theta}_{ML} $ a variance-minimizing unbiased estimator for $ \tau(\theta) = \theta $?
To a):
The density of an $ \exp(\theta^{-1}) $-distributed random variable is $ f(x) = \theta^{-1} e^{-\theta^{-1}x} $ for $ x > 0 $.
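(Just as a side check of the parameterization, not part of the exercise: with this density the mean is
$$ E[X_i] = \int_0^\infty x \, \theta^{-1} e^{-x/\theta} \, dx = \theta, $$
which fits the note that $ Var(X_i) = \theta^2 $ for the rate-$\frac{1}{\theta}$ exponential.)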
I get the likelihood function:
$L(x,\theta) = \prod_{i=1}^n f(x_i) = \theta^{-n} \prod_{i=1}^n e^{-\theta^{-1}x_i} = \theta^{-n} e^{-\theta^{-1}\sum_{i=1}^n x_i} $
Now I take the log-likelihood function and get:
$ \log(L(x,\theta)) = -n\log(\theta) - \frac{1}{\theta} \sum_{i=1}^n x_i $
I set the derivative of the log-likelihood function with respect to $\theta$ equal to $0$ to find the maximum.
$\Rightarrow -\frac{n}{\theta}+\frac{1}{\theta^2} \sum_{i=1}^{n} x_i =0 $
$ \Leftrightarrow \frac{n}{\theta} = \frac{1}{\theta^2} \sum_{i=1}^n x_i $
$ \Leftrightarrow \theta = \frac{\sum_{i=1}^n x_i}{n} $
So I get that my estimator is the sample mean: $ \hat{\theta}_{ML} = \frac{\sum_{i=1}^n x_i}{n} = \bar{x}_n $.
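To make sure this critical point is really a maximum (my own extra step, not required by the exercise statement), I looked at the second derivative:
$$ \frac{\partial^2}{\partial\theta^2} \log(L(x,\theta)) = \frac{n}{\theta^2} - \frac{2}{\theta^3} \sum_{i=1}^n x_i, $$
which at $ \theta = \bar{x}_n $ equals $ \frac{n}{\bar{x}_n^2} - \frac{2n}{\bar{x}_n^2} = -\frac{n}{\bar{x}_n^2} < 0 $, so $ \bar{x}_n $ is indeed the maximizer.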
I don't know why we are given the hint that $ \theta^2 = Var(X_i) $. Does anyone see my mistake?
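My only guess (and I am not sure about this) is that the hint is meant for part b), where together with independence it gives the variance of the estimator directly:
$$ Var(\hat{\theta}_{ML}) = Var\Big(\frac{1}{n}\sum_{i=1}^n X_i\Big) = \frac{1}{n^2}\sum_{i=1}^n Var(X_i) = \frac{\theta^2}{n}. $$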
To b):
I know how to check that an estimator is unbiased. How do I check whether an estimator is a variance-minimizing unbiased estimator? Do I have to use the Cramér-Rao inequality?
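In case it helps to say what I would try (I am not sure this is the intended approach): compute the Fisher information of the sample and compare the resulting bound with $ Var(\hat{\theta}_{ML}) = \frac{\theta^2}{n} $ from above. Under the usual regularity conditions,
$$ I_n(\theta) = -E_\theta\Big[\frac{\partial^2}{\partial\theta^2} \log(L(X,\theta))\Big] = -\frac{n}{\theta^2} + \frac{2}{\theta^3} \cdot n\theta = \frac{n}{\theta^2}, $$
so the Cramér-Rao bound for unbiased estimators of $\theta$ would be $ \frac{1}{I_n(\theta)} = \frac{\theta^2}{n} $. Is comparing this with the variance of $ \hat{\theta}_{ML} $ enough, or do I also have to argue via sufficiency and completeness (Lehmann-Scheffé)?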