Let $X_1, \dots, X_n$ be a random sample from the uniform distribution $U[0,\theta]$. Find the MLE of $\theta$.
I wrote down the likelihood function; since the distribution is uniform, the density is $f(x) = \frac{1}{b-a} = \frac{1}{\theta}$, which gives $$l(\theta) = \prod_{i=1}^{n} \frac{1}{\theta} = \theta^{-n}$$
which has the equivalent log-likelihood $$\log(l) = \sum_{i=1}^{n} \log\left(\theta^{-1}\right) = -n\log(\theta)$$
Now, to find the MLE, I set the first derivative equal to zero: $$\frac{\partial}{\partial\theta}\left(-n\log\theta\right) = 0 \quad\Longrightarrow\quad -\frac{n}{\theta} = 0$$
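(Just to make my confusion concrete, here is a quick numerical sketch in Python, with an arbitrary sample size of my choosing: the derivative $-n/\theta$ is strictly negative for every $\theta > 0$, so it never equals zero.)

```python
import numpy as np

n = 10  # arbitrary sample size, just for illustration

# Derivative of the log-likelihood, -n/theta, evaluated on a grid of theta > 0
theta_grid = np.linspace(0.1, 10.0, 100)
score = -n / theta_grid

# Even the largest value is still negative, so there is no stationary point
print(score.max())  # -1.0 (attained at theta = 10)
```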
And that's where things stopped making sense for me: I can't solve for $\theta$ the way I normally can for other distributions. After checking a number of threads in this community, it seems that writing $\hat\theta = X_{(n)}$, i.e. the largest observation in the sample, is accepted as the answer.
However, I can't stop thinking about the fact that no $X_{(n)}$ on the planet would make that first derivative equal zero, so how can it be a valid estimator?
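For reference, this is the quick simulation I ran while reading those threads (a Python sketch; the true $\theta$ and the sample size are arbitrary values I picked). If I only evaluate the likelihood at values of $\theta$ that are at least as large as every observation (otherwise the density of some $x_i$ is zero), the log-likelihood is indeed highest right at $X_{(n)}$, yet the derivative there is still $-n/\theta \neq 0$, which is exactly what I don't understand.

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, n = 5.0, 10          # arbitrary values for the simulation
x = rng.uniform(0.0, theta_true, size=n)
x_max = x.max()                  # the largest observation X_(n)

# Log-likelihood of U[0, theta]: -n*log(theta) if every x_i <= theta,
# otherwise some observation has density zero (log-likelihood -inf)
def log_lik(theta):
    return -n * np.log(theta) if theta >= x_max else -np.inf

grid = np.linspace(0.01, 10.0, 2000)
vals = np.array([log_lik(t) for t in grid])

# The maximiser on the grid sits essentially at X_(n)
print(x_max, grid[vals.argmax()])
```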