I think the question is clear enough. We say unbiased estimator, efficient estimator, and consistent estimator; why not sufficient estimator? All estimators are, by definition, statistics, although not all statistics are estimators. If this is just a matter of tradition or convention then all is good. I just want to be sure there isn't something more subtle going on that I am missing.
- For example if $X$ is a sufficient statistic for parameter $\theta$, then $2X + 1000000$ is also a sufficient statistic for the same $\theta$, since it contains exactly the same information. But you're not likely to use both of them as estimators of $\theta$: in nearly all cases, at least one of these is going to be a very bad estimator. – Robert Israel Jan 27 '25 at 06:52
- Makes perfect sense, thank you. – TonyK Jan 27 '25 at 23:59
- Suppose you had a normal distribution $N(\theta, \theta^2)$ and you were trying to estimate $\theta$. Any estimator of $\theta$ would have to be one-dimensional like $\theta$, but any sufficient statistic here is at least two-dimensional: $\left(\sum X_i, \sum X_i^2\right)$ is an example of a minimal sufficient statistic in this case. – Henry Apr 04 '25 at 16:19
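Robert Israel's point above can be checked with a quick simulation. This is only an illustrative sketch (the normal model, sample size, and true value $\theta = 5$ are arbitrary choices, not from the thread): $\bar X$ and $2\bar X + 1000000$ carry exactly the same information, but only one of them is a sensible estimator of $\theta$.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 5.0                                    # true parameter (arbitrary for illustration)
x = rng.normal(theta, 1.0, size=(10_000, 50))  # 10,000 samples of size 50

t = x.mean(axis=1)       # sample mean: sufficient for theta and a sensible estimator
s = 2 * t + 1_000_000    # invertible function of t, so equally sufficient...

print("average of T:", t.mean())   # close to theta = 5
print("average of S:", s.mean())   # close to 1,000,010: useless as an estimator of theta
print("theta recovered from S:", ((s - 1_000_000) / 2).mean())  # same information as T
```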
1 Answer
"Estimator" implies that the statistic $T(X)$ and the parameter $\theta$ live in the same space (in order for it to make sense to talk about estimation error, unbiasedness, etc.).
This need not be the case for a sufficient statistic. For example, if $X_1,\dots,X_n\sim p_\theta$ i.i.d., then the vector of order statistics $T(X)=(X_{(1)}, \dots, X_{(n)})$ is sufficient for $\theta$ (even though $\theta$ might only be a real number).
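Here is a minimal sketch of that dimensional mismatch; the exponential model and sample size are arbitrary choices for illustration, not part of the answer. The order statistics form a sufficient statistic living in $\mathbb{R}^{10}$, while an estimator of the real parameter $\theta$ (here, the sample mean) must be a single number.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 2.0
x = rng.exponential(theta, size=10)  # one sample X_1, ..., X_10 from p_theta

t = np.sort(x)        # order statistics (X_(1), ..., X_(10)): sufficient, but lives in R^10
theta_hat = x.mean()  # an estimator of theta must live in the same space as theta

print("sufficient statistic T(X):", t)   # a 10-dimensional vector
print("estimator of theta:", theta_hat)  # a single real number
```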
DDD
- Also perfectly sensible. I'm glad I asked the question even if I may have posted it in the wrong Exchange. – TonyK Jan 28 '25 at 00:03