Consider estimation of g(\theta) based on data X_1, X_2, \ldots, X_n i.i.d. from some member of a family of densities p_\theta, \theta \in \Omega, where \Omega is the parameter space. An estimator \delta(X_1, X_2, \ldots, X_n) of g(\theta) is unbiased if \operatorname{E}_\theta(\delta(X_1, X_2, \ldots, X_n)) = g(\theta) for all \theta \in \Omega. Furthermore, an unbiased estimator is a uniformly minimum-variance unbiased estimator (UMVUE, sometimes simply MVUE) if, \forall \theta \in \Omega, \operatorname{var}_\theta(\delta(X_1, X_2, \ldots, X_n)) \leq \operatorname{var}_\theta(\tilde{\delta}(X_1, X_2, \ldots, X_n)) for any other unbiased estimator \tilde{\delta}. If a UMVUE of g(\theta) exists, one can prove it is essentially unique: any two UMVUEs agree with probability one.
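For example, if X_1, X_2, \ldots, X_n are i.i.d. Bernoulli(p), both the sample mean \bar{X} = \tfrac{1}{n}\sum_{i=1}^n X_i and the single observation X_1 are unbiased for g(p) = p, yet \operatorname{var}(\bar{X}) = p(1-p)/n \leq p(1-p) = \operatorname{var}(X_1) for every p, so unbiased estimators can differ substantially in variance; the UMVUE is the one whose variance is smallest uniformly over the parameter space.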
Using the Rao–Blackwell theorem one can also show that finding the UMVUE reduces to finding a complete sufficient statistic for the family p_\theta, \theta \in \Omega and conditioning any unbiased estimator on it. Further, by the Lehmann–Scheffé theorem, an unbiased estimator that is a function of a complete sufficient statistic is the UMVUE. Put formally, suppose \delta(X_1, X_2, \ldots, X_n) is unbiased for g(\theta) and T is a complete sufficient statistic for the family of densities. Then \eta(X_1, X_2, \ldots, X_n) = \operatorname{E}(\delta(X_1, X_2, \ldots, X_n) \mid T) is the UMVUE of g(\theta).
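To illustrate the construction, suppose X_1, X_2, \ldots, X_n are i.i.d. Poisson(\theta) and the estimand is g(\theta) = e^{-\theta} = P_\theta(X_1 = 0). The crude estimator \delta = \mathbf{1}\{X_1 = 0\} is unbiased, and T = \sum_{i=1}^n X_i is complete and sufficient for this family. Since the conditional distribution of X_1 given T = t is binomial with t trials and success probability 1/n, conditioning gives \eta = \operatorname{E}(\delta \mid T) = \left(\tfrac{n-1}{n}\right)^{T}, which by the Lehmann–Scheffé theorem is the UMVUE of e^{-\theta}.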
A Bayesian analog is a Bayes estimator, in particular one with minimum mean square error (MMSE).

==Estimator selection==