This document provides an overview of statistical inference concepts including:
1. Best unbiased estimators (uniformly minimum variance unbiased estimators), which attain the smallest variance among all unbiased estimators of a given parameter. The best unbiased estimator, if it exists, must be a function of a sufficient statistic.
2. Sufficiency and the Rao-Blackwell theorem, which states that conditioning an unbiased estimator on a sufficient statistic produces an estimator that is uniformly at least as good: its variance is never larger than that of the original.
3. The Cramér-Rao lower bound, which gives a lower bound on the variance of any unbiased estimator, provided certain regularity conditions hold. Examples illustrate key concepts, including cases where the regularity conditions fail and the bound does not apply.
4. Worked examples that find minimum variance unbiased estimators, maximum likelihood estimators, and confidence intervals for various distributions.
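The Cramér-Rao bound in item 3 can be checked numerically. The sketch below is a minimal Monte Carlo illustration, not taken from the document: it assumes a Normal(mu, sigma^2) model with known sigma, where the sample mean is unbiased for mu and attains the bound sigma^2 / n.

```python
import random

random.seed(0)

# Assumed setup for illustration: X_1..X_n ~ Normal(mu, sigma^2), sigma known.
mu, sigma, n, reps = 2.0, 3.0, 50, 20000
crlb = sigma**2 / n  # Cramér-Rao lower bound for unbiased estimators of mu

# Repeatedly draw samples and record the sample mean (unbiased, attains the bound).
estimates = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    estimates.append(sum(sample) / n)

# Empirical variance of the estimator across replications.
mean_hat = sum(estimates) / reps
var_hat = sum((e - mean_hat) ** 2 for e in estimates) / reps

print(f"CRLB          : {crlb:.4f}")
print(f"empirical var : {var_hat:.4f}")  # should sit close to the bound
```

For an estimator that does not attain the bound, or for a model like the uniform distribution where the regularity conditions fail, the empirical variance would not match sigma^2 / n in this way.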