In statistics, point estimation involves the use of sample data to calculate a single value (known as a point estimate since it identifies a point in some parameter space) which is to serve as a "best guess" or "best estimate" of an unknown population parameter (for example, the population mean). More formally, it is the application of a point estimator to the data to obtain a point estimate.
Point estimation can be contrasted with interval estimation: such interval estimates are typically either confidence intervals, in the case of frequentist inference, or credible intervals, in the case of Bayesian inference.
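The contrast can be made concrete with a minimal sketch, assuming a hypothetical sample of 100 draws from a population with unknown mean: the sample mean is a point estimate (a single value), while a confidence interval is a range.

```python
import math
import random

random.seed(0)
# Hypothetical sample drawn from a population with unknown mean (here, true mean 10).
sample = [random.gauss(10.0, 2.0) for _ in range(100)]

n = len(sample)
mean = sum(sample) / n                               # point estimate of the population mean
var = sum((x - mean) ** 2 for x in sample) / (n - 1)
se = math.sqrt(var / n)                              # standard error of the mean

# Contrast: a 95% confidence interval (normal approximation) is a range, not a point.
ci = (mean - 1.96 * se, mean + 1.96 * se)
print(f"point estimate: {mean:.3f}, 95% CI: ({ci[0]:.3f}, {ci[1]:.3f})")
```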
Point estimators
There are a variety of point estimators, each with different properties.
 minimum-variance mean-unbiased estimator (MVUE), which minimizes the risk (expected loss) of the squared-error loss function
 best linear unbiased estimator (BLUE)
 minimum mean squared error (MMSE) estimator
 median-unbiased estimator, which minimizes the risk of the absolute-error loss function
 maximum likelihood estimator (MLE)
 method of moments and generalized method of moments
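That different point estimators of the same parameter can disagree is easy to see for the variance of a normal sample: the maximum likelihood estimator divides by n, while the unbiased estimator divides by n − 1. A minimal sketch, using hypothetical simulated data:

```python
import random

random.seed(1)
data = [random.gauss(0.0, 3.0) for _ in range(50)]
n = len(data)
mean = sum(data) / n

# Two point estimators of the same parameter (the population variance) differ:
# the MLE divides by n, while the unbiased estimator divides by n - 1.
ss = sum((x - mean) ** 2 for x in data)
var_mle = ss / n
var_unbiased = ss / (n - 1)

print(var_mle, var_unbiased)  # the MLE is always the smaller of the two
```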
Bayesian point estimation
Bayesian inference is typically based on the posterior distribution. Many Bayesian point estimators are the posterior distribution's statistics of central tendency, e.g., its mean, median, or mode:
 Posterior mean, which minimizes the (posterior) risk (expected loss) for a squared-error loss function; in Bayesian estimation, the risk is defined in terms of the posterior distribution, as observed by Gauss.^{[1]}
 Posterior median, which minimizes the posterior risk for the absolute-value loss function, as observed by Laplace.^{[1]}^{[2]}
 Maximum a posteriori (MAP) estimator, which finds a maximum of the posterior distribution; for a uniform prior probability, the MAP estimator coincides with the maximum-likelihood estimator.
The MAP estimator has good asymptotic properties, even for many difficult problems on which the maximum-likelihood estimator has difficulties. For regular problems, where the maximum-likelihood estimator is consistent, the maximum-likelihood estimator ultimately agrees with the MAP estimator.^{[3]}^{[4]}^{[5]} Bayesian estimators are admissible, by Wald's theorem.^{[4]}^{[6]}
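These Bayesian point estimates can be sketched for a toy conjugate model (an assumed example, not from the source): x successes in n Bernoulli trials with a Beta(a, b) prior, where the posterior mean and mode have closed forms.

```python
def bayes_point_estimates(x, n, a=1.0, b=1.0):
    """Posterior mean and mode for a Beta(a, b) prior on a Bernoulli success rate."""
    # By conjugacy, the posterior is Beta(a + x, b + n - x).
    a_post, b_post = a + x, b + n - x
    post_mean = a_post / (a_post + b_post)              # minimizes squared-error loss
    post_mode = (a_post - 1) / (a_post + b_post - 2)    # MAP (requires a_post, b_post > 1)
    return post_mean, post_mode

# With a uniform Beta(1, 1) prior, the MAP estimate equals the MLE x/n.
mean, mode = bayes_point_estimates(x=7, n=10)
print(mean, mode)  # mode == 0.7 == MLE
```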
The Minimum Message Length (MML) point estimator is grounded in Bayesian information theory and is not as directly related to the posterior distribution.
Special cases of Bayesian filters are important. Several methods of computational statistics also have close connections with Bayesian analysis.
Notes
 ^ ^{a} ^{b} Dodge, Yadolah, ed. (1987). Statistical data analysis based on the L1-norm and related methods: Papers from the First International Conference held at Neuchâtel, August 31–September 4, 1987. North-Holland Publishing.
 ^ Jaynes, E. T. (2007). Probability Theory: The logic of science (5. print. ed.). Cambridge University Press. p. 172. ISBN 9780521592710.
 ^ Ferguson, Thomas S. (1996). A Course in Large Sample Theory. Chapman & Hall. ISBN 0412043718.
 ^ ^{a} ^{b} Le Cam, Lucien (1986). Asymptotic Methods in Statistical Decision Theory. Springer-Verlag. ISBN 0387963073.
 ^ Ferguson, Thomas S. (1982). "An inconsistent maximum likelihood estimate". Journal of the American Statistical Association. 77 (380): 831–834. doi:10.1080/01621459.1982.10477894. JSTOR 2287314.
 ^ Lehmann, E. L.; Casella, G. (1998). Theory of Point Estimation (2nd ed.). Springer. ISBN 0387985026.
Bibliography
 Bickel, Peter J. & Doksum, Kjell A. (2001). Mathematical Statistics: Basic and Selected Topics. I (Second (updated printing 2007) ed.). Pearson Prentice-Hall.
 Liese, Friedrich & Miescke, Klaus-J. (2008). Statistical Decision Theory: Estimation, Testing, and Selection. Springer.