M-estimator Algorithm
An M-estimator Algorithm is a point estimation algorithm that searches for a minimum of a sum of functions of the data.
- See: Least-Squares Estimator, Maximum Likelihood Estimation Algorithm, Continuous Optimization Algorithm.
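Example: as a concrete illustration of the "minimum of a sum of functions of the data" definition, the following minimal Python/NumPy sketch computes a Huber M-estimate of a location parameter by iteratively reweighted least squares. The function name huber_location, the tuning constant k = 1.345, and the sample data are illustrative assumptions, not taken from the sources cited below.

```python
import numpy as np

def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location: minimizes sum_i rho(x_i - theta),
    where rho(r) = r**2 / 2 for |r| <= k and k*|r| - k**2 / 2 otherwise,
    via iteratively reweighted least squares (an assumed, standard scheme)."""
    theta = np.median(x)  # robust starting value
    for _ in range(max_iter):
        r = x - theta
        # Weights w_i = psi(r_i) / r_i for the Huber psi; large residuals are down-weighted.
        w = np.ones_like(r)
        big = np.abs(r) > k
        w[big] = k / np.abs(r[big])
        theta_new = np.sum(w * x) / np.sum(w)
        if abs(theta_new - theta) < tol:
            break
        theta = theta_new
    return theta

data = np.array([2.1, 1.9, 2.0, 2.2, 1.8, 50.0])
print(huber_location(data))  # ~2.3: barely moved by the outlier
print(data.mean())           # 10.0: the least-squares M-estimate, pulled toward the outlier
```

In a full implementation the residuals would normally be standardized by a robust scale estimate (e.g., the median absolute deviation) before applying the weights; that step is omitted here to keep the sketch short.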
References
2012
- http://en.wikipedia.org/wiki/M-estimation
- In statistics, M-estimators are a broad class of estimators, which are obtained as the minima of sums of functions of the data. Least-squares estimators and many maximum-likelihood estimators are M-estimators. The definition of M-estimators was motivated by robust statistics, which contributed new types of M-estimators. The statistical procedure of evaluating an M-estimator on a data set is called M-estimation.
More generally, an M-estimator may be defined to be a zero of an estimating function.[1][2][3][4][5][6] This estimating function is often the derivative of another statistical function: For example, a maximum-likelihood estimate is often defined to be a zero of the derivative of the likelihood function with respect to the parameter: thus, a maximum-likelihood estimator is often a critical point of the score function.[7] In many applications, such M-estimators can be thought of as estimating characteristics of the population. (A numerical sketch of this estimating-equation view appears after the reference list below.)
- [1] V. P. Godambe, editor. Estimating functions, volume 7 of Oxford Statistical Science Series. The Clarendon Press, Oxford University Press, New York, 1991.
- [2] Christopher C. Heyde. Quasi-likelihood and its application: A general approach to optimal parameter estimation. Springer Series in Statistics. Springer-Verlag, New York, 1997.
- [3] D. L. McLeish and Christopher G. Small. The theory and applications of statistical inference functions, volume 44 of Lecture Notes in Statistics. Springer-Verlag, New York, 1988.
- [4] Parimal Mukhopadhyay. An Introduction to Estimating Functions. Alpha Science International, Ltd, 2004.
- [5] Christopher G. Small and Jinfang Wang. Numerical methods for nonlinear estimating equations, volume 29 of Oxford Statistical Science Series. The Clarendon Press, Oxford University Press, New York, 2003.
- [6] Sara A. van de Geer. Empirical Processes in M-estimation: Applications of empirical process theory, volume 6 of Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, 2000.
- [7] Ferguson, Thomas S. (1982). "An inconsistent maximum likelihood estimate". Journal of the American Statistical Association 77 (380): 831–834. JSTOR 2287314.
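Example: the passage quoted in the 2012 reference above also characterizes an M-estimator as a zero of an estimating function psi, typically the derivative of each summand of the loss being minimized. The sketch below is a hedged illustration, not part of the cited sources; the helper names and the use of scipy.optimize.brentq are assumptions. It solves the estimating equation sum_i psi(x_i - theta) = 0 for the same Huber case and recovers the same location estimate as the minimization sketch earlier on this page.

```python
import numpy as np
from scipy.optimize import brentq

def huber_psi(r, k=1.345):
    """psi = rho': the derivative of the Huber loss, i.e. r clipped to [-k, k]."""
    return np.clip(r, -k, k)

def m_estimate_location(x, k=1.345):
    """Solve the estimating equation sum_i psi(x_i - theta) = 0 for theta."""
    score = lambda theta: np.sum(huber_psi(x - theta, k))
    # The score is monotone in theta and changes sign on [min(x), max(x)],
    # so a bracketing root-finder applies.
    return brentq(score, np.min(x), np.max(x))

data = np.array([2.1, 1.9, 2.0, 2.2, 1.8, 50.0])
print(m_estimate_location(data))  # ~2.3, matching the minimizer of the summed Huber loss
```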