Method of Moments Algorithm
A Method of Moments Algorithm is a parameter estimation algorithm that estimates population parameters by equating sample moments with the corresponding theoretical moments (the expected values of powers of the random variable, expressed in terms of the parameters).
- AKA: MoM.
- Context:
- It can be applied by a Method of Moments System (to solve a Method of Moments Task).
- It can (typically) try to equate Sample Moments with Theoretical Moments (see the worked example after this list).
- …
- Counter-Example(s):
- See: Probability Weighted Moments, Estimation, Statistical Parameter, Moment (Mathematics), Expected Value, Random Variable, Method of Moments (Probability Theory).
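For illustration (this worked example is not from the cited sources): suppose $X_1, \ldots, X_n$ is a sample from an Exponential($\lambda$) distribution, whose first theoretical moment is $E(X) = 1/\lambda$. Equating the first sample moment with the first theoretical moment gives

$$\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i = \frac{1}{\hat{\lambda}} \quad\Longrightarrow\quad \hat{\lambda} = \frac{1}{\bar{X}}.$$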
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/method_of_moments_(statistics) Retrieved:2015-1-14.
- In statistics, the method of moments is a method of estimation of population parameters. One starts with deriving equations that relate the population moments (i.e., the expected values of powers of the random variable under consideration) to the parameters of interest. Then a sample is drawn and the population moments are estimated from the sample. The equations are then solved for the parameters of interest, using the sample moments in place of the (unknown) population moments. This results in estimates of those parameters. The method of moments was introduced by Karl Pearson in 1894.
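The procedure described above can be sketched in code. The following is a minimal, illustrative Python sketch (not taken from the cited sources) that assumes a Gamma(shape $k$, scale $\theta$) model and uses its first two population moments, $E(X) = k\theta$ and $\mathrm{Var}(X) = k\theta^2$; the function name and the simulated data are assumptions made for the example.

```python
import numpy as np

def gamma_method_of_moments(x):
    """Method-of-moments estimates for a Gamma(shape, scale) sample.

    Uses the population relations E[X] = shape * scale and
    Var(X) = shape * scale**2, with sample moments substituted
    for the unknown population moments.
    """
    x = np.asarray(x, dtype=float)
    m1 = x.mean()                # first sample moment (sample mean)
    m2_central = x.var()         # second sample moment about the mean
    scale_hat = m2_central / m1  # solve the two moment equations
    shape_hat = m1 / scale_hat
    return shape_hat, scale_hat

# Illustrative usage with simulated data (true shape=2.0, scale=3.0).
rng = np.random.default_rng(0)
sample = rng.gamma(shape=2.0, scale=3.0, size=10_000)
print(gamma_method_of_moments(sample))  # estimates close to (2.0, 3.0)
```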
2014
- http://www.encyclopediaofmath.org/index.php/Moments,_method_of_%28in_probability_theory%29
- A method for determining a probability distribution by its moments (cf. Moment). Theoretically the method of moments is based on the uniqueness of the solution of the moment problem: if $\mu_1, \mu_2, \ldots$ are constants, then under what conditions does there exist a unique distribution $P$ such that $\mu_k = \int x^k \, dP(x)$ are the moments of $P$ for all $k$? There are various types of sufficient conditions for a distribution to be uniquely determined by its moments, for example, the Carleman condition ...
2013
- http://onlinecourses.science.psu.edu/stat414/node/193
- QUOTE: In short, the method of moments involves equating sample moments with theoretical moments. So, let's start by making sure we recall the definitions of theoretical moments, as well as learn the definitions of sample moments.
Definitions.
- $E(X^k)$ is the $k$th (theoretical) moment of the distribution (about the origin), for $k = 1, 2, \ldots$
- $E[(X-\mu)^k]$ is the $k$th (theoretical) moment of the distribution (about the mean), for $k = 1, 2, \ldots$
- $M_k = \frac{1}{n}\sum_{i=1}^n X_i^k$ is the $k$th sample moment, for $k = 1, 2, \ldots$
- $M_k^* = \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X})^k$ is the $k$th sample moment about the mean, for $k = 1, 2, \ldots$
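As a small companion sketch (not from the PSU source), the definitions above translate directly into Python; the helper names sample_moment and central_sample_moment, and the example data, are assumptions made for illustration.

```python
import numpy as np

def sample_moment(x, k):
    """k-th sample moment about the origin: M_k = (1/n) * sum(X_i ** k)."""
    x = np.asarray(x, dtype=float)
    return np.mean(x ** k)

def central_sample_moment(x, k):
    """k-th sample moment about the mean: M*_k = (1/n) * sum((X_i - X_bar) ** k)."""
    x = np.asarray(x, dtype=float)
    return np.mean((x - x.mean()) ** k)

# Illustrative usage: the first sample moment is the sample mean,
# and the second central sample moment is the (biased) sample variance.
data = np.array([1.0, 2.0, 2.0, 3.0, 5.0])
print(sample_moment(data, 1))          # 2.6
print(central_sample_moment(data, 2))  # 1.84
```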