Iteratively Reweighted Least Squares Algorithm
An Iteratively Reweighted Least Squares (IRLS) Algorithm is an iterative algorithm that solves an optimization problem by solving a weighted least squares problem at each step, with the weights recomputed from the previous iterate.
- See: Weighted Least Squares, Maximum Likelihood, Generalized Linear Model, Robust Regression, M-Estimator, Weiszfeld's Algorithm, Geometric Median, Linear Programming, Convex Programming, Gauss–Newton, Levenberg–Marquardt.
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/iteratively_reweighted_least_squares Retrieved:2015-5-13.
- The method of iteratively reweighted least squares (IRLS) is used to solve certain optimization problems with objective functions of the form:
: [math]\displaystyle{ \underset{\boldsymbol\beta} {\operatorname{arg\,min}} \sum_{i=1}^n \big| y_i - f_i (\boldsymbol\beta) \big|^p, }[/math]
by an iterative method in which each step involves solving a weighted least squares problem of the form:
: [math]\displaystyle{ \boldsymbol\beta^{(t+1)} = \underset{\boldsymbol\beta} {\operatorname{arg\,min}} \sum_{i=1}^n w_i (\boldsymbol\beta^{(t)}) \big| y_i - f_i (\boldsymbol\beta) \big|^2. }[/math]
IRLS is used to find the maximum likelihood estimates of a generalized linear model, and in robust regression to find an M-estimator, as a way of mitigating the influence of outliers in an otherwise normally distributed data set, for example by minimizing the least absolute errors rather than the least squared errors.
Although not a linear regression problem, Weiszfeld's algorithm for approximating the geometric median can also be viewed as a special case of iteratively reweighted least squares, in which the objective function is the sum of distances of the estimator from the samples.
One of the advantages of IRLS over linear programming and convex programming is that it can be used with Gauss–Newton and Levenberg–Marquardt numerical algorithms.
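Matching the quoted formulas, the weights that make the weighted squared residuals agree with the original p-th power residuals are [math]\displaystyle{ w_i(\boldsymbol\beta^{(t)}) = \big| y_i - f_i(\boldsymbol\beta^{(t)}) \big|^{p-2} }[/math]. The following minimal Python sketch applies this update to a linear model [math]\displaystyle{ f_i(\boldsymbol\beta) = x_i^\mathsf{T}\boldsymbol\beta }[/math] with p = 1 (least absolute errors); the function name irls_lp, the clipping constant eps, and the synthetic data are illustrative assumptions, not part of the quoted source.

```python
import numpy as np

def irls_lp(X, y, p=1.0, n_iter=50, eps=1e-8):
    """Minimize sum_i |y_i - x_i . beta|^p by iteratively reweighted least squares.

    Each iteration solves a weighted least squares problem with weights
    w_i = |y_i - x_i . beta|^(p - 2), clipped away from zero for stability.
    """
    beta = np.linalg.lstsq(X, y, rcond=None)[0]        # ordinary least squares start
    for _ in range(n_iter):
        r = y - X @ beta                               # current residuals
        w = np.maximum(np.abs(r), eps) ** (p - 2)      # IRLS weights
        # Weighted least squares step: solve (X^T W X) beta = X^T W y.
        XtW = X.T * w                                  # each column of X.T scaled by w_i
        beta = np.linalg.solve(XtW @ X, XtW @ y)
    return beta

# Example: a robust line fit that down-weights a single large outlier.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), np.arange(20.0)])
y = X @ np.array([1.0, 2.0]) + 0.1 * rng.standard_normal(20)
y[5] += 30.0                                           # inject an outlier
beta_l1 = irls_lp(X, y, p=1.0)                         # stays close to [1.0, 2.0]
```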
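For the geometric-median case mentioned above, each Weiszfeld step is a weighted average of the sample points with weights equal to the reciprocal distances to the current estimate, which is exactly an IRLS step for the objective [math]\displaystyle{ \sum_i \| m - x_i \| }[/math]. A minimal sketch, assuming Euclidean points stored as a NumPy array; the function name weiszfeld and the eps safeguard against zero distances are illustrative assumptions.

```python
import numpy as np

def weiszfeld(points, n_iter=100, eps=1e-8):
    """Approximate the geometric median of a set of points.

    Each step is a weighted average with weights 1/distance to the current
    estimate, i.e. an IRLS step for the objective sum_i ||m - x_i||.
    """
    m = points.mean(axis=0)                                  # centroid as starting guess
    for _ in range(n_iter):
        d = np.maximum(np.linalg.norm(points - m, axis=1), eps)
        w = 1.0 / d                                          # IRLS weights for p = 1
        m = (w[:, None] * points).sum(axis=0) / w.sum()      # weighted least squares solution
    return m

# The geometric median of three clustered points plus one far-away point
# stays near the cluster, unlike the arithmetic mean.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [100.0, 100.0]])
print(weiszfeld(pts))
```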
2012
- (Shalizi, 2012) ⇒ Cosma Shalizi. (2012). “Chapter 12 - Logistic Regression.” In: Carnegie Mellon University, 36-402, Undergraduate Advanced Data Analysis.