Locally Estimated Scatterplot Smoothing (LOESS) Algorithm
A Locally Estimated Scatterplot Smoothing (LOESS) Algorithm is a non-parametric local regression algorithm that fits simple weighted regressions to localized subsets of the data to produce a smooth curve through a scatterplot.
- Context:
- It can be implemented by a LOESS Regression System (to solve a LOESS regression task).
- …
- Counter-Example(s):
- See: Local Regression, Initialism, Scattergram, Non-Parametric Regression, k-Nearest Neighbor Algorithm, Classical Statistics, Least Squares Regression, Non-Linear Regression.
References
2018
- (Wikipedia, 2018) ⇒ https://en.wikipedia.org/wiki/Local_regression Retrieved:2018-3-30.
- LOESS and LOWESS (locally weighted scatterplot smoothing) are two strongly related non-parametric regression methods that combine multiple regression models in a k-nearest-neighbor-based meta-model. "LOESS" is a later generalization of LOWESS; although it is not a true acronym, it may be understood as standing for "LOcal regrESSion". [1]
LOESS and LOWESS thus build on "classical" methods, such as linear and nonlinear least squares regression. They address situations in which the classical procedures do not perform well or cannot be effectively applied without undue labor. LOESS combines much of the simplicity of linear least squares regression with the flexibility of nonlinear regression. It does this by fitting simple models to localized subsets of the data to build up a function that describes the deterministic part of the variation in the data, point by point. In fact, one of the chief attractions of this method is that the data analyst is not required to specify a global function of any form to fit a model to the data, only to fit segments of the data.
The trade-off for these features is increased computation. Because it is so computationally intensive, LOESS would have been practically impossible to use in the era when least squares regression was being developed. Most other modern methods for process modeling are similar to LOESS in this respect. These methods have been consciously designed to use our current computational ability to the fullest possible advantage to achieve goals not easily achieved by traditional approaches.
A smooth curve through a set of data points obtained with this statistical technique is called a Loess Curve, particularly when each smoothed value is given by a weighted quadratic least squares regression over the span of values of the y-axis scattergram criterion variable. When each smoothed value is given by a weighted linear least squares regression over the span, this is known as a Lowess curve; however, some authorities treat Lowess and Loess as synonyms.
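The local-fitting idea in the quote above is small enough to sketch directly. Below is a minimal Python version, assuming tricube weights and NumPy's polyfit for the weighted local fits; the function name loess, its span and degree arguments, and the omission of robustness iterations are illustrative simplifications, not Cleveland's full algorithm.

```python
import numpy as np

def loess(x, y, span=0.75, degree=1):
    """Minimal LOESS sketch: at each x[i], fit a weighted polynomial of
    the given degree to the nearest span*n points and evaluate it at x[i]."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    k = max(degree + 1, int(np.ceil(span * n)))      # local subset size
    y_hat = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])                         # distances to every point
        idx = np.argsort(d)[:k]                      # k nearest neighbours
        d_max = d[idx].max()
        d_max = d_max if d_max > 0 else 1.0          # guard against duplicate x's
        w = (1.0 - (d[idx] / d_max) ** 3) ** 3       # tricube weights
        # np.polyfit squares w * residual, so passing sqrt(w) weights the
        # squared residuals by the tricube weights
        coeffs = np.polyfit(x[idx], y[idx], deg=degree, w=np.sqrt(w))
        y_hat[i] = np.polyval(coeffs, x[i])
    return y_hat

# Example: degree=1 gives linear (LOWESS-style) local fits,
# degree=2 the weighted quadratic fits of a Loess curve.
rng = np.random.default_rng(0)
xs = np.linspace(0, 10, 100)
ys = np.sin(xs) + rng.normal(scale=0.3, size=xs.size)
ys_smooth = loess(xs, ys, span=0.5, degree=2)
```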
2017
- https://www.statsdirect.com/help/nonparametric_methods/loess.htm
- QUOTE: LOESS Curve Fitting (Local Polynomial Regression)
This is a method for fitting a smooth curve between two variables, or fitting a smooth surface between an outcome and up to four predictor variables.
The procedure originated as LOWESS (LOcally WEighted Scatter-plot Smoother). Since then it has been extended as a modelling tool because it has some useful statistical properties (Cleveland, 1998).
This is a nonparametric method because the linearity assumptions of conventional regression methods have been relaxed. Instead of estimating parameters like m and c in y = mx + c, a nonparametric regression focuses on the fitted curve. The fitted points and their standard errors are estimated with respect to the whole curve rather than a particular estimate. So the overall uncertainty is measured as how well the estimated curve fits the population curve.
It is called local regression because the fitting at, say, point x is weighted toward the data nearest to x. The distance from x that is considered near to it is controlled by the span setting, α. When α is less than 1 it represents the proportion of the data that is considered to be neighbouring x, and the weighting that is used is proportional to (1 - (distance/maximum distance)^3)^3, which is known as tricubic. When α is greater than 1 all of the points are used and the maximum distance is taken as α^(1/p) times the observed maximum distance for p predictors. The default span is α = 0.75. If you choose a span that is too small then there will be insufficient data near x for an accurate fit, resulting in a large variance. If the span is too large then the regression will be over-smoothed, resulting in a loss of information, hence a large bias.
The trade-off between bias and variance also depends on the degree of the polynomial selected. A high degree will provide a better approximation of the population mean, so less bias, but there are more factors to consider in the model, resulting in greater variance. The default degree is 2 (quadratic). Higher degrees don't improve the fit much. The lower degree (i.e. 1, linear) has more bias but pulls back variance at the boundaries.
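The span rule described in the quote above can be sketched as a small helper; the name tricube_weights and the single-predictor default p=1 are illustrative choices, not part of the quoted source.

```python
import numpy as np

def tricube_weights(x, x0, span=0.75, p=1):
    """Tricube weights around x0, following the span rule above:
    span <= 1 keeps the nearest ceil(span*n) points; span > 1 inflates
    the maximum observed distance by span**(1/p) for p predictors."""
    d = np.abs(np.asarray(x, float) - x0)
    n = d.size
    if span <= 1:
        k = int(np.ceil(span * n))
        d_max = np.sort(d)[k - 1]           # distance to the k-th nearest point
    else:
        d_max = d.max() * span ** (1.0 / p)
    u = np.clip(d / d_max, 0.0, 1.0)        # points beyond d_max get weight 0
    return (1.0 - u ** 3) ** 3
```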
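For a runnable end-to-end example, the statsmodels package ships a LOWESS implementation (locally linear fits, i.e. degree 1); its frac argument plays the role of the span α discussed above. A minimal usage sketch:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 200)
y = np.sin(x) + rng.normal(scale=0.3, size=x.size)

# frac plays the role of the span alpha; it=3 adds robustness iterations
fitted = lowess(y, x, frac=0.75, it=3)      # rows of (x, y_hat), sorted by x
x_sorted, y_hat = fitted[:, 0], fitted[:, 1]
```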
2012
- https://github.com/franzejr/Machine-Learning-Algorithms/blob/master/methods/loess.py
- QUOTE: … LOESS and LOWESS (locally weighted scatterplot smoothing) are two strongly related regression modeling methods that combine multiple regression models in a k-nearest-neighbor-based meta-model. “LOESS” is a later generalization of LOWESS; although it is not a true initialism, it may be understood as standing for "LOcal regrESSion" …