Locally Weighted Linear Regression Algorithm
A Locally Weighted Linear Regression (LWLR) Algorithm is a locally weighted regression algorithm that is a linear regression algorithm, i.e. it fits a separate linear model around each query point, with training examples weighted by their proximity to that point.
- See: LOESS.
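The following is a minimal sketch of how such an algorithm makes a prediction, assuming a Gaussian kernel with bandwidth tau to weight the training points; the function and variable names are illustrative rather than drawn from any particular library.

```python
# Locally weighted linear regression (sketch): fit a linear model around
# each query point, weighting training examples by proximity.
import numpy as np

def lwlr_predict(X, y, x_query, tau=1.0):
    """Predict y at x_query via weighted least squares.

    X: (m, n) training inputs (include a bias column if desired).
    y: (m,) training targets.
    tau: kernel bandwidth; smaller values weight only nearby points.
    """
    # Gaussian kernel weights based on distance to the query point.
    diffs = X - x_query
    w = np.exp(-np.sum(diffs ** 2, axis=1) / (2.0 * tau ** 2))

    # Solve the weighted normal equations A theta = b, where
    # A = sum_i w_i x_i x_i^T and b = sum_i w_i x_i y_i.
    A = (X * w[:, None]).T @ X
    b = (X * w[:, None]).T @ y
    theta = np.linalg.solve(A, b)
    return x_query @ theta

# Example: noisy sine data with a bias column.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 100)
X = np.column_stack([np.ones_like(x), x])
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)
print(lwlr_predict(X, y, np.array([1.0, 5.0]), tau=0.5))
```

Because the weights depend on the query point, the normal equations are re-solved for each prediction; this is what makes the method "memory-based" in the sense quoted below.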
References
2017
- https://www.cs.cmu.edu/afs/cs/project/jair/pub/volume4/cohn96a-html/node7.html
- QUOTE: Model-based methods, such as neural networks and the mixture of Gaussians, use the data to build a parameterized model. After training, the model is used for predictions and the data are generally discarded. In contrast, memory-based methods are non-parametric approaches that explicitly retain the training data, and use it each time a prediction needs to be made. Locally weighted regression (LWR) is a memory-based method that performs a regression around a point of interest using only training data that are "local" to that point. One recent study demonstrated that LWR was suitable for real-time control by constructing an LWR-based system that learned a difficult juggling task [Schaal & Atkeson 1994]. ...
2006
- (Chu et al., 2006) ⇒ Cheng-Tao Chu, Sang Kyun Kim, Yi-An Lin, YuanYuan Yu, Gary Bradski, Andrew Y. Ng, and Kunle Olukotun. (2006). “Map-Reduce for Machine Learning on Multicore.” In: Proceedings of the 19th International Conference on Neural Information Processing Systems (NIPS-2006).
- QUOTE: Locally Weighted Linear Regression (LWLR): LWLR [28, 3] is solved by finding the solution of the normal equations [math]\displaystyle{ A\theta = b }[/math], where [math]\displaystyle{ A = \sum^m_{i=1} w_i(x_ix^T_i) }[/math] and [math]\displaystyle{ b = \sum^m_{i=1} w_i(x_iy_i) }[/math]. For the summation form, we divide the computation among different mappers. In this case, one set of mappers is used to compute [math]\displaystyle{ \sum_\text{subgroup} w_i(x_ix^T_i) }[/math] and another set to compute [math]\displaystyle{ \sum_\text{subgroup} w_i(x_iy_i) }[/math]. Two reducers respectively sum up the partial values for A and b, and the algorithm finally computes the solution [math]\displaystyle{ \theta = A^{-1}b }[/math]. Note that if [math]\displaystyle{ w_i = 1 }[/math], the algorithm reduces to the case of ordinary least squares (linear regression).
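To make the summation form above concrete, here is a minimal sketch that simulates the mapper/reducer split over data shards in plain Python, assuming the weights w_i are given; the shard sizes and function names are illustrative, not from the paper.

```python
# Summation-form LWLR as in Chu et al. (2006): mappers compute partial
# sums of A and b over subgroups; reducers add them up (sketch).
import numpy as np

def mapper(X_shard, y_shard, w_shard):
    # Partial sums over one subgroup:
    # sum_i w_i x_i x_i^T  and  sum_i w_i x_i y_i.
    A_part = (X_shard * w_shard[:, None]).T @ X_shard
    b_part = (X_shard * w_shard[:, None]).T @ y_shard
    return A_part, b_part

def reducer(partials):
    # Sum the partial values for A and b across all mappers.
    A = sum(p[0] for p in partials)
    b = sum(p[1] for p in partials)
    return A, b

rng = np.random.default_rng(1)
X = rng.standard_normal((90, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.01 * rng.standard_normal(90)
w = np.ones(90)  # with w_i = 1 this reduces to ordinary least squares

shards = [(X[i:i + 30], y[i:i + 30], w[i:i + 30]) for i in range(0, 90, 30)]
A, b = reducer([mapper(Xs, ys, ws) for Xs, ys, ws in shards])
theta = np.linalg.solve(A, b)  # theta = A^{-1} b
print(theta)  # approximately [2.0, -1.0, 0.5]
```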