Locally Weighted Projection Regression Task
A Locally Weighted Projection Regression Task is a Locally Weighted Regression Task that ...
- AKA: LWPR, Locally Weighted Projection Regression.
- It can be solved by a Locally Weighted Projection Regression System that implements a Locally Weighted Projection Regression Algorithm.
- Example(s):
- ...
- Counter-Example(s):
- See: Initialism, Scattergram, Non-Parametric Regression, k-Nearest Neighbor Algorithm, Classical Statistics, Least Squares Regression, Nonlinear Regression.
References
2017
- (Ting et al., 2017) ⇒ Jo-Anne Ting, Franziska Meier, Sethu Vijayakumar, and Stefan Schaal (2017). "Locally Weighted Regression for Control." In: "Encyclopedia of Machine Learning and Data Mining," pp. 759-772.
- QUOTE: Locally Weighted Projection Regression (LWPR)
Schaal and Atkeson (1998) suggested a memoryless version of LWR, called RFWR, in order to avoid LWR's expensive nearest neighbor computations, which become particularly costly for large training data sets, and to obtain fast real-time prediction performance. (In most robotic systems, “real time” means on the order of at most 1–10 ms of computation time, corresponding to a 1000 to 100 Hz control loop.) The main ideas of the RFWR algorithm (Schaal and Atkeson, 1998) are listed below:
- Create new kernels only if no existing kernel in memory covers a training point with some minimal activation weight.
- Keep all created kernels in memory and update the weighted regression with weighted recursive least squares for new training points [math]\displaystyle{ \{\mathbf{x},t\} }[/math]:
[math]\displaystyle{ \begin{array}{rcl} \boldsymbol{\beta}_{k}^{n+1} & = & \boldsymbol{\beta}_{k}^{n} + w\,\mathbf{P}_{k}^{n+1}\tilde{\mathbf{x}}\left(t - \tilde{\mathbf{x}}^{T}\boldsymbol{\beta}_{k}^{n}\right) \\ \mbox{where } \mathbf{P}_{k}^{n+1} & = & \frac{1}{\lambda}\left(\mathbf{P}_{k}^{n} - \frac{\mathbf{P}_{k}^{n}\tilde{\mathbf{x}}\tilde{\mathbf{x}}^{T}\mathbf{P}_{k}^{n}}{\frac{\lambda}{w} + \tilde{\mathbf{x}}^{T}\mathbf{P}_{k}^{n}\tilde{\mathbf{x}}}\right) \mbox{ and } \tilde{\mathbf{x}} = \left[\mathbf{x}^{T}\;1\right]^{T} \end{array} }[/math] (6)
- Adjust the distance metric [math]\displaystyle{ \mathbf{D}_q }[/math] for each kernel with a gradient descent technique using leave-one-out cross validation.
- Make a prediction for a query point taking a weighted average of predictions from all local models:
[math]\displaystyle{ \mathbf{y}_{q} = \frac{\sum_{k=1}^{K} w_{q,k}\,\hat{\mathbf{y}}_{q,k}}{\sum_{k=1}^{K} w_{q,k}} }[/math] (7)
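Taken together, the kernel-allocation rule, the recursive update in Eq. (6), and the blended prediction in Eq. (7) are straightforward to prototype. The following Python sketch is a minimal, illustrative reading of those three ideas, assuming a Gaussian activation kernel and fixed distance metrics; all class and parameter names (e.g. `w_gen`, `init_D`) are hypothetical, and the gradient-descent adaptation of [math]\displaystyle{ \mathbf{D}_k }[/math] from the third bullet is omitted:

```python
import numpy as np

class LocalModel:
    """One receptive field: a Gaussian kernel plus a local linear model."""
    def __init__(self, center, dim, init_D=25.0, lam=0.999):
        self.c = center                     # kernel center
        self.D = init_D * np.eye(dim)       # distance metric D_k (fixed here)
        self.beta = np.zeros(dim + 1)       # local regression coefficients
        self.P = 1e2 * np.eye(dim + 1)      # recursive inverse-covariance estimate
        self.lam = lam                      # forgetting factor lambda

    def activation(self, x):
        d = x - self.c
        return float(np.exp(-0.5 * d @ self.D @ d))

    def update(self, x, t):
        """Weighted recursive least squares, Eq. (6)."""
        w = self.activation(x)
        xt = np.append(x, 1.0)              # x_tilde = [x^T 1]^T
        Px = self.P @ xt
        self.P = (self.P - np.outer(Px, Px) / (self.lam / w + xt @ Px)) / self.lam
        # Residual uses the old beta; the gain uses the updated P.
        self.beta = self.beta + w * (self.P @ xt) * (t - xt @ self.beta)

    def predict(self, x):
        return np.append(x, 1.0) @ self.beta

class RFWR:
    def __init__(self, dim, w_gen=0.1):     # w_gen: minimal activation threshold
        self.dim, self.w_gen, self.models = dim, w_gen, []

    def train(self, x, t):
        ws = [m.activation(x) for m in self.models]
        if not ws or max(ws) < self.w_gen:  # no kernel covers x: allocate one
            self.models.append(LocalModel(x, self.dim))
        for m in self.models:               # all kernels are kept and updated
            m.update(x, t)

    def predict(self, x):
        """Weighted average over local models, Eq. (7); assumes >= 1 kernel."""
        ws = np.array([m.activation(x) for m in self.models])
        ys = np.array([m.predict(x) for m in self.models])
        return float(ws @ ys / ws.sum())

# Hypothetical usage on a 1-D toy function:
model = RFWR(dim=1)
for xi in np.random.uniform(-3, 3, 500):
    model.train(np.array([xi]), np.sin(xi))
print(model.predict(np.array([0.5])))       # roughly sin(0.5)
```

Replacing each local model's full-dimensional regression with the PLS projections described below is, roughly, what turns this RFWR skeleton into LWPR.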
Adjusting the distance metric [math]\displaystyle{ \mathbf{D}_q }[/math] with leave-one-out cross validation without keeping all training data in memory is possible due to the PRESS residual. The PRESS residual allows the leave-one-out cross validation error to be computed in closed form without needing to actually exclude a data point from the training data.
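For context (this is standard regression material rather than a quote from the encyclopedia entry), the PRESS residual of an unweighted linear model has the closed form below, where [math]\displaystyle{ \mathbf{X} }[/math] stacks the augmented inputs [math]\displaystyle{ \tilde{\mathbf{x}}_{i}^{T} }[/math] row-wise:

[math]\displaystyle{ e_{i,-i} = \frac{t_{i} - \tilde{\mathbf{x}}_{i}^{T}\boldsymbol{\beta}}{1 - \tilde{\mathbf{x}}_{i}^{T}\left(\mathbf{X}^{T}\mathbf{X}\right)^{-1}\tilde{\mathbf{x}}_{i}} }[/math]

In the locally weighted setting, the role of [math]\displaystyle{ \left(\mathbf{X}^{T}\mathbf{X}\right)^{-1} }[/math] is played (with the weights folded in) by the recursively maintained [math]\displaystyle{ \mathbf{P}_{k} }[/math] of Eq. (6), which is why no training point ever needs to be removed and refit.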
Another deficiency of LWR is its inability to scale well to high-dimensional input spaces, since the covariance matrix inversion in Eq. (5) becomes severely ill-conditioned. Additionally, LWR becomes expensive to evaluate as the number of local models to be maintained increases. Vijayakumar et al. (2005) suggested local dimensionality reduction techniques to handle this problem. Partial least squares (PLS) regression is a useful dimensionality reduction method that is used in the LWPR algorithm (Vijayakumar et al. 2005). In contrast to PCA methods, PLS performs dimensionality reduction for regression, i.e., it eliminates subspaces of the input space that correlate minimally with the outputs, not just parts of the input space that have low variance.
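As a concrete illustration of that distinction, the sketch below implements a minimal single-output PLS (NIPALS-style deflation) in Python. It is an assumption-laden toy, not the LWPR reference implementation; function names are hypothetical, and it presumes the inputs and targets have been centered:

```python
import numpy as np

def pls_fit(X, y, n_components=2):
    """PLS1 via NIPALS-style deflation: each projection direction is chosen
    for covariance with the (residual) targets, then y is regressed along it."""
    Xr, yr = X.astype(float).copy(), y.astype(float).copy()
    U, P, B = [], [], []
    for _ in range(n_components):
        u = Xr.T @ yr                       # direction of max covariance with y
        u /= np.linalg.norm(u)
        s = Xr @ u                          # scores: 1-D projection of inputs
        p = Xr.T @ s / (s @ s)              # input loadings used for deflation
        b = (s @ yr) / (s @ s)              # univariate least-squares slope
        Xr -= np.outer(s, p)                # remove the explained input subspace
        yr -= b * s                         # remove the explained target part
        U.append(u); P.append(p); B.append(b)
    return U, P, B

def pls_predict(x, U, P, B):
    xr, yhat = x.astype(float).copy(), 0.0
    for u, p, b in zip(U, P, B):
        s = xr @ u
        yhat += b * s
        xr -= s * p                         # deflate the query like the training data
    return yhat

# Hypothetical usage: one relevant direction buried in 10 input dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10)); X -= X.mean(axis=0)
y = X[:, 0] + 0.1 * rng.normal(size=200); y -= y.mean()
U, P, B = pls_fit(X, y, n_components=2)
print(pls_predict(X[0], U, P, B), y[0])     # should be close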
While LWPR is typically used in conjunction with linear local models, the use of local nonparametric models, such as Gaussian processes, has also been explored (Nguyen-Tuong et al., 2008). Finally, LWPR is currently one of the best-developed locally weighted regression algorithms for control (Klanke et al., 2008) and has been applied to learning control problems with over 100 input dimensions.