Locally Weighted Projection Regression System
A Locally Weighted Projection Regression (LWPR) System is a nonparametric regression system that implements a Locally Weighted Projection Regression Algorithm to solve a Locally Weighted Projection Regression Task.
- AKA: LWPR System.
- Example(s):
- Counter-Example(s):
- See: Nonparametric Regression Task, Lazy Learning Task, Distance Metric, Weighting Function, Distance Function, k-Nearest Neighbor (kNN) Algorithm.
References
2017a
- (mloss.org, 2017) ⇒ "LWPR 1.2.4" http://mloss.org/software/view/65/ Retrieved: 2017-09-10
- QUOTE: Locally Weighted Projection Regression (LWPR) is a recent algorithm that achieves nonlinear function approximation in high dimensional spaces with redundant and irrelevant input dimensions. At its core, it uses locally linear models, spanned by a small number of univariate regressions in selected directions in input space. A locally weighted variant of Partial Least Squares (PLS) is employed for doing the dimensionality reduction. This nonparametric local learning system 1) learns rapidly with second order learning methods based on incremental training, 2) uses statistically sound stochastic cross validation to learn, 3) adjusts its weighting kernels based on local information only, 4) has a computational complexity that is linear in the number of inputs, and 5) can deal with a large number of - possibly redundant - inputs, as shown in evaluations with up to 50 dimensional data sets. To our knowledge, this is the first truly incremental spatially localized learning method to combine all these properties.
We have implemented the LWPR algorithm in plain ANSI C, with wrappers and bindings for C++, Matlab/Octave, and Python (using Numpy). LWPR models can be stored in platform-dependent binary files or optionally in platform-independent, human-readable XML files. The latter functionality relies on the XML parser Expat as the only dependency.
LWPR models are fully interchangeable between the different implementations, that is, you could train a regression model in Matlab, store it to a file, and load it from a Python script or C++ program to calculate predictions. Just as well, you could train a model from real robot data collected online in C/C++, and later inspect the LWPR model comfortably within Matlab.
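The locally linear modelling at the core of LWPR can be illustrated with a minimal sketch of locally weighted linear regression: each prediction is made by fitting a linear model whose training points are weighted by a Gaussian kernel centred on the query. This is plain NumPy illustrating the concept, not the LWPR library's own API; the function name and bandwidth value are illustrative choices.

```python
import numpy as np

def lwr_predict(X, y, x_query, bandwidth=0.5):
    """Predict y at x_query with a locally weighted linear model.

    Training points are weighted by a Gaussian kernel centred on the
    query; a weighted least-squares line is then fit and evaluated.
    """
    # Gaussian weights: points near the query dominate the fit.
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))

    # Weighted least squares on a [1, x] design matrix.
    A = np.hstack([np.ones((X.shape[0], 1)), X])
    W = np.diag(w)
    beta = np.linalg.solve(A.T @ W @ A, A.T @ W @ y)
    return np.concatenate([[1.0], x_query]) @ beta

# Noisy samples of a nonlinear function.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)

pred = lwr_predict(X, y, np.array([1.0]))  # should be close to sin(1.0)
```

LWPR proper goes beyond this sketch in two ways the quote highlights: it learns many such local models incrementally rather than solving a fresh least-squares problem per query, and it replaces the full-dimensional fit with a small number of univariate regressions along PLS-selected directions.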
2017b
- (University of Edinburgh, 2017) ⇒ The University of Edinburgh (2014-2017). "Software: LWPR" http://wcms.inf.ed.ac.uk/ipab/slmc/research/software-lwpr Retrieved: 2017-09-10
- QUOTE: Locally Weighted Projection Regression (LWPR) is an algorithm that achieves nonlinear function approximation in high dimensional spaces with redundant and irrelevant input dimensions. At its core, it uses locally linear models, spanned by a small number of univariate regressions in selected directions in input space. A locally weighted variant of Partial Least Squares (PLS) is employed for doing the dimensionality reduction. This nonparametric local learning system
- learns rapidly with second order learning methods based on incremental training,
- uses statistically sound stochastic cross validation to learn,
- adjusts its weighting kernels based on local information only,
- has a computational complexity that is linear in the number of inputs, and
- can deal with a large number of - possibly redundant - inputs,
as shown in evaluations with up to 50 dimensional data sets. To our knowledge, this is the first truly incremental spatially localized learning method to combine all these properties.
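The PLS-based dimensionality reduction mentioned above can be sketched as a sequence of univariate regressions along projection directions chosen for their covariance with the residual, with deflation between steps. This is a generic NIPALS-style sketch of the idea under simplifying assumptions (single output, centred data), not LWPR's locally weighted variant; all names are illustrative.

```python
import numpy as np

def pls_fit(X, y, n_components):
    """Univariate regressions along PLS projection directions.

    Each direction is the input direction of maximal covariance with
    the current residual; after each univariate fit, X and y are
    deflated so the next direction captures what remains.
    """
    Xr, yr = X.copy(), y.copy()
    directions, slopes = [], []
    for _ in range(n_components):
        u = Xr.T @ yr                  # covariance-maximising direction
        u /= np.linalg.norm(u)
        s = Xr @ u                     # 1-D projection (score)
        b = (s @ yr) / (s @ s)         # univariate regression on the score
        yr = yr - b * s                # deflate the target
        Xr = Xr - np.outer(s, (Xr.T @ s) / (s @ s))  # deflate the inputs
        directions.append(u)
        slopes.append(b)
    return np.array(directions), np.array(slopes)

# 10-D input where only the first dimension is relevant: a single
# PLS component should align with that axis and ignore the rest.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 10))
y = 2.0 * X[:, 0]

Xc, yc = X - X.mean(axis=0), y - y.mean()
dirs, slopes = pls_fit(Xc, yc, n_components=1)
```

The redundant/irrelevant-input robustness claimed in the quote shows up here: the recovered direction concentrates on the one informative axis, so the local model's cost grows with the number of projections, not the raw input dimensionality.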
2008
- (Klanke et al., 2008) ⇒ Klanke, S., Vijayakumar, S., & Schaal, S. (2008). A library for locally weighted projection regression. Journal of Machine Learning Research, 9(Apr), 623-626.
- QUOTE: In this paper we introduce an improved implementation of locally weighted projection regression (LWPR), a supervised learning algorithm that is capable of handling high-dimensional input data. As the key features, our code supports multi-threading, is available for multiple platforms, and provides wrappers for several programming languages.