Nonparametric Regression Algorithm
A Nonparametric Regression Algorithm is a regression algorithm that does not require a parameterized regression model as an input.
- Context:
- It can (often) infer Distribution Functions from the Input Dataset.
- It can be implemented by a Nonparametric Regression System (to solve a Nonparametric Regression Task).
- Example(s):
- an Additive Model, such as a generalized additive model.
- a Kernel Regression Algorithm.
- a MARS Algorithm (multivariate adaptive regression splines).
- …
- Counter-Example(s):
- a Parametric Regression Algorithm, such as a Linear Regression Algorithm.
- See: Regression Tree Training Algorithm.
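As a concrete illustration of the Kernel Regression example above, the following sketch (assuming Python with NumPy; the function name `nadaraya_watson` and all parameter values are illustrative choices, not a canonical implementation) shows how a Nadaraya-Watson estimator forms a regression curve directly from the data, with no parameterized regression model supplied as input:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=0.5):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.
    The estimate at each query point is a locally weighted average of
    y_train; no parametric form is assumed for the regression function."""
    d = (x_query[:, None] - x_train[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)             # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)  # weighted average per query point

# Illustrative use: recover a sine curve from noisy samples.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)
y_hat = nadaraya_watson(x, y, x, bandwidth=0.3)
```

The bandwidth governs the bias-variance trade-off: smaller values track the data more closely, larger values give smoother fits; in practice it is typically chosen by cross-validation.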
References
2014
- (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Nonparametric_regression Retrieved:2014-8-26.
- Nonparametric regression is a form of regression analysis in which the predictor does not take a predetermined form but is constructed according to information derived from the data. Nonparametric regression requires larger sample sizes than regression based on parametric models because the data must supply the model structure as well as the model estimates.
2005
- (Hastie, 2005) ⇒ Trevor Hastie. (2005). http://www-stat.stanford.edu/brochure/part4.html#hastie
- Generalized additive models adapt nonparametric regression technology to provide more flexibility to the usual linear models used in applied areas, such as logistic and log-linear models. Principal curves and surfaces generalize linear principal components by allowing nonlinear coordinate functions.
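An additive model of the kind Hastie describes can be fitted by backfitting: each component function is repeatedly re-estimated by smoothing the partial residuals against its feature. The sketch below (assuming Python with NumPy; `fit_additive_model` and its parameters are hypothetical names, and the Gaussian kernel smoother stands in for any 1-D scatterplot smoother) illustrates the idea:

```python
import numpy as np

def kernel_smooth(x, y, bandwidth):
    """1-D Gaussian-kernel smoother evaluated at the training points."""
    d = (x[:, None] - x[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)
    return (w @ y) / w.sum(axis=1)

def fit_additive_model(X, y, bandwidth=0.4, n_iter=20):
    """Backfitting: cycle over the features, smoothing the partial
    residuals against each one until the component functions stabilize."""
    n, p = X.shape
    alpha = y.mean()                      # intercept
    f = np.zeros((p, n))                  # one component function per feature
    for _ in range(n_iter):
        for j in range(p):
            partial = y - alpha - f.sum(axis=0) + f[j]
            f[j] = kernel_smooth(X[:, j], partial, bandwidth)
            f[j] -= f[j].mean()           # identifiability: center each f_j
    return alpha, f

# Illustrative use: an additive signal with one sinusoidal and one
# quadratic component.
rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, size=(300, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=300)
alpha, f = fit_additive_model(X, y)
y_hat = alpha + f.sum(axis=0)
```

Because each feature contributes through its own smooth function, the fitted components can be inspected one at a time, which is a large part of the appeal of additive models over fully general nonparametric fits.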
1993
- (Härdle & Mammen, 1993) ⇒ Wolfgang Härdle, and Enno Mammen. (1993). “Comparing Nonparametric Versus Parametric Regression Fits.” In: The Annals of Statistics, 21(4).
- ABSTRACT: In general, there will be visible differences between a parametric and a nonparametric curve estimate. It is therefore quite natural to compare these in order to decide whether the parametric model could be justified. An asymptotic quantification is the distribution of the integrated squared difference between these curves. We show that the standard way of bootstrapping this statistic fails. We use and analyse a different form of bootstrapping for this task. We call this method the wild bootstrap and apply it to fitting Engel curves in expenditure data analysis.
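The procedure the abstract describes can be sketched as follows (a minimal illustration, assuming Python with NumPy; the linear null model, the Gaussian kernel, the bandwidth, and the function names are all assumptions, not the authors' exact construction). Each bootstrap response is the parametric fit plus the residual multiplied by a two-point random weight with zero mean and unit variance, and the integrated squared difference between the two curve estimates is approximated on a grid:

```python
import numpy as np

def kernel_fit(x, y, x_eval, h=0.3):
    """Nadaraya-Watson kernel estimate with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def wild_bootstrap_test(x, y, n_boot=200, h=0.3, seed=0):
    """Wild-bootstrap the integrated squared difference between a
    linear (parametric) fit and a kernel (nonparametric) fit."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(x.min(), x.max(), 100)
    beta = np.polyfit(x, y, 1)            # parametric fit: straight line
    resid = y - np.polyval(beta, x)
    t_obs = np.mean((kernel_fit(x, y, grid, h) - np.polyval(beta, grid)) ** 2)
    # Two-point wild-bootstrap weights (Mammen's distribution):
    # zero mean, unit variance, so each y* keeps the residual scale.
    a, b = -(np.sqrt(5) - 1) / 2, (np.sqrt(5) + 1) / 2
    prob = (np.sqrt(5) + 1) / (2 * np.sqrt(5))
    t_boot = np.empty(n_boot)
    for i in range(n_boot):
        v = np.where(rng.random(x.size) < prob, a, b)
        y_star = np.polyval(beta, x) + resid * v  # resample under the null
        beta_s = np.polyfit(x, y_star, 1)
        t_boot[i] = np.mean((kernel_fit(x, y_star, grid, h)
                             - np.polyval(beta_s, grid)) ** 2)
    return t_obs, np.mean(t_boot >= t_obs)  # statistic and bootstrap p-value

# Illustrative use: strongly nonlinear data should yield a small p-value,
# i.e. the linear parametric model is not justified.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 3.0, 200))
y = np.sin(3.0 * x) + rng.normal(scale=0.1, size=200)
t_obs, p_value = wild_bootstrap_test(x, y)
```

Multiplying each residual by an independent zero-mean, unit-variance weight preserves the residual's scale at every design point, which is what lets this form of bootstrapping cope with heteroscedastic errors where naive residual resampling fails.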