Parametric Supervised Scalar Prediction Algorithm
A Parametric Supervised Scalar Prediction Algorithm is a supervised scalar prediction algorithm that accepts a parametric model.
- AKA: Parameter Estimation Algorithm.
- Context:
- Algorithm Input: a Regression Model/Parametric Model (a minimal sketch of this input/output contract appears after this list).
- It can range from being a Parametric Regression Estimation Algorithm/Parameter Estimation Algorithm to being a Parametric Regression Classification Algorithm.
- It can be applied by a Parametric Regression System (that can solve a parametric regression task).
- It can range from being a Regularized Regression Algorithm to being a Non-Regularized Regression Algorithm.
- It can range from being a Linear Regression Algorithm (fits a linear regression model) to being a Nonlinear Regression Algorithm (fits a nonlinear regression model).
- It can (often) be based on a Function Approximation Algorithm.
- …
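The input/output contract described in the context items above can be sketched in code. The following is a minimal, hypothetical illustration rather than an implementation from the references: the names `linear_model` and `estimate_parameters` are assumptions, and SciPy's general-purpose `least_squares` routine stands in for the parameter estimation step.

```python
import numpy as np
from scipy.optimize import least_squares

# A parametric model: a fixed-form function of the input and a finite
# parameter vector theta. Here, a simple linear model (illustrative only).
def linear_model(x, theta):
    return theta[0] + theta[1] * x

def estimate_parameters(model, theta0, x_train, y_train):
    """Parameter estimation: accept a parametric model plus labeled training
    data, and return the parameter vector that minimizes squared residuals."""
    def residuals(theta):
        return model(x_train, theta) - y_train
    result = least_squares(residuals, theta0)
    return result.x

# Toy usage: recover the parameters of a noisy linear relationship.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = 2.0 - 3.0 * x + 0.1 * rng.normal(size=200)
theta_hat = estimate_parameters(linear_model, np.zeros(2), x, y)
print(theta_hat)  # approximately [2.0, -3.0]
```

The same `estimate_parameters` interface accepts any fixed-form model function, which is what distinguishes a parametric algorithm: the hypothesis space is fully determined by a finite parameter vector.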
- Example(s):
- a Linear Regression Algorithm, such as a linear least squares regression algorithm (this case and the non-linear case are illustrated in the sketch after this list).
- a Non-Linear Regression Algorithm, such as a non-linear least squares regression algorithm.
- a Bayesian Regression Algorithm, which fits a Bayesian Regression Model.
- …
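As a concrete illustration of the first two examples above, the sketch below fits a linear parametric model with closed-form linear least squares and a nonlinear parametric model with iterative non-linear least squares. The model forms and data are hypothetical; `np.linalg.lstsq` and `scipy.optimize.curve_fit` serve only as stand-in solvers.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
x = np.linspace(0.0, 4.0, 120)

# Linear least squares: closed-form fit of y = a + b*x via np.linalg.lstsq.
y_lin = 1.5 + 0.8 * x + 0.05 * rng.normal(size=x.size)
design = np.column_stack([np.ones_like(x), x])  # bias column + feature
(a_hat, b_hat), *_ = np.linalg.lstsq(design, y_lin, rcond=None)
print("linear fit:", a_hat, b_hat)

# Non-linear least squares: iterative fit of y = a*exp(-b*x) + c
# (an illustrative nonlinear parametric model) via curve_fit.
def exp_model(x, a, b, c):
    return a * np.exp(-b * x) + c

y_exp = exp_model(x, 2.5, 1.3, 0.5) + 0.05 * rng.normal(size=x.size)
params, _cov = curve_fit(exp_model, x, y_exp, p0=[1.0, 1.0, 0.0])
print("nonlinear fit:", params)
```

In both cases the algorithm's output is a finite parameter vector for a pre-specified model form, which is what places them under this concept rather than under nonparametric regression.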
- Counter-Example(s):
- a Nonparametric Regression Algorithm, which does not assume a fixed-form parametric model (see Härdle & Mammen, 1993).
- See: Parametric Learning Algorithm.
References
2009
- (Lafferty & Wasserman, 2009) ⇒ John D. Lafferty, and Larry Wasserman. (2009). “Statistical Machine Learning - Course: 10-702." Spring 2009, Carnegie Mellon Institute.
1993
- (Härdle & Mammen, 1993) ⇒ Wolfgang Härdle, and Enno Mammen. (1993). “Comparing Nonparametric Versus Parametric Regression Fits.” In: The Annals of Statistics, 21(4).
- ABSTRACT: In general, there will be visible differences between a parametric and a nonparametric curve estimate. It is therefore quite natural to compare these in order to decide whether the parametric model could be justified. An asymptotic quantification is the distribution of the integrated squared difference between these curves. We show that the standard way of bootstrapping this statistic fails. We use and analyse a different form of bootstrapping for this task. We call this method the wild bootstrap and apply it to fitting Engel curves in expenditure data analysis.