Forward Selection Stepwise Regression Algorithm
A Forward Selection Stepwise Regression Algorithm is a stepwise regression algorithm that starts with no (or few) predictor variables and adds the most beneficial predictor variable at each iteration.
- AKA: Forward Stepwise Estimation.
- …
- Counter-Example(s):
- See: Boosting Algorithm, Stepwise Regression, SCAD, Adaptive LASSO.
References
2015
- (Wikipedia, 2015) ⇒ http://en.wikipedia.org/wiki/stepwise_regression#Main_approaches Retrieved:2015-1-27.
- The main approaches are:
- Forward selection, which involves starting with no variables in the model, testing the addition of each variable using a chosen model comparison criterion, adding the variable (if any) that improves the model the most, and repeating this process until none improves the model.
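The forward-selection loop described above can be sketched as follows. This is a minimal illustration, not a reference implementation: it uses an AIC-style comparison criterion and ordinary least squares via NumPy, and all function and variable names are illustrative assumptions.

```python
import numpy as np

def aic(y, y_hat, k):
    # Gaussian AIC up to an additive constant: n*log(RSS/n) + 2k.
    n = len(y)
    rss = np.sum((y - y_hat) ** 2)
    return n * np.log(rss / n) + 2 * k

def forward_select(X, y):
    # Greedily add the predictor that most improves the criterion;
    # stop when no remaining candidate improves the current model.
    n, p = X.shape
    selected, remaining = [], list(range(p))
    best_score = aic(y, np.full(n, y.mean()), 1)  # intercept-only model
    while remaining:
        scores = []
        for j in remaining:
            cols = selected + [j]
            A = np.column_stack([np.ones(n), X[:, cols]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            scores.append((aic(y, A @ beta, len(cols) + 1), j))
        score, j = min(scores)
        if score >= best_score:  # no candidate improves the model
            break
        best_score = score
        selected.append(j)
        remaining.remove(j)
    return selected

# Illustrative data: only predictors 0 and 3 are relevant.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(scale=0.5, size=200)
print(sorted(forward_select(X, y)))  # the relevant predictors 0 and 3 should appear
```

Any model comparison criterion (BIC, adjusted R², an F-test p-value) can be substituted for `aic` without changing the loop structure.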
2009
- (Wang, 2009) ⇒ Hansheng Wang. (2009). “Forward Regression for Ultra-High Dimensional Variable Screening.” In: Journal of the American Statistical Association, 104(488).
- ABSTRACT: Motivated by the seminal theory of Sure Independence Screening (Fan and Lv 2008, SIS), we investigate here another popular and classical variable screening method, namely, forward regression (FR). Our theoretical analysis reveals that FR can identify all relevant predictors consistently, even if the predictor dimension is substantially larger than the sample size. In particular, if the dimension of the true model is finite, FR can discover all relevant predictors within a finite number of steps. To practically select the “best” candidate from the models generated by FR, the recently proposed BIC criterion of Chen and Chen (2008) can be used. The resulting model can then serve as an excellent starting point, from where many existing variable selection methods (e.g., SCAD and Adaptive LASSO) can be applied directly. FR’s outstanding finite sample performances are confirmed by extensive numerical studies.
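The abstract above proposes selecting the "best" candidate along the FR path with the extended BIC of Chen and Chen (2008). A minimal sketch of that criterion, assuming the standard form EBIC = n·log(RSS/n) + k·log(n) + 2γ·log C(p, k) with tuning parameter γ in [0, 1] (function names are illustrative):

```python
from math import lgamma, log

def log_binom(p, k):
    # log of the binomial coefficient C(p, k), computed via lgamma for stability.
    return lgamma(p + 1) - lgamma(k + 1) - lgamma(p - k + 1)

def ebic(rss, n, k, p, gamma=0.5):
    # Extended BIC: the usual BIC plus a penalty 2*gamma*log C(p, k) that
    # grows with the number of candidate models of size k, guarding against
    # overfitting when p is much larger than n.
    return n * log(rss / n) + k * log(n) + 2 * gamma * log_binom(p, k)
```

Along the FR path, one would evaluate `ebic` for each nested candidate model and keep the minimizer; with p much larger than n, the extra `log_binom` term penalizes larger models far more heavily than plain BIC does.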