sklearn.ensemble.AdaBoostRegressor
A sklearn.ensemble.AdaBoostRegressor is an AdaBoost Regression System within the sklearn.ensemble module.
- Context
- Usage:
- 1) Import the AdaBoost Regression System from scikit-learn:
from sklearn.ensemble import AdaBoostRegressor
- 2) Create the design matrix X and the response vector Y.
- 3) Create the AdaBoost Regressor object:
regressor_model = AdaBoostRegressor(base_estimator=None, n_estimators=50, learning_rate=1.0, loss='linear', random_state=None)
- 4) Choose method(s):
fit(X, y[, sample_weight]), builds an AdaBoost regressor from the training set (X, y).
get_params([deep]), gets the parameters for this estimator.
predict(X), predicts the regression value for X.
score(X, y[, sample_weight]), returns the coefficient of determination R^2 of the prediction.
set_params(**params), sets the parameters of this estimator.
staged_predict(X), returns staged predictions for X.
staged_score(X, y[, sample_weight]), returns staged scores for X, y.
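The usage steps above can be sketched end to end. The synthetic dataset below is illustrative only; the estimator settings are the documented defaults:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor

# Step 2: illustrative design matrix X and response vector y
X, y = make_regression(n_samples=100, n_features=4, noise=1.0, random_state=0)

# Step 3: create the AdaBoost Regressor object with default settings
regressor_model = AdaBoostRegressor(n_estimators=50, learning_rate=1.0,
                                    loss='linear', random_state=0)

# Step 4: fit, predict, and score
regressor_model.fit(X, y)
predictions = regressor_model.predict(X)
r2 = regressor_model.score(X, y)  # coefficient of determination R^2
```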
- Example(s):
- Counter-Example(s):
sklearn.ensemble.AdaBoostClassifier.
sklearn.ensemble.BaggingClassifier.
sklearn.ensemble.BaggingRegressor.
sklearn.ensemble.ExtraTreesClassifier.
sklearn.ensemble.ExtraTreesRegressor.
sklearn.ensemble.GradientBoostingClassifier.
sklearn.ensemble.GradientBoostingRegressor.
sklearn.ensemble.IsolationForest.
sklearn.ensemble.RandomForestClassifier.
sklearn.ensemble.RandomForestRegressor.
sklearn.ensemble.RandomTreesEmbedding.
sklearn.ensemble.VotingClassifier.
- See: Decision Tree, Classification System, Regularization Task, Ridge Regression Task, Kernel-based Classification Algorithm.
References
2017a
- (Scikit Learn, 2017a) ⇒ http://scikit-learn.org/stable/modules/generated/sklearn.ensemble.AdaBoostRegressor.html
- QUOTE:
class sklearn.ensemble.AdaBoostRegressor(base_estimator=None, n_estimators=50, learning_rate=1.0, loss='linear', random_state=None)
An AdaBoost [1] regressor is a meta-estimator that begins by fitting a regressor on the original dataset and then fits additional copies of the regressor on the same dataset but where the weights of instances are adjusted according to the error of the current prediction. As such, subsequent regressors focus more on difficult cases.
This class implements the algorithm known as AdaBoost.R2 [2].
Read more in the User Guide.
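The sequential behaviour the quote describes can be observed through staged_predict, which exposes the ensemble's prediction after each boosting iteration. This is a sketch on illustrative synthetic data:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import AdaBoostRegressor

# Illustrative synthetic regression problem
X, y = make_regression(n_samples=100, n_features=4, noise=5.0, random_state=0)

model = AdaBoostRegressor(n_estimators=20, random_state=0)
model.fit(X, y)

# staged_predict yields the ensemble's prediction after each boosting
# iteration, so training error can be tracked as regressors are added
errors = [np.mean(np.abs(y - pred)) for pred in model.staged_predict(X)]
```

Each entry of errors corresponds to one fitted base regressor, so the list has one value per member of model.estimators_.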
2017c
- (Scikit Learn, 2017c) ⇒ http://scikit-learn.org/stable/modules/ensemble.html#AdaBoost Retrieved: 2017-10-22.
- QUOTE: The module sklearn.ensemble includes the popular boosting algorithm AdaBoost, introduced in 1995 by Freund and Schapire [FS1995] [1].
The core principle of AdaBoost is to fit a sequence of weak learners (i.e., models that are only slightly better than random guessing, such as small decision trees) on repeatedly modified versions of the data. The predictions from all of them are then combined through a weighted majority vote (or sum) to produce the final prediction. The data modifications at each so-called boosting iteration consist of applying weights [math]\displaystyle{ w_1, w_2,\cdots, w_N }[/math] to each of the training samples. Initially, those weights are all set to [math]\displaystyle{ w_i = 1/N }[/math], so that the first step simply trains a weak learner on the original data. For each successive iteration, the sample weights are individually modified and the learning algorithm is reapplied to the reweighted data. At a given step, those training examples that were incorrectly predicted by the boosted model induced at the previous step have their weights increased, whereas the weights are decreased for those that were predicted correctly. As iterations proceed, examples that are difficult to predict receive ever-increasing influence. Each subsequent weak learner is thereby forced to concentrate on the examples that are missed by the previous ones in the sequence [HTF][2].
AdaBoost can be used both for classification and regression problems:
- For multi-class classification, AdaBoostClassifier implements AdaBoost-SAMME and AdaBoost-SAMME.R [ZZRH2009][3].
- For regression, AdaBoostRegressor implements AdaBoost.R2 [D1997][4].
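The reweighting principle quoted above can be illustrated with a deliberately simplified boosting loop. This is a sketch, not the full AdaBoost.R2 algorithm: the weighted-median combination of predictions is omitted, and the sine-curve data is an arbitrary illustration:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

n_samples = len(y)
weights = np.full(n_samples, 1.0 / n_samples)  # initially w_i = 1/N

learners, learner_weights = [], []
for _ in range(5):
    # Weak learner: a small decision tree fit on the reweighted data
    stump = DecisionTreeRegressor(max_depth=2, random_state=0)
    stump.fit(X, y, sample_weight=weights)
    pred = stump.predict(X)

    # Linear per-sample loss, normalized to [0, 1] as in AdaBoost.R2
    err = np.abs(pred - y)
    err /= err.max()
    avg_loss = np.sum(weights * err)
    beta = avg_loss / (1.0 - avg_loss)

    # Decrease the weight of well-predicted samples; after normalization,
    # poorly-predicted samples gain relative influence
    weights *= beta ** (1.0 - err)
    weights /= weights.sum()

    learners.append(stump)
    learner_weights.append(np.log(1.0 / beta))
```

Each pass trains one weak learner on the current weights, then shifts weight toward the examples it predicted worst, so the next learner concentrates on the difficult cases.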
- ↑ Y. Freund and R. Schapire, “A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting”, 1997.
- ↑ T. Hastie, R. Tibshirani and J. Friedman, “Elements of Statistical Learning”, 2nd ed., Springer, 2009.
- ↑ J. Zhu, H. Zou, S. Rosset and T. Hastie, “Multi-class AdaBoost”, 2009.
- ↑ H. Drucker, “Improving Regressors using Boosting Techniques”, 1997.