sklearn.tree.ExtraTreeClassifier
A sklearn.tree.ExtraTreeClassifier is a Classification Extra-Trees Learning System within the sklearn.tree module.
- AKA: ExtraTreeClassifier, tree.ExtraTreeClassifier.
- Context
- Usage:
- 1) Import Classification Extra-Trees Learning System from scikit-learn:
from sklearn.tree import ExtraTreeClassifier
- 2) Create design matrix X and response vector Y.
- 3) Create Extra-Trees Classifier object:
ETclf = ExtraTreeClassifier(criterion='gini', splitter='random'[, max_depth=None, min_samples_split=2, min_samples_leaf=1, ...])
- 4) Choose method(s) (a minimal usage sketch follows this list):
  - ETclf.apply(X[, check_input]), returns the leaf index for each sample.
  - ETclf.decision_path(X[, check_input]), returns the decision path in the tree.
  - ETclf.fit(X, y[, sample_weight, check_input, ...]), builds a decision tree classifier from the training set (X, y).
  - ETclf.get_params([deep]), returns the parameters for this estimator.
  - ETclf.predict(X[, check_input]), predicts the class for X.
  - ETclf.predict_log_proba(X), predicts class log-probabilities of the input samples X.
  - ETclf.predict_proba(X[, check_input]), predicts class probabilities of the input samples X.
  - ETclf.score(X, y[, sample_weight]), returns the mean accuracy on the given test data and labels.
  - ETclf.set_params(**params), sets the parameters of this estimator.
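Below is a minimal end-to-end sketch of steps 1)–4); the synthetic dataset, the train/test split, and the variable names (e.g. X_train, predictions) are illustrative assumptions rather than part of the scikit-learn documentation:

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import ExtraTreeClassifier

# 2) Design matrix X and response vector Y (synthetic data, for illustration only).
X, Y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, random_state=0)

# 3) Extra-Trees Classifier object (scikit-learn defaults written out explicitly).
ETclf = ExtraTreeClassifier(criterion='gini', splitter='random', max_depth=None,
                            min_samples_split=2, min_samples_leaf=1, random_state=0)

# 4) Fit on the training set, then call some of the methods listed above.
ETclf.fit(X_train, Y_train)
predictions = ETclf.predict(X_test)            # predicted class for each test sample
probabilities = ETclf.predict_proba(X_test)    # class-membership probabilities
leaf_indices = ETclf.apply(X_test)             # index of the leaf each sample falls into
accuracy = ETclf.score(X_test, Y_test)         # mean accuracy on the held-out data
print(accuracy)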
- Example(s):
- …
- Counter-Example(s):
- See: Decision Tree, Classification System, Regularization Task, Ridge Regression Task, Kernel-based Classification Algorithm.
References
2017
- (Scikit-Learn, 2017) ⇒ http://scikit-learn.org/stable/modules/generated/sklearn.tree.ExtraTreeClassifier.html Retrieved:2017-10-22
- QUOTE:
class sklearn.tree.ExtraTreeClassifier(criterion='gini', splitter='random', max_depth=None, min_samples_split=2, min_samples_leaf=1, min_weight_fraction_leaf=0.0, max_features='auto', random_state=None, max_leaf_nodes=None, min_impurity_decrease=0.0, min_impurity_split=None, class_weight=None)
An extremely randomized tree classifier.
Extra-trees differ from classic decision trees in the way they are built. When looking for the best split to separate the samples of a node into two groups, random splits are drawn for each of the max_features randomly selected features and the best split among those is chosen. When max_features is set to 1, this amounts to building a totally random decision tree.
Warning: Extra-trees should only be used within ensemble methods.
Read more in the User Guide.
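As an illustrative sketch of that warning (the dataset, the BaggingClassifier settings, and the variable names below are assumptions for the example, not prescribed by the quoted documentation), a single ExtraTreeClassifier can be wrapped in a bagging ensemble:

from sklearn.datasets import load_iris
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import ExtraTreeClassifier

X, y = load_iris(return_X_y=True)

# A single extremely randomized tree; with max_features=1 every split would be totally random.
base_tree = ExtraTreeClassifier(random_state=0)

# Per the warning above, extra-trees are meant to be used inside an ensemble,
# e.g. by bagging many such trees over bootstrap samples.
ensemble = BaggingClassifier(base_tree, n_estimators=100, random_state=0)

print(cross_val_score(ensemble, X, y, cv=5).mean())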