Rotation Forests System
A Rotation Forests System is a decision tree ensemble learning system that applies a Rotation Forests Learning Algorithm to solve a Rotation Forests Learning Task.
- AKA: Rotation Forests Training System.
- Example(s):
- …
- Counter-Example(s):
- See: Decision Tree, Principal Component Analysis.
References
2017a
- (Sammut & Webb, 2017) ⇒ Claude Sammut and Geoffrey I. Webb. (2017). "Rotation Forests." In: "Encyclopedia of Machine Learning and Data Mining" (Editors: Claude Sammut, Geoffrey I. Webb), p. 1116.
- QUOTE: Rotation Forests is an ensemble learning technique. It is similar to the Random Forests approach to building decision tree ensembles. In the first step, the original feature set is split randomly into K disjoint subsets. Next, principal components analysis is used to extract n principal component dimensions from each of the K subsets. These are then pooled, and the original data projected linearly into this new feature space. A tree is then built from this data in the usual manner. This process is repeated to create an ensemble of trees, each time with a different random split of the original feature set.
As the tree learning algorithm builds the classification regions using hyperplanes parallel to the feature axes, a small rotation of the axes may lead to a very different tree. The effect of rotating the axes is that classification regions of high accuracy can be constructed with far fewer trees than in Bagging and Adaboost.
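The procedure described in the quotation above can be sketched in a few lines. The following is a minimal illustration, assuming scikit-learn's PCA and DecisionTreeClassifier and integer class labels; the function names and parameters (fit_rotation_forest, n_trees, n_subsets) are illustrative, and refinements of the original Rotation Forest algorithm, such as bootstrap sampling before each PCA, are omitted.
```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier

def fit_rotation_forest(X, y, n_trees=10, n_subsets=3, random_state=0):
    """Illustrative sketch of rotation-forest training (not the full published algorithm)."""
    rng = np.random.default_rng(random_state)
    n_features = X.shape[1]
    ensemble = []
    for _ in range(n_trees):
        # Step 1: split the feature indices randomly into K disjoint subsets.
        subsets = np.array_split(rng.permutation(n_features), n_subsets)
        # Step 2: run PCA on each subset and pool the component axes into one
        # block-structured rotation matrix over the full feature space.
        rotation = np.zeros((n_features, n_features))
        for subset in subsets:
            pca = PCA().fit(X[:, subset])
            rotation[np.ix_(subset, subset)] = pca.components_.T
        # Step 3: project the data linearly into the rotated space and grow
        # an ordinary decision tree on the projected data.
        tree = DecisionTreeClassifier().fit(X @ rotation, y)
        ensemble.append((rotation, tree))
    return ensemble

def predict_rotation_forest(ensemble, X):
    # Majority vote over the rotated trees (assumes integer class labels).
    votes = np.stack([tree.predict(X @ rotation) for rotation, tree in ensemble])
    return np.array([np.bincount(col.astype(int)).argmax() for col in votes.T])
```
Each tree in this sketch still splits on hyperplanes parallel to its own axes, but those axes differ from tree to tree because of the random rotations, which is the source of ensemble diversity the quotation refers to.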
2017b
- (Wikipedia, 2017) ⇒ https://en.wikipedia.org/wiki/Decision_tree_learning#Decision_tree_types Retrieved:2017-10-15.
- (...) Some techniques, often called ensemble methods, construct more than one decision tree:
- Boosted trees - incrementally building an ensemble by training each new tree to emphasize the training instances previously mis-modeled. A typical example is AdaBoost. These can be used for regression-type and classification-type problems. [1] [2]
- Bootstrap aggregated (or bagged) decision trees - an early ensemble method that builds multiple decision trees by repeatedly resampling the training data with replacement and voting the trees for a consensus prediction. [3]
- A random forest classifier is a specific type of bootstrap aggregating.
- Rotation forest - in which every decision tree is trained by first applying principal component analysis (PCA) on a random subset of the input features. [4]
A special case of a decision tree is a decision list, which is a one-sided decision tree, so that every internal node has exactly 1 leaf node and exactly 1 internal node as a child (except for the bottommost node, whose only child is a single leaf node). While less expressive, decision lists are arguably easier to understand than general decision trees due to their added sparsity, and they permit non-greedy learning methods and monotonic constraints to be imposed.
- ↑ Friedman, J. H. (1999). Stochastic gradient boosting. Stanford University.
- ↑ Hastie, T., Tibshirani, R., Friedman, J. H. (2001). The elements of statistical learning: Data mining, inference, and prediction. New York: Springer Verlag.
- ↑ Breiman, L. (1996). Bagging Predictors. Machine Learning, 24: pp. 123-140.
- ↑ Rodriguez, J.J. and Kuncheva, L.I. and Alonso, C.J. (2006), Rotation forest: A new classifier ensemble method, IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(10):1619-1630.
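For context on the other tree ensembles named in the Wikipedia excerpt above, the brief sketch below runs bagged trees, AdaBoost, and a random forest through their standard scikit-learn implementations; scikit-learn ships no rotation-forest estimator, so the sketch under the 2017a entry stands in for that case. The dataset and hyperparameter choices here are illustrative only.
```python
from sklearn.datasets import load_iris
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# The three ensemble families listed in the excerpt, with decision trees
# (or stumps, AdaBoost's default) as the base learners.
ensembles = {
    "bagged trees": BaggingClassifier(DecisionTreeClassifier(), n_estimators=50),
    "AdaBoost": AdaBoostClassifier(n_estimators=50),
    "random forest": RandomForestClassifier(n_estimators=50),
}

for name, model in ensembles.items():
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {score:.3f}")
```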