Random Subspace Method Algorithm
A Random Subspace Method (RSM) Algorithm is an Ensemble Learning Algorithm that increases an ensemble's diversity by restricting each classifier to a different random subset of the full feature space.
- AKA: Random Subspaces, Random Subspace Method, Attribute Bagging Algorithm, Feature Bagging Algorithm.
- Context:
- It can be implemented by a Random Subspace Method System to solve a Random Subspace Method Task.
- Example(s):
- a Random Forests Algorithm, which combines the random subspace method with bagging.
- Counter-Example(s):
- a Bagging Algorithm, which resamples training instances rather than features.
- See: Decision Tree, Ensemble Learning Task, Ensemble Learning System, Machine Learning System, Classification, Outlier Detection Algorithm, Ensemble Learning Estimator, Machine Learning.
References
2019
- (Wikipedia, 2019) ⇒ https://en.wikipedia.org/wiki/Random_subspace_method Retrieved:2019-8-18.
- In machine learning the random subspace method,[1] also called attribute bagging or feature bagging, is an ensemble learning method that attempts to reduce the correlation between estimators in an ensemble by training them on random samples of features instead of the entire feature set (...)
An ensemble of models employing the random subspace method can be constructed using the following algorithm:
- Let the number of training points be N and the number of features in the training data be D.
- Choose L to be the number of individual models in the ensemble.
- For each individual model l, choose n_l (n_l < N) to be the number of input points for l. It is common to have only one value of n_l for all the individual models.
- For each individual model l, choose d_l (d_l < D) to be the number of features for l. It is common to have only one value of d_l for all the individual models.
- For each individual model l, create a training set by choosing d_l features from D with replacement and train the model.
- Now, to apply the ensemble model to an unseen point, combine the outputs of the L individual models by majority voting or by combining the posterior probabilities.
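The steps above map directly onto a short implementation. The following is a minimal Python sketch (NumPy plus a scikit-learn decision tree as the base learner); all function and parameter names are illustrative, and it draws features without replacement, since a repeated feature adds nothing to a subspace:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_random_subspace_ensemble(X, y, L=10, n_l=None, d_l=None, seed=0):
    """Train L base models, each on n_l sampled points and d_l sampled features."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    n_l = n_l if n_l is not None else N       # input points per model
    d_l = d_l if d_l is not None else D // 2  # rule of thumb: half the features
    ensemble = []
    for _ in range(L):
        rows = rng.choice(N, size=n_l, replace=False)  # sample input points
        cols = rng.choice(D, size=d_l, replace=False)  # sample a feature subspace
        model = DecisionTreeClassifier(random_state=0)
        model.fit(X[np.ix_(rows, cols)], y[rows])
        ensemble.append((model, cols))
    return ensemble

def predict_majority(ensemble, X):
    """Combine the L individual outputs by majority voting."""
    votes = np.stack([model.predict(X[:, cols]) for model, cols in ensemble])
    # per test point, return the most frequent label (assumes non-negative int labels)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

Each stored model carries the column indices of its subspace, so prediction projects the test points onto the same features before voting.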
2017
- (Sammut & Webb, 2017) ⇒ Claude Sammut, and Geoffrey I. Webb. (2017). “Random Subspace Method”. In: “Encyclopedia of Machine Learning and Data Mining”. DOI: 10.1007/978-1-4899-7687-1_696
- QUOTE: The random subspace method is an ensemble learning technique. The principle is to increase diversity between members of the ensemble by restricting classifiers to work on different random subsets of the full feature space. Each classifier learns with a subset of size n, chosen uniformly at random from the full set of size N. Empirical studies have suggested good results can be obtained with the rule-of-thumb to choose n = N∕2 features. The method is generally found to perform best when there are a large number of features (large N), and the discriminative information is spread across them. The method can underperform in the converse situation, when there are few informative features, and a large number of noisy/irrelevant features. Random Forests is an algorithm combining RSM with the Bagging algorithm, which can provide significant gains over each used separately.
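For comparison, scikit-learn's BaggingClassifier reproduces the random subspace method when bootstrapping of training points is disabled and each model is given a random fraction of the features. A brief usage sketch (dataset and parameter values are illustrative; the estimator keyword assumes scikit-learn ≥ 1.2, where it replaced base_estimator):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=40, n_informative=20,
                           random_state=0)

rsm = BaggingClassifier(
    estimator=DecisionTreeClassifier(),
    n_estimators=50,
    bootstrap=False,    # keep every training point: subspace sampling only
    max_features=0.5,   # each model sees a random half of the features (n = N/2)
    random_state=0,
).fit(X, y)
print(rsm.score(X, y))
```

Re-enabling bootstrap=True resamples the training points as well, giving the RSM-plus-Bagging combination that the quote associates with Random Forests.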
2009
- (Li & Zhao, 2009) ⇒ Xiaoye Li, and Hongyu Zhao (2009). “Weighted Random Subspace Method for High Dimensional Data Classification”. In: Statistics and Its Interface, 2(2): 153.
- [1] Ho, Tin Kam (1998). “The Random Subspace Method for Constructing Decision Forests” (PDF). In: IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8): 832–844. doi:10.1109/34.709601.