One-vs-Rest Multiclass Classification Algorithm
A One-vs-Rest Multiclass Classification Algorithm is a binary-based supervised multiclass classification algorithm that first converts a supervised multi-class classification task (with [math]\displaystyle{ n }[/math] classes) into [math]\displaystyle{ n }[/math] binary classification tasks, one per class (that class versus all the others), and then predicts by selecting the class whose binary classifier gives the highest score.
- AKA: One-vs-All Algorithm.
- Context:
- It can be applied by a One-vs-All Multiclass Supervised Classification System.
- Example(s):
- sklearn.multiclass.OneVsRestClassifier.
- Counter-Example(s):
- a One-vs-One Multiclass Classification Algorithm.
- See: Binary-based Multiclass Supervised Classification Algorithm.
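The decomposition described above can be sketched directly: train one binary classifier per class on relabeled data, then predict with an argmax over the per-class scores. This is a minimal illustration, not a reference implementation; the function names are ours, and logistic regression is just one choice of underlying binary classifier.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_one_vs_rest(X, y):
    """Fit one binary classifier per class: class k vs. the rest."""
    classifiers = {}
    for k in np.unique(y):
        clf = LogisticRegression()
        clf.fit(X, (y == k).astype(int))  # relabel: 1 for class k, 0 otherwise
        classifiers[k] = clf
    return classifiers

def predict_one_vs_rest(classifiers, X):
    """Score each sample with every binary classifier; pick the highest score."""
    classes = sorted(classifiers)
    scores = np.column_stack([classifiers[k].decision_function(X) for k in classes])
    return np.array(classes)[np.argmax(scores, axis=1)]

# Illustrative 3-class toy data: three well-separated clusters.
X = np.array([[0.0, 0.0], [0.1, 0.2], [5.0, 5.0], [5.2, 4.8], [0.0, 5.0], [0.2, 5.1]])
y = np.array([0, 0, 1, 1, 2, 2])
models = train_one_vs_rest(X, y)          # n_classes = 3, so 3 binary classifiers
preds = predict_one_vs_rest(models, X)
```

Note that only [math]\displaystyle{ n }[/math] classifiers are trained, versus [math]\displaystyle{ n(n-1)/2 }[/math] for a one-vs-one decomposition.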
References
2013
- http://scikit-learn.org/stable/modules/generated/sklearn.multiclass.OneVsRestClassifier.html#sklearn.multiclass.OneVsRestClassifier
- One-vs-the-rest (OvR) multiclass/multilabel strategy
Also known as one-vs-all, this strategy consists in fitting one classifier per class. For each classifier, the class is fitted against all the other classes. In addition to its computational efficiency (only n_classes classifiers are needed), one advantage of this approach is its interpretability. Since each class is represented by one and only one classifier, it is possible to gain knowledge about the class by inspecting its corresponding classifier. This is the most commonly used strategy for multiclass classification and is a fair default choice.
This strategy can also be used for multilabel learning, where a classifier is used to predict multiple labels for each instance, by fitting on a 2-d matrix in which cell [i, j] is 1 if sample i has label j and 0 otherwise.
2009
- (Rifkin, 2009) ⇒ Ryan Rifkin. (2009). “Multiclass Classification.” In: MIT Course, 9.520: Statistical Learning Theory and Applications, Spring 2009.
- QUOTE: OVA and AVA are so simple that many people invented them independently. It’s hard to write papers about them. So there’s a whole cottage industry in fancy, sophisticated methods for multiclass classification. To the best of my knowledge, choosing properly tuned regularization classifiers (RLSC, SVM) as your underlying binary classifiers and using one-vs-all (OVA) or all-vs-all (AVA) works as well as anything else you can do. If you actually have to solve a multiclass problem, I strongly urge you to simply use OVA or AVA, and not worry about anything else. The choice between OVA and AVA is largely computational.
2004
- (Rifkin & Klautau, 2004) ⇒ Ryan Rifkin, and Aldebaro Klautau. (2004). “In Defense of One-Vs-All Classification.” In: The Journal of Machine Learning Research, 5.