1996 Experiments with a New Boosting Algorithm
- (Freund & Schapire, 1996) ⇒ Yoav Freund, and Robert E. Schapire. (1996). “Experiments with a New Boosting Algorithm.” In: Proceedings of the 13th International Conference on Machine Learning (ICML-1996).
Subject Headings: AdaBoost Algorithm.
Notes
Cited By
Quotes
Abstract
In an earlier paper [9], we introduced a new “boosting” algorithm called AdaBoost which, theoretically, can be used to significantly reduce the error of any learning algorithm that consistently generates classifiers whose performance is a little better than random guessing. We also introduced the related notion of a “pseudo-loss,” which is a method for forcing a learning algorithm of multi-label concepts to concentrate on the labels that are hardest to discriminate. In this paper, we describe experiments we carried out to assess how well AdaBoost, with and without pseudo-loss, performs on real learning problems.
We performed two sets of experiments. The first set compared boosting to [[Breiman’s [1] “bagging” method]] when used to aggregate various classifiers (including decision trees and single attribute-value tests). We compared the performance of the two methods on a collection of machine-learning benchmarks. In the second set of experiments, we studied in more detail the performance of boosting using a nearest-neighbor classifier on an OCR problem.
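The boosting scheme the abstract describes can be illustrated in code. Below is a minimal sketch of binary boosting in the AdaBoost style, using single attribute-value tests (“decision stumps”) as the weak learners: maintain a distribution of weights over the training examples, fit the weak learner to the weighted data on each round, and upweight the examples it misclassifies. The dataset, the round count `T`, and the stump search are illustrative choices for this sketch, not the paper’s experimental setup (which also covers the multi-class and pseudo-loss variants).

```python
import numpy as np

def best_stump(X, y, w):
    """Exhaustively pick the (feature, threshold, sign) stump with
    the lowest weighted training error under distribution w."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (j, thr, sign)
    return best, best_err

def adaboost(X, y, T=10):
    """Boost decision stumps for T rounds; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # start with the uniform distribution
    ensemble = []                    # list of (alpha, stump) pairs
    for _ in range(T):
        (j, thr, sign), err = best_stump(X, y, w)
        if err >= 0.5:               # weak learner no better than chance
            break
        err = max(err, 1e-10)        # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)   # upweight the mistakes
        w /= w.sum()                     # renormalize to a distribution
        ensemble.append((alpha, (j, thr, sign)))
    return ensemble

def predict(ensemble, X):
    """Weighted majority vote of the boosted stumps."""
    score = np.zeros(len(X))
    for alpha, (j, thr, sign) in ensemble:
        score += alpha * sign * np.where(X[:, j] <= thr, 1, -1)
    return np.where(score >= 0, 1, -1)

if __name__ == "__main__":
    # Toy 1-D interval concept: positive inside (0.3, 0.7). No single
    # stump classifies it perfectly, but a few boosted stumps do.
    X = np.array([[0.1], [0.4], [0.6], [0.9]])
    y = np.array([-1, 1, 1, -1])
    model = adaboost(X, y, T=5)
    print((predict(model, X) == y).mean())
```

Each round’s weight alpha grows as the weak learner’s weighted error shrinks, so more accurate stumps get a larger say in the final vote; this is the mechanism by which a consistently better-than-random weak learner yields a strong aggregate classifier.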
References
| Author | volume | Date Value | title | type | journal | titleUrl | doi | note | year |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Yoav Freund; Robert E. Schapire | | | Experiments with a New Boosting Algorithm | | | | | 1996 ExperimentswithaNewBoostingAlgo | 1996 |