Averaged One-Dependence Estimator
An Averaged One-Dependence Estimator is a probabilistic classification algorithm that was originally developed to address the attribute-independence problem of the popular naive Bayes classifier.
- See: Bayesian Network; Semi-Naïve Bayesian Learning; Tree-Augmented Naïve Bayes; Naive Bayes Classifier.
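AODE relaxes naive Bayes's independence assumption by building one one-dependence estimator per attribute, each conditioning every other attribute on both the class and that "super-parent" attribute, and then averaging the estimators' joint-probability scores. The sketch below, for categorical features, is a minimal illustrative implementation assuming Laplace smoothing and a frequency threshold m for super-parents; class and method names are hypothetical, not from the paper.

```python
import numpy as np
from collections import defaultdict

class AODE:
    """Sketch of an Averaged One-Dependence Estimator (categorical data).

    For each attribute value x_i seen at least m times, score
    P(y, x_i) * prod_{j != i} P(x_j | y, x_i), then average these
    one-dependence scores over i and predict the argmax class.
    """

    def __init__(self, m=1):
        self.m = m  # minimum frequency for a value to act as super-parent

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        self.n, self.d = X.shape
        self.classes_ = np.unique(y)
        self.vals_ = [np.unique(X[:, j]) for j in range(self.d)]
        self.c_i = defaultdict(int)    # count(attr i has value v)
        self.c_yi = defaultdict(int)   # count(class c, attr i = v)
        self.c_yij = defaultdict(int)  # count(class c, attr i = v, attr j = w)
        for x, c in zip(X, y):
            for i in range(self.d):
                self.c_i[(i, x[i])] += 1
                self.c_yi[(c, i, x[i])] += 1
                for j in range(self.d):
                    self.c_yij[(c, i, x[i], j, x[j])] += 1
        return self

    def predict_one(self, x):
        best, best_score = None, -1.0
        for c in self.classes_:
            score = 0.0
            for i in range(self.d):
                if self.c_i[(i, x[i])] < self.m:
                    continue  # skip infrequent super-parents
                # Laplace-smoothed estimate of P(y, x_i)
                p = (self.c_yi[(c, i, x[i])] + 1.0) / (
                    self.n + len(self.classes_) * len(self.vals_[i]))
                for j in range(self.d):
                    if j == i:
                        continue
                    # Laplace-smoothed estimate of P(x_j | y, x_i)
                    p *= (self.c_yij[(c, i, x[i], j, x[j])] + 1.0) / (
                        self.c_yi[(c, i, x[i])] + len(self.vals_[j]))
                score += p
            if score > best_score:
                best_score, best = score, c
        return best
```

Averaging over all qualifying super-parents is what keeps AODE's bias low without the model-selection cost of picking a single dependence structure, at the modest extra computation the excerpt below mentions.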
References
2014
- (Wikipedia, 2014) ⇒ http://en.wikipedia.org/wiki/Averaged_one-dependence_estimators Retrieved:2014-10-30.
- Averaged one-dependence estimators (AODE) is a probabilistic classification learning technique. It was developed to address the attribute-independence problem of the popular naive Bayes classifier. It frequently develops substantially more accurate classifiers than naive Bayes at the cost of a modest increase in the amount of computation. [1]
- ↑ Webb, G. I., J. Boughton, and Z. Wang (2005). "Not So Naive Bayes: Aggregating One-Dependence Estimators". Machine Learning, 58(1), 5–24.
2011
- (Zheng & Webb, 2011a) ⇒ Fei Zheng; Geoffrey I. Webb. (2011). “Average One-Dependence Estimators.” In: (Sammut & Webb, 2011) p.63
2005
- (Webb et al., 2005) ⇒ G. I. Webb, J. Boughton, and Z. Wang (2005). “Not So Naive Bayes: Aggregating One-Dependence Estimators.” In: Machine Learning, 58(1), 5–24.