Instance-based Supervised Learning Algorithm
An Instance-based Supervised Learning Algorithm is a supervised learning algorithm that makes predictions directly from the stored learning instances themselves, rather than generalizing into a representation more abstract than the instances (such as a rule set or a fitted model).
- Context:
- It can range from being a Lazy Instance-based Learning Algorithm to being an Eager Instance-based Learning Algorithm.
- Example(s):
- a k-Nearest Neighbor (kNN) Classification Algorithm (see the minimal sketch after this list).
- Counter-Example(s):
- a Model-based Learning Algorithm, such as a Decision Tree (DTree) Learning Algorithm or a Linear Regression Algorithm.
- See: Model-based Learning Algorithm.
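To make the definition concrete, the following is a minimal sketch of a 1-nearest-neighbor classifier in Python. The function names and the choice of Euclidean distance are illustrative assumptions, not taken from any of the references below; the point is that the stored instances themselves play the role of the "model", and all the work happens at query time.
```python
import math

def euclidean_distance(a, b):
    # Plain Euclidean distance over numeric feature vectors (an assumed choice).
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def one_nn_classify(training_instances, query):
    # training_instances: list of (feature_vector, class_label) pairs.
    # "Training" is just storage; the prediction is derived at query time.
    nearest_features, nearest_label = min(
        training_instances,
        key=lambda pair: euclidean_distance(pair[0], query),
    )
    return nearest_label

# Usage sketch: the stored instances are consulted directly for each query.
training = [((1.0, 1.0), "A"), ((1.2, 0.9), "A"), ((5.0, 5.1), "B")]
print(one_nn_classify(training, (4.8, 5.0)))  # -> "B"
```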
References
2011
- (Keogh, 2011a) ⇒ Eamonn Keogh. (2011). “Instance-Based Learning.” In: (Sammut & Webb, 2011) p.549
- QUOTE: Instance-based learning refers to a family of techniques for classification and regression, which produce a class label/prediction based on the similarity of the query to its nearest neighbor(s) in the training set. In explicit contrast to other methods such as decision trees and neural networks, instance-based learning algorithms do not create an abstraction from specific instances. Rather, they simply store all the data, and at query time derive an answer from an examination of the query’s nearest neighbor(s).
Somewhat more generally, instance-based learning can refer to a class of procedures for solving new problems based on the solutions of similar past problems.
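As a hedged illustration of the regression case Keogh mentions (a sketch, not code from the reference), the prediction for a query can be taken as the average target value of its k nearest stored instances; the function and parameter names below, and the Euclidean distance, are assumptions made for the example.
```python
import math
from typing import List, Sequence, Tuple

def knn_regress(
    stored: List[Tuple[Sequence[float], float]],  # (feature_vector, target) pairs
    query: Sequence[float],
    k: int = 3,
) -> float:
    # No abstraction is built from the data: the stored pairs are examined
    # directly when the query arrives.
    def dist(x: Sequence[float]) -> float:
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, query)))

    neighbors = sorted(stored, key=lambda pair: dist(pair[0]))[:k]
    # Predict the mean target value of the k nearest stored instances.
    return sum(target for _, target in neighbors) / len(neighbors)

# Usage sketch:
stored = [((0.0,), 1.0), ((1.0,), 2.0), ((2.0,), 3.0), ((10.0,), 20.0)]
print(knn_regress(stored, (1.5,), k=2))  # averages the two nearest targets: 2.5
```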
1999
- (Melli, 1999a) ⇒ Gabor Melli. (1999). “A Lazy Model-based Algorithm for On-Line Classification.” In: Proceedings of PKDD-1999.
1997
- (Mitchell, 1997) ⇒ Tom M. Mitchell. (1997). “Machine Learning.” McGraw-Hill.
1996
- (Domingos, 1996) ⇒ Pedro Domingos. (1996). “Unifying Instance-based and Rule-based Induction.” In: Machine Learning, 24(2). doi:10.1023/A:1018006431188
- QUOTE: Instance-based learning (Cover & Hart, 1967; Duda & Hart, 1973; Aha et al., 1991; Cost & Salzberg, 1993; Aha, in press) is founded on a direct application of the similarity assumption. In the simplest case, learning is performed by storing all the observed examples. A new example (or “test case”) is classified by finding the nearest stored example according to some similarity function, and assigning the latter’s class to the former. The stored examples used to classify new cases are referred to as instances or exemplars.
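The “similarity function” in this passage need not be a geometric distance. As an illustrative sketch only (not an implementation from Domingos or from Cost & Salzberg), a simple overlap count over symbolic feature values can serve as the similarity function in the simplest stored-examples scheme described above.
```python
def overlap_similarity(a, b):
    # Number of symbolic feature positions on which the two examples agree.
    return sum(1 for x, y in zip(a, b) if x == y)

def classify_by_similarity(stored_examples, test_case):
    # stored_examples: list of (symbolic_feature_tuple, class_label) pairs.
    # The most similar stored instance lends its class to the test case.
    _, label = max(
        stored_examples,
        key=lambda pair: overlap_similarity(pair[0], test_case),
    )
    return label

# Usage sketch with made-up symbolic instances:
stored_examples = [
    (("red", "round", "small"), "apple"),
    (("yellow", "long", "small"), "banana"),
]
print(classify_by_similarity(stored_examples, ("red", "round", "large")))  # -> "apple"
```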
1993
- (Quinlan, 1993) ⇒ J. Ross Quinlan. (1993). “Combining Instance-based and Model-based Learning.” In: Proceedings of the Tenth International Conference on Machine Learning.
- (Cost & Salzberg, 1993) ⇒ S. Cost, and S. Salzberg. (1993). “A Weighted Nearest Neighbor Algorithm for Learning with Symbolic Features.” In: Machine Learning, 10.
1991
- (Aha et al., 1991) ⇒ David W. Aha, Dennis Kibler, and Marc K. Albert. (1991). “Instance-based Learning Algorithms.” In: Machine Learning, 6(1). doi:10.1023/A:1022689900470
1967
- (Cover & Hart, 1967) ⇒ T. M. Cover, and P. E. Hart. (1967). “Nearest Neighbor Pattern Classification.” In: IEEE Transactions on Information Theory, 13.