Supervised Categorical-Value Prediction Task
A Supervised Categorical-Value Prediction Task is a data-driven classification task that is a supervised learning task (that requires a labeled record dataset).
- AKA: Classification Learning, Inductive Classification, Supervised Classification Task.
- Context:
- Input: a Labeled Record Dataset (with a target attribute that is a nominal attribute).
- Output: one or more predicted class labels (for each testing record).
- Optional Output: a Learned Classification Function.
- Optional Output: a Confidence Score (for each value).
- It can range from being a Fully-Supervised Classification Task to being a Semi-Supervised Classification Task, depending on the presence of unlabeled learning records.
- It can range from being a Ranking Supervised Classification Task to being a Probabilistic Supervised Classification Task, depending on the requirement for a probability value.
- It can range from being a Supervised Two-Class Classification Task to being a Supervised Multi-Class Classification Task, depending on the class set size.
- It can range from being a Univariate Supervised Classification Task to being a Multivariate Supervised Classification Task, depending on the feature set size.
- It can range from being a Supervised One-Label Classification Task to being a Supervised Multi-Label Classification Task, depending on the number of class labels to predict.
- It can range from being an Online Supervised Classification Task to being an Offline Supervised Classification Task, depending on the incremental availability of data.
- It can range from being an IID Supervised Classification Task to being a Non-IID Supervised Classification Task (such as a temporal supervised classification task), depending on whether it receives IID data.
- It can range from being a Model-based Supervised Classification Task to being a Model-free Supervised Classification Task, depending on whether a classification model is required.
- It can range from being a Large Sparse Data Classification Task to being ...
- It can be solved by a Supervised Classification System (that implements a Supervised Classification Algorithm), as illustrated in the sketch below.
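The following is a minimal sketch of the task's input/output contract described above, assuming a tiny hand-made labeled record dataset and scikit-learn's DecisionTreeClassifier as the supervised classification system; the attribute names and values are hypothetical and chosen only for illustration.

```python
# Minimal sketch of a supervised categorical-value prediction task (illustrative only).
# Assumes scikit-learn is installed; the records and attributes below are hypothetical.
from sklearn.tree import DecisionTreeClassifier

# Input: a labeled record dataset whose target attribute is a nominal attribute.
X_train = [[25, 40_000], [47, 85_000], [52, 30_000], [33, 62_000]]  # feature vectors (age, income)
y_train = ["no", "yes", "no", "yes"]                                # nominal class labels

# Optional output: a learned classification function (here, a decision tree model).
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Output: one predicted class label for each testing record.
X_test = [[29, 52_000], [61, 28_000]]
predicted_labels = model.predict(X_test)         # an array of nominal labels, one per testing record

# Optional output: a confidence score for each class value of each testing record.
confidence_scores = model.predict_proba(X_test)  # one probability per class value, per record

print(predicted_labels)
print(model.classes_, confidence_scores)
```

Any other supervised classification system (e.g. a logistic regression learner or a rule learner) would fit the same input/output contract.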
- Example(s):
- a Benchmark Supervised Classification Task, such as one on the UCI Census dataset.
- a Logistic Regression Training Task, which requires a logistic function.
- a Classification Rule Learning Task.
- a Supervised Text Document Classification Task.
- a Supervised Spam Email Classification Task.
- a Supervised Optical Character Classification Task.
- a Classification Tree Learning Task.
- …
- Counter-Example(s):
- an Unsupervised Classification Task, such as a clustering task (which does not require labeled records).
- a Supervised Numeric-Value Prediction Task, such as a regression task (whose target attribute is a numeric attribute).
- See: Simple Output Learning Task.
References
2000
- (Witten & Frank, 2000) ⇒ Ian H. Witten, and Eibe Frank. (2000). “Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations.” Morgan Kaufmann.
- QUOTE: Four basically different styles of learning appear in data mining applications. In classification learning, a learning scheme takes a set of classified examples from which it is expected to learn a way of classifying unseen examples...
Classification learning is sometimes called supervised because, in a sense, the scheme operates under supervision by being provided with the actual outcome for each of the training examples ...
1996
- (Domingos, 1996) ⇒ Pedro Domingos. (1996). “Unifying Instance-based and Rule-based Induction.” In: Machine Learning, 24(2). doi:10.1023/A:1018006431188
- QUOTE: Inductive learning is the explicit or implicit creation of general concept or class descriptions from examples. Many induction problems can be described as follows. A training set of preclassified examples is given, where each example (also called observation or case) is described by a vector of features or attribute values, and the goal is to form a description that can be used to classify previously unseen examples with high accuracy.
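The setting described in this quote can be stated compactly as follows (the notation here is ours, not Domingos'): given a training set $D = \{(\mathbf{x}_1, y_1), \ldots, (\mathbf{x}_n, y_n)\}$, where each $\mathbf{x}_i$ is a vector of feature/attribute values and each class label $y_i$ is drawn from a finite set $\mathcal{Y}$, the task is to induce a classification function $f$ with $f(\mathbf{x}) \in \mathcal{Y}$ that assigns class labels to previously unseen examples with high accuracy.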