Sequence-Member Tagging Task
A sequence-member tagging task is a sequential data point classification task that is restricted to classifying each member of a string structure with a tag from a tag set.
- AKA: String Labeling, Sequence Item Classification, Sequence Labeling Task.
- Context:
- Input: a Sequence (of Sequence Members) and a Tag Set.
- Output: a Tagged Sequence.
- It can be solved by a String Tagging System (a Tagger) that implements a String Tagging Algorithm.
- It can range from being a Supervised Sequence-Member Tagging Task to being an Unsupervised Sequence-Member Tagging Task.
- It can be used to solve a Sequence Segmentation Task, if there are few segment types. (Sun et al., 2008)
- Example(s):
- any Text Token Classification Task.
- a Part-of-Speech Tagging Task, where each token is mapped to a Part-of-Speech Role.
- a Text Chunking Task, where each token is mapped to a Tag that indicates the start and end of a Subsequence (and ensures that invalid tag orders are not generated).
- a Data Stream Tagging Task/Data Stream Classification Task.
- a Constrained Sequential Labeling (CSL) Task (Chen et al., 2014).
- a Sequential Labeling with Latent Variables Task.
- …
- Counter-Example(s):
- a Sequence Segmentation Task.
- a Sequence Classification Task (which maps an entire sequence to a sequence category), such as: a Named Entity Mention Classification Task.
- See: Data Stream Mining Task, Online Learning Task.
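The input/output structure above can be illustrated with a minimal sketch: each member of an input sequence is mapped to one tag from a fixed tag set, producing a tagged sequence of the same length. The toy lexicon and tag set below are hypothetical stand-ins for a learned tagging model.

```python
# A minimal sequence-member tagger: map every member of the input
# sequence to a tag from TAG_SET, yielding a tagged sequence of the
# same length. The lexicon is an illustrative assumption.
TAG_SET = {"DET", "NOUN", "VERB"}

LEXICON = {  # hypothetical per-token tag lookup
    "the": "DET",
    "dog": "NOUN",
    "barks": "VERB",
}

def tag_sequence(tokens):
    """Pair every sequence member with a tag; unknown tokens default to 'NOUN'."""
    return [(tok, LEXICON.get(tok, "NOUN")) for tok in tokens]

print(tag_sequence(["the", "dog", "barks"]))
# → [('the', 'DET'), ('dog', 'NOUN'), ('barks', 'VERB')]
```

Note that, unlike a sequence classification task, the tagger returns one tag per member rather than a single category for the whole sequence.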
References
2014
- (Chen et al., 2014) ⇒ Sheng Chen, Alan Fern, and Sinisa Todorovic (2014). "Multi-Object Tracking via Constrained Sequential Labeling". In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.
- QUOTE: In this work, rather than heuristically simplify the mid-level labeling problem to an efficiently solvable framework, we instead develop a greedy labeling approach that can operate directly on the original (unsimplified) problem. In particular, our approach conducts tracking by sequentially labeling mid-level video features with object identifiers, under hard constraints. Our new algorithm, called constrained sequential labeling (CSL), uses a flexible cost function to sequentially assign labels while directly respecting the implications of hard constraints (e.g., a person cannot be at two distinct locations).
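The greedy-with-hard-constraints idea can be sketched as follows: items are labeled one at a time with the cheapest label whose hard constraint is satisfied. The cost function and the "no two adjacent items share a label" constraint below are illustrative assumptions, not the authors' actual cost function or tracking constraints.

```python
# A hedged sketch of greedy constrained sequential labeling (in the
# spirit of CSL): for each item in turn, keep only the labels whose
# hard constraint holds, then pick the cheapest one.
def greedy_constrained_label(items, labels, cost, ok):
    assigned = []
    for item in items:
        feasible = [l for l in labels if ok(assigned, l)]
        assigned.append(min(feasible, key=lambda l: cost(item, l)))
    return assigned

# Illustrative hard constraint: adjacent items may not share a label.
no_repeat = lambda prev, l: not prev or prev[-1] != l
cost = lambda item, l: abs(item - l)  # toy cost: numeric distance

print(greedy_constrained_label([1, 1, 3], [1, 2, 3], cost, no_repeat))
# → [1, 2, 3]  (the second item cannot reuse label 1, so it takes 2)
```

The constraint is enforced before cost minimization, so an infeasible label is never chosen even when it is the cheapest.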
2010
- (Wu et al., 2010) ⇒ Yu-Chieh Wu, Yue-Shi Lee, Jie-Chi Yang, and Show-Jane Yen (2010, December). "An Integrated Deterministic and Nondeterministic Inference Algorithm for Sequential Labeling". In: Asia Information Retrieval Symposium. DOI:10.1007/978-3-642-17187-1_21
- QUOTE: In this paper, we present a new search algorithm for sequential labeling tasks based on the conditional Markov models (CMMs) frameworks.
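Search over tag sequences in Markov-style models is commonly done with Viterbi decoding, which finds the tag sequence maximizing a product of local scores. The sketch below is generic Viterbi with toy score tables, not the specific deterministic/nondeterministic inference algorithm of the paper.

```python
# Generic Viterbi decoding for sequential labeling: track, for each tag,
# the best-scoring tag sequence ending in that tag, and extend it one
# observation at a time. Score tables here are illustrative toy values.
def viterbi(obs, tags, emit, trans, start):
    # best[t] = (score, path) of the best tag sequence ending in tag t
    best = {t: (start[t] * emit[t][obs[0]], [t]) for t in tags}
    for o in obs[1:]:
        best = {
            t: max(
                ((s * trans[p][t] * emit[t][o], path + [t])
                 for p, (s, path) in best.items()),
                key=lambda x: x[0],
            )
            for t in tags
        }
    return max(best.values(), key=lambda x: x[0])[1]

tags = ["N", "V"]
start = {"N": 0.6, "V": 0.4}
trans = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.8, "V": 0.2}}
emit = {"N": {"dogs": 0.7, "bark": 0.2}, "V": {"dogs": 0.1, "bark": 0.8}}
print(viterbi(["dogs", "bark"], tags, emit, trans, start))  # → ['N', 'V']
```

Dynamic-programming search of this kind considers all tag sequences implicitly, in contrast to the greedy left-to-right labeling sketched earlier.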
2009
- (Sun & Tsujii, 2009) ⇒ Xu Sun, and Junichi Tsujii (2009, March). "Sequential Labeling with Latent Variables: An Exact Inference Algorithm and its Efficient Approximation". In: Proceedings of the 12th Conference of the European Chapter of the ACL (EACL 2009).
2003
- (Sha & Pereira, 2003a) ⇒ Fei Sha, and Fernando Pereira. (2003). “Shallow Parsing with Conditional Random Fields.” In: Proceedings of the 2003 Conference of the North American Chapter of the Association for Computational Linguistics on Human Language Technology (HLT-NAACL 2003). doi:10.3115/1073445.1073473
- QUOTE: Sequence analysis tasks in language and biology are often described as mappings from input sequences to sequences of labels encoding the analysis. In language processing, examples of such tasks include part-of-speech tagging, named-entity recognition, and the task we shall focus on here, shallow parsing.
2002
- (Collins, 2002a) ⇒ Michael Collins. (2002). “Ranking Algorithms for Named–Entity Extraction: Boosting and the voted perceptron.” In: Proceedings of the ACL Conference (ACL 2002).
- QUOTE: The problem can be framed as a tagging task – to tag each word as being either the start of an entity, a continuation of an entity, or not to be part of an entity at all (we will use the tags S, C and N respectively for these three cases).
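The S/C/N scheme Collins describes can be sketched as a conversion from entity spans to per-word tags: each word is tagged S (start of an entity), C (continuation), or N (not part of an entity). The helper below is a straightforward reading of the quote, not the paper's actual code.

```python
# Convert entity spans into the S/C/N per-word tagging scheme:
# S = start of an entity, C = continuation, N = not in any entity.
def spans_to_scn(n_words, spans):
    """spans: list of (start, end) word-index pairs, end exclusive."""
    tags = ["N"] * n_words
    for start, end in spans:
        tags[start] = "S"
        for i in range(start + 1, end):
            tags[i] = "C"
    return tags

words = ["Profits", "at", "Boeing", "Co.", "rose"]
print(spans_to_scn(len(words), [(2, 4)]))
# → ['N', 'N', 'S', 'C', 'N']  ("Boeing Co." is the entity span)
```

This is the sense in which named-entity extraction becomes a sequence-member tagging task: the span structure is recoverable from the one-tag-per-word encoding.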