Semi-Supervised Sequence Learning Task
A Semi-Supervised Sequence Learning Task is a Sequence Learning Task that is based on a Semi-Supervised Learning Task.
- Context:
- It can be solved by a Semi-Supervised Sequence Learning System that implements a Semi-Supervised Sequence Learning Algorithm.
- Example(s):
- a Semi-Supervised Sequence Learning Task solved by pretraining with a Language Model objective (predicting the next item in a sequence) (Dai & Le, 2015).
- a Semi-Supervised Sequence Learning Task solved by pretraining with a Sequence Autoencoder objective (reconstructing the input sequence) (Dai & Le, 2015).
- Counter-Example(s):
- a Fully-Supervised Sequence Learning Task.
- See: Complex Input Classification Task, Sentiment Analysis, LSTM, Recurrent Neural Network, Associative Reinforcement Learning, Active Learning.
References
- (Dai & Le, 2015) ⇒ Andrew M. Dai, and Quoc V. Le. (2015). “Semi-supervised Sequence Learning.” In: Proceedings of the 28th International Conference on Neural Information Processing Systems - Volume 2.
- QUOTE: The first approach is to predict what comes next in a sequence, which is a language model in NLP. The second approach is to use a sequence autoencoder, which reads the input sequence into a vector and predicts the input sequence again. These two algorithms can be used as a "pretraining" algorithm for a later supervised sequence learning algorithm. In other words, the parameters obtained from the pretraining step can then be used as a starting point for other supervised training models.
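The two unsupervised objectives in the quote can be illustrated by how their training pairs are built from unlabeled sequences. The sketch below is a minimal, framework-free illustration (the function names are ours, not from Dai & Le): the language-model objective pairs each prefix with the next token, while the sequence-autoencoder objective pairs the whole sequence with itself as the reconstruction target.

```python
def language_model_pairs(seq):
    """Next-step prediction pairs: (prefix, next token) for each position.

    This mirrors the language-model pretraining objective: predict
    what comes next in a sequence.
    """
    return [(seq[: i + 1], seq[i + 1]) for i in range(len(seq) - 1)]


def autoencoder_pairs(seq):
    """Sequence-autoencoder pair: read the full sequence, reconstruct it.

    The encoder compresses the input into a vector; the decoder's
    target is the input sequence itself.
    """
    return [(seq, seq)]


tokens = ["the", "cat", "sat"]
lm = language_model_pairs(tokens)
ae = autoencoder_pairs(tokens)
# lm → [(["the"], "cat"), (["the", "cat"], "sat")]
# ae → [(["the", "cat", "sat"], ["the", "cat", "sat"])]
```

In the actual semi-supervised setup, an RNN (e.g., an LSTM) is first trained on pairs like these from unlabeled data; its learned parameters then initialize a supervised sequence classifier.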