Word-Level Sequence-to-Sequence Modeling System
A Word-Level Sequence-to-Sequence Modeling System is a sequence-to-sequence modeling system that implements a [[word-level sequence-to-sequence modeling algorithm]] to solve a [[word-level sequence-to-sequence modeling task]].
- Context:
- …
- Example(s): a [[neural machine translation system]], such as the French-to-English word-level translation system described in Robertson (2017) below.
- Counter-Example(s): a [[Character-Level Sequence-to-Sequence Modeling System]].
- See: Character/Word-Level Hybrid Language Modeling System.
References
2017
- Sean Robertson. (2017). “Practical PyTorch: Translation with a Sequence to Sequence Network and Attention.”
- QUOTE: In this project we will be teaching a neural network to translate from French to English. ... This is made possible by the simple but powerful idea of the sequence to sequence network, in which two recurrent neural networks work together to transform one sequence to another. An encoder network condenses an input sequence into a single vector, and a decoder network unfolds that vector into a new sequence.
To improve upon this model we'll use an attention mechanism, which lets the decoder learn to focus over a specific range of the input sequence.
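The following is a minimal PyTorch sketch of the encoder-decoder-with-attention idea described in the quote above. It is illustrative only: the vocabulary sizes, hidden size, and toy inputs are hypothetical, and it uses simple dot-product attention rather than the tutorial's learned attention layer.

```python
# A minimal sketch of a word-level seq2seq model with attention,
# loosely following the quoted description. Vocabulary sizes, hidden
# size, and the toy input below are hypothetical placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Condenses an input word sequence into per-step vectors plus a final state."""
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, src):                   # src: (batch, src_len) word ids
        embedded = self.embedding(src)        # (batch, src_len, hidden)
        outputs, hidden = self.gru(embedded)  # outputs keep every encoder step
        return outputs, hidden                # hidden: (1, batch, hidden)

class AttnDecoder(nn.Module):
    """Unfolds the encoder state into a new word sequence, one step at a time."""
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size * 2, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, token, hidden, encoder_outputs):  # token: (batch, 1)
        embedded = self.embedding(token)                # (batch, 1, hidden)
        # Dot-product attention (a simplification of the tutorial's learned
        # attention): score each encoder step against the decoder state.
        scores = torch.bmm(encoder_outputs,
                           hidden.transpose(0, 1).transpose(1, 2))  # (batch, src_len, 1)
        weights = F.softmax(scores, dim=1)
        context = torch.bmm(weights.transpose(1, 2), encoder_outputs)  # (batch, 1, hidden)
        output, hidden = self.gru(torch.cat([embedded, context], dim=2), hidden)
        return self.out(output.squeeze(1)), hidden      # logits over target words

# Toy usage with hypothetical vocabulary sizes.
enc, dec = Encoder(1000, 256), AttnDecoder(1200, 256)
src = torch.randint(0, 1000, (1, 7))          # one source sentence of 7 word ids
enc_out, hidden = enc(src)
token = torch.zeros(1, 1, dtype=torch.long)   # assume id 0 is the start-of-sentence token
logits, hidden = dec(token, hidden, enc_out)
print(logits.shape)                           # torch.Size([1, 1200])
```

In a full system the decoder loop would run step by step, feeding back the highest-scoring word (or the ground-truth word, under teacher forcing) until an end-of-sentence token is produced.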