SageMaker-based Neural Sequence-to-Sequence (seq2seq)-based Model Training System
A SageMaker-based Neural Sequence-to-Sequence (seq2seq)-based Model Training System is a neural seq2seq training system that is a SageMaker-based training system.
- Example(s):
- …
- Counter-Example(s):
- …
- See: AWS SageMaker.
References
2018
- https://docs.aws.amazon.com/sagemaker/latest/dg/seq-2-seq.html
- QUOTE: ... Amazon SageMaker Sequence to Sequence is a supervised learning algorithm where the input is a sequence of tokens (for example, text, audio) and the output generated is another sequence of tokens. Example applications include: machine translation (input a sentence from one language and predict what that sentence would be in another language), text summarization (input a longer string of words and predict a shorter string of words that is a summary), speech-to-text (audio clips converted into output sentences in tokens). Recently, problems in this domain have been successfully modeled with deep neural networks that show a significant performance boost over previous methodologies. Amazon SageMaker seq2seq uses Recurrent Neural Networks (RNNs) and Convolutional Neural Network (CNN) models with attention as encoder-decoder architectures. ...
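The quoted passage describes a supervised algorithm whose inputs and outputs are sequences of tokens. A minimal sketch of that input representation, using a hypothetical toy vocabulary and sentences (not SageMaker's own preprocessing scripts, which serialize integer token sequences with separate vocabulary files), might look like:

```python
# Build a toy vocabulary and convert sentences into integer token sequences,
# the kind of tokenized input/output a seq2seq learner consumes.
# All data here is hypothetical illustration, not SageMaker's actual format.

def build_vocab(sentences, specials=("<pad>", "<unk>", "<s>", "</s>")):
    """Assign an integer ID to each special token, then to each new word."""
    vocab = {tok: i for i, tok in enumerate(specials)}
    for sent in sentences:
        for word in sent.split():
            vocab.setdefault(word, len(vocab))
    return vocab

def encode(sentence, vocab):
    """Map a sentence to token IDs, framed by <s>/</s>; unseen words -> <unk>."""
    unk = vocab["<unk>"]
    return [vocab["<s>"]] + [vocab.get(w, unk) for w in sentence.split()] + [vocab["</s>"]]

source = ["the cat sat", "the dog ran"]
vocab = build_vocab(source)
print(encode("the cat ran", vocab))  # → [2, 4, 5, 8, 3]
print(encode("a cat sat", vocab))    # "a" is out-of-vocabulary → <unk> (ID 1)
```

In a full pipeline, the encoder side of the encoder-decoder model consumes such source-language ID sequences and the decoder is trained to emit the target-language ID sequences.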