Neural Network-based Language Model (LM) Training System
A Neural Network-based Language Model (LM) Training System is an LM training system that implements a neural LM training algorithm.
- Context:
- It can range from being a Neural Word-level LM System to being a Neural Character-level LM System.
- It can range from being a Shallow Neural LM System to being a Deep Neural LM System.
- …
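The word-level versus character-level distinction above comes down to tokenization granularity. A minimal illustrative sketch (function names are ours, not from the source) contrasting the two:

```python
# Illustrative sketch: the same sentence tokenized at word level vs.
# character level, the two granularities a neural LM system can range over.

def word_tokens(text):
    # Word-level: split on whitespace; the vocabulary is the set of words.
    return text.split()

def char_tokens(text):
    # Character-level: every character (including spaces) is a token,
    # giving a small vocabulary but much longer sequences.
    return list(text)

sentence = "neural language model"
print(word_tokens(sentence))       # ['neural', 'language', 'model']
print(len(char_tokens(sentence)))  # 21
```

Word-level systems need a large vocabulary (and an out-of-vocabulary strategy), while character-level systems trade vocabulary size for sequence length.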
- Example(s):
- an RNN-based LM System, such as an LSTM-based LM System.
- a Convolutional NNet-based LM System.
- a Neural Transformer-based LM System.
- a Python-based Neural LM System, such as a PyTorch-based LM Training System.
- …
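To make the "training system" part of the definition concrete, here is a minimal, hedged sketch of a neural LM training loop: a single softmax layer trained with SGD to predict the next character from the current one (a neural bigram model). It is a toy stand-in for the RNN/LSTM and Transformer systems listed above, written in plain Python; all names are illustrative and not from the source.

```python
import math
import random

def train_char_lm(text, epochs=200, lr=0.5, seed=0):
    """Train a minimal character-level neural LM: one softmax layer
    predicting the next character from the current one.
    Illustrative sketch only, not a production training system."""
    rng = random.Random(seed)
    vocab = sorted(set(text))
    idx = {c: i for i, c in enumerate(vocab)}
    V = len(vocab)
    # W[i][j]: logit for next character j given current character i.
    W = [[rng.uniform(-0.1, 0.1) for _ in range(V)] for _ in range(V)]
    pairs = [(idx[a], idx[b]) for a, b in zip(text, text[1:])]
    for _ in range(epochs):
        for i, j in pairs:
            # Forward pass: softmax over row i (max-shifted for stability).
            m = max(W[i])
            exps = [math.exp(w - m) for w in W[i]]
            Z = sum(exps)
            probs = [e / Z for e in exps]
            # Backward pass: cross-entropy gradient is probs - one_hot(j);
            # SGD update on the logits for the observed context.
            for k in range(V):
                grad = probs[k] - (1.0 if k == j else 0.0)
                W[i][k] -= lr * grad

    def next_char_prob(a, b):
        # P(next char = b | current char = a) under the trained model.
        i = idx[a]
        m = max(W[i])
        exps = [math.exp(w - m) for w in W[i]]
        return exps[idx[b]] / sum(exps)

    return next_char_prob

# On a fully predictable string, the model should learn that "b"
# almost always follows "a".
p = train_char_lm("ababababab")
print(p("a", "b"))
```

Real systems of the kinds listed above replace the single softmax layer with an LSTM, CNN, or Transformer and train with minibatched backpropagation, but the forward/backward/update loop has the same shape.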
- Counter-Example(s):
- See: Neural Language Generation System.
References
2018
- https://github.com/PetrochukM/PyTorch-NLP/tree/master/examples/awd-lstm-lm
- QUOTE: awd-lstm-lm set the state-of-the-art in word level perplexities in 2017. With PyTorch NLP, we show that in 30 minutes, we were able to reduce the footprint of this repository by 4 files (185 lines of code). We employ the use of the datasets package, IdentityEncoder module, BPTTBatchSampler module, LockedDropout module and WeightDrop module.