Neural Transformer-based Network Training System
A Neural Transformer-based Network Training System is a neural network training system that implements a neural transformer training algorithm to solve a neural transformer training task (to train a neural transformer model).
- Context:
- It can range from being a Uni-Directional Transformer Training System to being a Bi-Directional Transformer Training System (see the sketch below).
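A minimal sketch of the two ends of that range, using the huggingface/transformers library discussed under References below; the specific checkpoints (gpt2, bert-base-uncased) are illustrative assumptions, not prescribed by this definition:
```python
from transformers import AutoModelForCausalLM, AutoModelForMaskedLM

# Uni-directional (causal) transformer: each token attends only to
# earlier positions, as in GPT-2.
causal_model = AutoModelForCausalLM.from_pretrained("gpt2")

# Bi-directional (masked) transformer: each token attends to context
# on both sides, as in BERT.
masked_model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
```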
- Example(s):
- a Neural Transformer Language Model Training System.
- one based on Trax [1].
- one based on huggingface/transformers [2] (see the sketch after this list).
- one based on tensorflow/tensor2tensor [3] (defunct).
- …
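As an illustration of what such a training system does, here is a minimal sketch of a supervised fine-tuning loop built on huggingface/transformers with PyTorch; the checkpoint name, toy dataset, and hyperparameters are illustrative assumptions, not part of any particular system:
```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased")

# Hypothetical toy dataset of (text, label) pairs, for illustration only.
examples = [("a great movie", 1), ("a dull movie", 0)]

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
for epoch in range(3):              # illustrative epoch count
    for text, label in examples:
        batch = tokenizer(text, return_tensors="pt")
        output = model(**batch, labels=torch.tensor([label]))
        output.loss.backward()      # loss computed by the classification head
        optimizer.step()
        optimizer.zero_grad()
```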
- Counter-Example(s):
- See: TensorFlow RNN.
References
2020
- https://github.com/huggingface/transformers
- QUOTE: Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet, CTRL...) for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with over 32+ pretrained models in 100+ languages and deep interoperability between TensorFlow 2.0 and PyTorch.
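A minimal sketch of the interoperability claim in the quote, assuming both PyTorch and TensorFlow 2.0 are installed alongside transformers; the checkpoint name is an illustrative choice:
```python
from transformers import BertModel, TFBertModel

# The same pretrained checkpoint can be loaded as a PyTorch module...
pt_model = BertModel.from_pretrained("bert-base-uncased")
# ...or as a TensorFlow 2.0 (Keras) model.
tf_model = TFBertModel.from_pretrained("bert-base-uncased")
```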