Neural Transformer-based Language Model (LM) Training System
A Neural Transformer-based Language Model (LM) Training System is a neural LM training system that is also a transformer model training system, i.e., one that implements a transformer-based LM training algorithm (to produce a transformer-based LM).
- Example(s):
- a BERT LM Training System,
- a GPT-2 Transformer-based LM Training System [1],
- a SensEmBERT System,
- an ARES System,
- a DeBERTa System [2],
- a RoBERTa System,
- an OpenAI GPT System,
- Liu's Multi-Task Deep Neural Network NLU System (Liu et al., 2019),
- …
- Counter-Example(s):
- a GloVe System (a non-contextual word embedding system),
- an ESIM System (an LSTM-based system),
- an ELMo System (an LSTM-based contextual embedding system),
- a Semi-supervised Sequence Learning System (LSTM-based),
- a Universal Language Model Fine-tuning for Text Classification (ULMFiT) System (LSTM-based).
- See: Transformer-based LM Training Algorithm.
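The core computation such a training system optimizes can be sketched in a toy forward pass: token embeddings flow through causally masked self-attention, and the model is trained with a next-token cross-entropy loss. The following is a minimal illustrative sketch in plain numpy (all parameter names, sizes, and the toy sequence are assumptions for illustration, not any specific system's implementation; gradient updates are omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

vocab_size, d_model, seq_len = 10, 16, 5
tokens = np.array([1, 3, 5, 7, 9])  # toy input token-id sequence (assumed)

# Parameters a training system would learn (randomly initialized here).
E  = rng.normal(0, 0.02, (vocab_size, d_model))   # embedding table
Wq = rng.normal(0, 0.02, (d_model, d_model))      # query projection
Wk = rng.normal(0, 0.02, (d_model, d_model))      # key projection
Wv = rng.normal(0, 0.02, (d_model, d_model))      # value projection
Wo = rng.normal(0, 0.02, (d_model, vocab_size))   # output (unembedding) projection

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

x = E[tokens]                        # (seq_len, d_model) token embeddings
q, k, v = x @ Wq, x @ Wk, x @ Wv

# Causal mask: position i may only attend to positions <= i,
# so each position predicts the *next* token without seeing it.
scores = q @ k.T / np.sqrt(d_model)
mask = np.triu(np.ones((seq_len, seq_len), dtype=bool), k=1)
scores[mask] = -np.inf
attn = softmax(scores)               # (seq_len, seq_len), rows sum to 1

logits = (attn @ v) @ Wo             # (seq_len, vocab_size)
probs = softmax(logits)

# Next-token objective: position i is trained to predict token i+1.
targets = tokens[1:]
loss = -np.log(probs[np.arange(seq_len - 1), targets]).mean()
print(f"cross-entropy loss: {loss:.4f}")
```

With near-zero initial weights the predicted distribution is close to uniform, so the loss starts near ln(vocab_size); a training system would then backpropagate this loss to update E, Wq, Wk, Wv, and Wo over a large corpus.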