GPT (Generative Pre-trained Transformer) Architecture
A GPT (Generative Pre-trained Transformer) Architecture is a transformer-based, decoder-only, autoregressive sequential data model architecture, in which each token is predicted conditioned only on the tokens that precede it.
- Context:
- Example(s):
- as defined in ...
- …
- Counter-Example(s):
- See: LLM Architecture.
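The decoder-only, autoregressive property above can be sketched with the causal (masked) self-attention operation at the core of each GPT block: a mask blocks attention from any position to later positions, so earlier outputs never depend on future tokens. This is a minimal single-head NumPy illustration with hypothetical toy shapes, not any particular GPT implementation:

```python
import numpy as np

def causal_self_attention(x, w_q, w_k, w_v):
    """Single-head masked self-attention: position i attends only to j <= i.
    x: (seq_len, d_model); w_q, w_k, w_v: (d_model, d_head).
    Toy sketch for illustration, not a production implementation."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_head = q.shape[-1]
    scores = (q @ k.T) / np.sqrt(d_head)           # (seq_len, seq_len) similarity
    mask = np.triu(np.ones_like(scores), k=1)      # 1s strictly above the diagonal
    scores = np.where(mask == 1, -np.inf, scores)  # block attention to future tokens
    # Numerically stable softmax over each row
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Because of the causal mask, perturbing a later token leaves the outputs at all earlier positions unchanged, which is what makes token-by-token autoregressive generation consistent with training.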