Encoder-Only Transformer Model
An Encoder-Only Transformer Model is a transformer model that consists solely of the encoder stack of the transformer architecture.
- Context:
- It can (typically) be responsible for encoding Input Sequences into Continuous Representations.
- …
- Example(s):
- an Encoder-Only Transformer-Based Language Model, such as: BERT Model, RoBERTa Model, ALBERT Model, XLM Model.
- …
- Counter-Example(s):
- a Decoder-Only Transformer Model, such as a GPT Model.
- an Encoder/Decoder Transformer Model.
- See: Encoder/Decoder Transformer Model.
References
2023
- chat
- An Encoder-Only Transformer Model consists solely of an encoder architecture. This model is responsible for encoding input sequences into continuous representations, which can be used for various NLP tasks, including text classification, sentiment analysis, and named entity recognition. A well-known example of an Encoder-Only Transformer Model is the BERT (Bidirectional Encoder Representations from Transformers) model, developed by Google AI.
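The encoding step described above can be sketched as a single bidirectional self-attention layer, the core operation of each encoder block. This is a minimal NumPy illustration with made-up dimensions and random weights, not the full BERT architecture (which adds multiple attention heads, feed-forward sublayers, residual connections, and layer normalization):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model) -- one embedding vector per input token.
    # Each output row is a continuous representation of one token,
    # built by attending over ALL tokens (no causal mask).
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # scaled dot-product scores
    return softmax(scores) @ V                # attention-weighted mix of values

rng = np.random.default_rng(0)
d = 8                                         # toy model dimension
X = rng.normal(size=(5, d))                   # 5 embedded input tokens
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
reps = self_attention(X, Wq, Wk, Wv)          # (5, 8): one vector per token
print(reps.shape)
```

The absence of a causal mask is what makes this an encoder-style layer: every token's representation can draw on the whole sequence in both directions, which is what "Bidirectional" refers to in BERT's name. A decoder layer would mask the scores so each position attends only to earlier positions.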