Pre-Trained General-Purpose Neural Model
A Pre-Trained General-Purpose Neural Model is a pre-trained neural model that is a general-purpose neural model.
- Context:
- It can (typically) support multiple tasks across various domains without task-specific training.
- It can range from being a Base Pre-Trained General-Purpose Neural Model to being a Fine-Tuned Pre-Trained General-Purpose Neural Model.
- It can range from being a Pre-Trained General-Purpose Neural Encoder-Only Model to being a Pre-Trained General-Purpose Neural Decoder-Only Model to being a Pre-Trained General-Purpose Neural Encoder-Decoder Model.
- It can range from being a Single Datatype Pre-Trained General-Purpose Neural Model to being a Multimodal Pre-Trained General-Purpose Neural Model.
- ...
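The encoder-only versus decoder-only distinction in the context above comes down to the attention mask each architecture uses. The following is a minimal numpy sketch of that idea (toy masks only, not a real model): encoder-only models such as BERT let every token attend bidirectionally, while decoder-only models such as GPT-3 restrict each token to earlier positions via a causal mask.

```python
import numpy as np

def attention_mask(seq_len: int, causal: bool) -> np.ndarray:
    """Return a seq_len x seq_len attention mask.

    Encoder-only models (e.g. BERT) use a bidirectional mask:
    every token may attend to every other token.
    Decoder-only models (e.g. GPT-3) use a causal mask:
    token i may attend only to tokens 0..i.
    """
    mask = np.ones((seq_len, seq_len))
    if causal:
        mask = np.tril(mask)  # zero out attention to future positions
    return mask

encoder_mask = attention_mask(4, causal=False)  # all ones: bidirectional
decoder_mask = attention_mask(4, causal=True)   # lower triangular: causal
```

An encoder-decoder model combines both: a bidirectional mask over the input sequence and a causal mask over the generated output.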
- Example(s):
- Base Encoder-Only Text Models, such as:
- BERT, RoBERTa, ALBERT, and DistilBERT.
- ...
- Pre-Trained General-Purpose LLMs, such as:
- Foundational Base LLM, such as GPT-3.
- Foundational Instruct LLM, such as a LLaMA Instruct model.
- ...
- Foundational Computer Vision Models, such as:
- Foundational Basic CV Model, such as: ResNet, a convolutional neural network model for image recognition.
- Foundational Object Detection Model, such as: YOLO, a real-time object detection system.
- ...
- Counter-Example(s):
- Task-Specific Neural Models, which are designed for specific tasks without the versatility of general-purpose models.
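The contrast with task-specific models can be sketched in a few lines: a general-purpose model's pre-trained representation is computed once and reused by several lightweight task heads, instead of training a separate model per task. This is a conceptual toy in numpy; the "pre-trained" weights below are random stand-ins, not real learned values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained general-purpose encoder: a frozen
# projection whose weights would come from prior pre-training.
# (Random toy weights here, purely for illustration.)
W_pretrained = rng.normal(size=(8, 4))

def encode(x: np.ndarray) -> np.ndarray:
    """Map raw 8-d inputs to 4-d general-purpose features."""
    return np.tanh(x @ W_pretrained)

# Separate lightweight task heads reuse the same frozen encoder,
# so no task-specific retraining of the encoder is needed:
W_sentiment = rng.normal(size=(4, 2))      # e.g. a 2-class sentiment head
W_topic = rng.normal(size=(4, 5))          # e.g. a 5-class topic head

x = rng.normal(size=(3, 8))                # a batch of 3 toy inputs
features = encode(x)                       # computed once, shared across tasks
sentiment_logits = features @ W_sentiment  # shape (3, 2)
topic_logits = features @ W_topic          # shape (3, 5)
```

A task-specific model, by contrast, would train its own full network end-to-end for one of these tasks and could not serve the other.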
- See: Large Language Models, Multimodal Models, Computer Vision Models