Human Annotation Process Model
A Human Annotation Process Model is an annotation process model in which human annotators perform manual annotation tasks.
- Context:
- It can (typically) involve Annotation Guidelines Creation and Annotator Training.
- It can (often) require Quality Control Measures and Inter-Annotator Agreement (see the agreement sketch after this list).
- ...
- It can range from being a Simple Annotation Workflow (e.g., binary labeling) to being a Complex Annotation Workflow (e.g., multi-stage annotation).
- It can range from being a Single-Annotator Workflow to being a Multi-Annotator Workflow (see the aggregation sketch after this list).
- ...
- It can be supported by an Annotation Management System.
- It can include Annotation Quality Assessment.
- It can involve Annotator Performance Monitoring.
- ...
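A common quality-control step is to quantify Inter-Annotator Agreement on a shared subset of items, for example with Cohen's kappa, which corrects raw agreement for chance. The following is a minimal sketch assuming two annotators assigning categorical labels to the same items; the function name and the sample labels are illustrative, not taken from this page:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labeling the same items."""
    assert len(labels_a) == len(labels_b), "annotators must label the same items"
    n = len(labels_a)
    # Observed agreement: fraction of items where both annotators agree.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    freq_a = Counter(labels_a)
    freq_b = Counter(labels_b)
    # Expected agreement if each annotator labeled at random according
    # to their own marginal label distribution.
    expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in freq_a)
    if expected == 1.0:
        return 1.0  # both annotators used one identical label throughout
    return (observed - expected) / (1 - expected)

# Example: binary content-appropriateness labels from two annotators.
a = ["ok", "ok", "bad", "ok", "bad", "ok"]
b = ["ok", "bad", "bad", "ok", "bad", "ok"]
print(f"kappa = {cohens_kappa(a, b):.3f}")  # kappa = 0.667
```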
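In a Multi-Annotator Workflow, the per-item labels must also be aggregated into a single adjudicated label. A minimal sketch, assuming simple majority voting with ties routed to a later adjudication stage; the input data layout and the `UNRESOLVED` marker are assumptions for illustration:

```python
from collections import Counter

def aggregate(item_labels):
    """Map item_id -> list of annotator labels to item_id -> final label."""
    final = {}
    for item_id, labels in item_labels.items():
        counts = Counter(labels).most_common()
        if len(counts) > 1 and counts[0][1] == counts[1][1]:
            # Tie between the top labels: flag for expert adjudication.
            final[item_id] = "UNRESOLVED"
        else:
            final[item_id] = counts[0][0]
    return final

# Three annotators labeling two items with entity types.
votes = {
    "doc-1": ["PERSON", "PERSON", "ORG"],
    "doc-2": ["ORG", "LOC", "PERSON"],   # three-way tie -> adjudication
}
print(aggregate(votes))  # {'doc-1': 'PERSON', 'doc-2': 'UNRESOLVED'}
```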
- Example(s):
- Text Corpus Annotation Workflows, which create labeled datasets.
- Content Moderation Workflows, which assess content appropriateness.
- Dataset Labeling Workflows, which prepare training data.
- Document Classification Workflows, which organize text collections.
- Entity Annotation Workflows, which mark named entities.
- ...
- Counter-Example(s):
- Automated Annotation Workflows, which use algorithms rather than human annotators.
- Machine Learning Pipelines, which process data automatically.
- Data Collection Workflows, which gather raw data.
- Data Processing Workflows, which transform data.
- See: Annotation Management, Human-in-the-Loop Process, Data Labeling System.