Model Distillation Method


A Model Distillation Method is a machine learning transfer method that trains a smaller student model to reproduce the knowledge and capabilities of a larger teacher model while preserving key performance characteristics.
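
As a minimal sketch of one common instance of this idea, the example below implements the classic soft-target distillation loss of Hinton et al. (2015), in which the student is trained to match the teacher's temperature-softened output distribution in addition to the ground-truth labels. The PyTorch code, the function name, and the hyperparameter values (temperature T, mixing weight alpha) are illustrative assumptions, not details from this page.

```python
# Illustrative knowledge-distillation loss sketch (hypothetical names and
# hyperparameters; not a definitive implementation of any specific method).
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend a soft-target loss (match the teacher's temperature-scaled
    distribution) with the usual hard-label cross-entropy."""
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage inside the student's training loop: the teacher runs in eval mode
# with gradients disabled, e.g.
#   with torch.no_grad():
#       teacher_logits = teacher(x)
#   loss = distillation_loss(student(x), teacher_logits, y)
```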