Distilled Large Language Model

A Distilled Large Language Model is a neural language model that is produced via knowledge distillation, in which the knowledge and capabilities of a larger teacher model are transferred to a smaller, more efficient student model.
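
The following is a minimal sketch, not taken from this entry, of the standard knowledge-distillation training objective (soft-target distillation in the style of Hinton et al., 2015) using PyTorch; the function name `distillation_loss` and the `temperature` and `alpha` hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, target_ids,
                      temperature=2.0, alpha=0.5):
    """Blend a hard-label loss with a soft-label loss from the teacher.

    student_logits, teacher_logits: (batch, vocab_size) next-token logits.
    target_ids: (batch,) gold next-token ids.
    temperature, alpha: illustrative hyperparameters (assumed values).
    """
    # Hard-label term: ordinary next-token cross-entropy against the gold targets.
    ce = F.cross_entropy(student_logits, target_ids)

    # Soft-label term: KL divergence between the teacher's and the student's
    # temperature-softened next-token distributions (scaled by T^2 so gradient
    # magnitudes stay comparable across temperatures).
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    kl = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2

    # alpha controls how much weight the teacher's soft targets receive.
    return (1 - alpha) * ce + alpha * kl
```

In this sketch the student model is trained on the combined loss, so it learns both from the ground-truth tokens and from the teacher's full output distribution, which is what allows a smaller model to approximate the larger model's behavior.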