Knowledge distillation
Knowledge distillation (or teacher-student learning) is a compression technique in which a small model is trained to reproduce the behavior of a larger model (or an ensemble of models).
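
A minimal sketch of the usual distillation objective, assuming a PyTorch setting: the student is trained on a mix of "soft targets" (the teacher's temperature-softened output distribution) and the ordinary hard labels, following Hinton et al.'s formulation. The function name, temperature T, and mixing weight alpha below are illustrative choices, not part of any particular library.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Hypothetical helper: blend a soft-target KL term with hard-label cross-entropy."""
    # Soft targets: KL divergence between the temperature-softened
    # teacher and student distributions.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradients match the hard-loss magnitude
    # Hard targets: standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

In practice the teacher's logits are computed with gradients disabled, and only the student's parameters are updated with this combined loss.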