Knowledge distillation
14 Documents
[1910.01348] On the Efficacy of Knowledge Distillation (2020-06-06)
[1804.03235] Large scale distributed neural network training through online distillation (2020-06-06)
[1511.03643] Unifying distillation and privileged information (2020-05-31)
[1907.04829] BAM! Born-Again Multi-Task Networks for Natural Language Understanding (2020-05-12)
[1912.08422] Distilling Structured Knowledge into Embeddings for Explainable and Accurate Recommendation (2020-05-12)
[1706.00384] Deep Mutual Learning (2020-05-11)
Knowledge Distillation - Neural Network Distiller (2020-04-22)
Turning Up the Heat: The Mechanics of Model Distillation (2020-04-22)
[1503.02531] Distilling the Knowledge in a Neural Network (2020-04-16)
[2002.02925] BERT-of-Theseus: Compressing BERT by Progressive Module Replacing (2020-02-10)
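The common thread in this list is the soft-label distillation objective from [1503.02531]: train a student on a mix of temperature-softened teacher probabilities and the usual hard-label cross-entropy. As a minimal illustrative sketch (not taken from any one paper above; the temperature `T=4.0` and mixing weight `alpha=0.5` are arbitrary example values):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label, T=4.0, alpha=0.5):
    """Soft-label distillation loss in the style of Hinton et al.:
    alpha * T^2 * KL(teacher || student) at temperature T,
    plus (1 - alpha) * cross-entropy on the hard label.
    The T**2 factor keeps soft-target gradients on roughly the same
    scale as the hard-label term as T varies."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_teacher, p_student))
    ce = -math.log(softmax(student_logits)[hard_label])
    return alpha * (T ** 2) * kl + (1 - alpha) * ce
```

In practice this is computed over batches with framework ops (e.g. a KL-divergence loss on log-softmax outputs), but the scalar version above captures the objective the papers in this list extend, whether by online/mutual training ([1804.03235], [1706.00384]) or multi-task born-again setups ([1907.04829]).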