Common descendants of: Knowledge distillation; Pre-Trained Language Models AND Yves Peirsman