Tailoring Instructions to Student’s Learning Levels Boosts Knowledge Distillation

Yuxin Ren, Zihan Zhong, Xingjian Shi, Yi Zhu, Chun Yuan, Mu Li


Abstract
It has been commonly observed that a teacher model with superior performance does not necessarily result in a stronger student, highlighting a discrepancy between current teacher training practices and effective knowledge transfer. To better guide the teacher training process, we introduce the concept of distillation influence, which measures how distillation from each training sample affects the student’s generalization ability. In this paper, we propose Learning Good Teacher Matters (LGTM), an efficient training technique for incorporating distillation influence into the teacher’s learning process. By prioritizing samples that are likely to enhance the student’s generalization ability, LGTM outperforms 10 common knowledge distillation baselines on 6 text classification tasks in the GLUE benchmark.
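The sketch below illustrates only the core re-weighting idea from the abstract: scale each training example’s distillation loss by an estimated “distillation influence” score, approximated here as the alignment between that example’s distillation gradient and the student’s loss gradient on a held-out batch. The influence proxy, function names, and PyTorch framing are illustrative assumptions for exposition; the paper itself incorporates this signal into the teacher’s training objective rather than the simplified student-side weighting shown here.

```python
# Hypothetical sketch of influence-weighted distillation (not the authors' code).
# Assumption: "distillation influence" is approximated by a first-order proxy,
# the dot product between each sample's distillation-loss gradient and the
# student's loss gradient on a held-out validation batch.
import torch
import torch.nn.functional as F


def distillation_influence_weights(student, kd_per_sample, val_loss):
    """Estimate per-sample influence via gradient alignment with a held-out loss."""
    params = [p for p in student.parameters() if p.requires_grad]
    val_grads = torch.autograd.grad(val_loss, params, retain_graph=True)

    scores = []
    for loss_i in kd_per_sample:
        grads_i = torch.autograd.grad(loss_i, params, retain_graph=True)
        scores.append(sum((g * v).sum() for g, v in zip(grads_i, val_grads)))

    # Turn raw alignment scores into non-negative weights that sum to the batch size.
    scores = torch.stack(scores)
    return torch.softmax(scores, dim=0) * len(scores)


def influence_weighted_kd_loss(teacher, student, inputs, val_inputs, val_labels,
                               temperature=2.0):
    """Distillation loss where samples expected to help generalization weigh more."""
    with torch.no_grad():
        t_logits = teacher(inputs)
    s_logits = student(inputs)

    # Per-sample KL divergence between teacher and student soft labels.
    kd_per_sample = F.kl_div(
        F.log_softmax(s_logits / temperature, dim=-1),
        F.softmax(t_logits / temperature, dim=-1),
        reduction="none",
    ).sum(dim=-1) * temperature ** 2

    # Student loss on a held-out batch, used as the generalization signal.
    val_loss = F.cross_entropy(student(val_inputs), val_labels)

    weights = distillation_influence_weights(student, kd_per_sample, val_loss)
    return (weights.detach() * kd_per_sample).mean()
```

In this simplified form, samples whose distillation gradients point in the same direction as the held-out loss gradient receive larger weights, which mirrors the abstract’s goal of prioritizing samples likely to improve the student’s generalization.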
Anthology ID:
2023.acl-long.111
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1990–2006
URL:
https://aclanthology.org/2023.acl-long.111
DOI:
10.18653/v1/2023.acl-long.111
Cite (ACL):
Yuxin Ren, Zihan Zhong, Xingjian Shi, Yi Zhu, Chun Yuan, and Mu Li. 2023. Tailoring Instructions to Student’s Learning Levels Boosts Knowledge Distillation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1990–2006, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Tailoring Instructions to Student’s Learning Levels Boosts Knowledge Distillation (Ren et al., ACL 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.acl-long.111.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2023.acl-long.111.mp4