Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection

Chenglong Wang, Yi Lu, Yongyu Mu, Yimin Hu, Tong Xiao, Jingbo Zhu


Abstract
Knowledge distillation addresses the problem of transferring knowledge from a teacher model to a student model. In this process, we typically have multiple types of knowledge extracted from the teacher model. The problem is how to make full use of them to train the student model. Our preliminary study shows that (1) not all of the knowledge is necessary for learning a good student model, and (2) knowledge distillation can benefit from certain knowledge at different training steps. In response to these findings, we propose an actor-critic approach to selecting appropriate knowledge to transfer during the process of knowledge distillation. In addition, we offer a refinement of the training algorithm to ease the computational burden. Experimental results on the GLUE datasets show that our method significantly outperforms several strong knowledge distillation baselines.
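To make the idea of knowledge selection concrete, below is a minimal sketch (not the authors' implementation) of distillation in which the teacher exposes two kinds of knowledge, output logits and an intermediate hidden state, and a small "actor" network decides how much of each to transfer at the current training step. The module names, sizes, and the softmax-based selection policy are illustrative assumptions rather than details taken from the paper.

# Minimal sketch of knowledge distillation with per-step knowledge selection.
# Assumption: the teacher is a frozen, fine-tuned model; the actor's input and
# architecture are hypothetical placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyEncoder(nn.Module):
    """Stand-in for a pre-trained LM: returns a hidden state and classification logits."""
    def __init__(self, dim: int, num_labels: int):
        super().__init__()
        self.body = nn.Linear(dim, dim)
        self.head = nn.Linear(dim, num_labels)

    def forward(self, x):
        hidden = torch.tanh(self.body(x))
        return hidden, self.head(hidden)

dim, num_labels, temperature = 32, 3, 2.0
teacher = TinyEncoder(dim, num_labels)   # assumed frozen and already fine-tuned
student = TinyEncoder(dim, num_labels)
# "Actor": maps a summary of the training state (here, the mean student hidden state)
# to selection weights over the two knowledge types.
actor = nn.Sequential(nn.Linear(dim, 2), nn.Softmax(dim=-1))

optimizer = torch.optim.Adam(list(student.parameters()) + list(actor.parameters()), lr=1e-3)

x = torch.randn(8, dim)                  # toy batch of input features
y = torch.randint(0, num_labels, (8,))   # gold labels

with torch.no_grad():
    t_hidden, t_logits = teacher(x)
s_hidden, s_logits = student(x)

# Two candidate knowledge losses extracted from the teacher.
logit_loss = F.kl_div(
    F.log_softmax(s_logits / temperature, dim=-1),
    F.softmax(t_logits / temperature, dim=-1),
    reduction="batchmean",
) * temperature ** 2
hidden_loss = F.mse_loss(s_hidden, t_hidden)

# The actor selects how to weight each knowledge type at this training step.
weights = actor(s_hidden.mean(dim=0))
distill_loss = weights[0] * logit_loss + weights[1] * hidden_loss
loss = F.cross_entropy(s_logits, y) + distill_loss

optimizer.zero_grad()
loss.backward()
optimizer.step()

In the paper's full method the selector is trained with an actor-critic objective rather than jointly by the distillation loss as above; this sketch only illustrates how selection weights can gate multiple knowledge sources within a single training step.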
Anthology ID:
2022.findings-emnlp.464
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6232–6244
URL:
https://aclanthology.org/2022.findings-emnlp.464
DOI:
10.18653/v1/2022.findings-emnlp.464
Cite (ACL):
Chenglong Wang, Yi Lu, Yongyu Mu, Yimin Hu, Tong Xiao, and Jingbo Zhu. 2022. Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 6232–6244, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Improved Knowledge Distillation for Pre-trained Language Models via Knowledge Selection (Wang et al., Findings 2022)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2022.findings-emnlp.464.pdf
Video:
https://preview.aclanthology.org/ingest-2024-clasp/2022.findings-emnlp.464.mp4