RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation

Md Akmal Haidar, Nithin Anchuri, Mehdi Rezagholizadeh, Abbas Ghaddar, Philippe Langlais, Pascal Poupart


Abstract
Intermediate layer knowledge distillation (KD) can improve on the standard KD technique (which only targets the outputs of the teacher and student models), especially for large pre-trained language models. However, intermediate layer distillation suffers from excessive computational burdens and the engineering effort required to set up a proper layer mapping. To address these problems, we propose a RAndom Intermediate Layer Knowledge Distillation (RAIL-KD) approach in which intermediate layers of the teacher model are selected randomly to be distilled into the intermediate layers of the student model. This randomized selection ensures that all teacher layers are taken into account during training, while reducing the computational cost of intermediate layer distillation. We also show that it acts as a regularizer, improving the generalizability of the student model. We perform extensive experiments on GLUE tasks as well as on out-of-domain test sets, and show that our proposed RAIL-KD approach considerably outperforms other state-of-the-art intermediate layer KD methods in both performance and training time.
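The following is a minimal sketch of the random intermediate-layer mapping idea described in the abstract: at each training step, as many teacher layers as there are student layers are sampled at random (in order) and matched one-to-one to the student layers with a hidden-state matching loss. The specific layer counts, the linear projection, the normalization, and the use of an MSE objective are illustrative assumptions here, not the authors' exact implementation.

import random
import torch
import torch.nn as nn
import torch.nn.functional as F

def rail_kd_loss(teacher_hidden, student_hidden, proj=None):
    """teacher_hidden: list of [batch, seq, d_t] tensors, one per teacher layer.
    student_hidden: list of [batch, seq, d_s] tensors, one per student layer.
    Randomly selects len(student_hidden) teacher layers (order preserved) and
    distills each into the corresponding student layer."""
    k = len(student_hidden)
    picked = sorted(random.sample(range(len(teacher_hidden)), k))
    loss = 0.0
    for s_h, t_idx in zip(student_hidden, picked):
        t_h = teacher_hidden[t_idx]
        if proj is not None:
            # project the student hidden size up to the teacher hidden size
            s_h = proj(s_h)
        # normalize so the loss is insensitive to layer-wise scale differences
        loss = loss + F.mse_loss(F.normalize(s_h, dim=-1),
                                 F.normalize(t_h, dim=-1))
    return loss / k

# Toy usage: a 12-layer teacher distilled into a 4-layer student.
batch, seq, d_t, d_s = 8, 16, 768, 312
teacher_hidden = [torch.randn(batch, seq, d_t) for _ in range(12)]
student_hidden = [torch.randn(batch, seq, d_s, requires_grad=True) for _ in range(4)]
proj = nn.Linear(d_s, d_t)
loss = rail_kd_loss(teacher_hidden, student_hidden, proj)
loss.backward()

Because a fresh subset of teacher layers is drawn at every step, all teacher layers eventually contribute to training without the cost of distilling every layer at once, which is the regularization and efficiency argument made in the abstract.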
Anthology ID:
2022.findings-naacl.103
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1389–1400
URL:
https://aclanthology.org/2022.findings-naacl.103
DOI:
10.18653/v1/2022.findings-naacl.103
Cite (ACL):
Md Akmal Haidar, Nithin Anchuri, Mehdi Rezagholizadeh, Abbas Ghaddar, Philippe Langlais, and Pascal Poupart. 2022. RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1389–1400, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation (Haidar et al., Findings 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2022.findings-naacl.103.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/2022.findings-naacl.103.mp4
Data
GLUE, IMDb Movie Reviews, PAWS, QNLI