Knowledge Distillation Transfer Sets and their Impact on Downstream NLU Tasks
Charith Peris, Lizhen Tan, Thomas Gueudre, Turan Gojayev, Pan Wei, Gokmen Oz
Abstract
Teacher-student knowledge distillation is a popular technique for compressing today’s prevailing large language models into manageable sizes that fit low-latency downstream applications. Both the teacher and the choice of transfer set used for distillation are crucial ingredients in creating a high-quality student. Yet, the generic corpora used to pretrain the teacher and the corpora associated with the downstream target domain are often significantly different, which raises a natural question: should the student be distilled over the generic corpora, so as to learn from high-quality teacher predictions, or over the downstream task corpora to align with finetuning? Our study investigates this trade-off using Domain Classification (DC) and Intent Classification/Named Entity Recognition (ICNER) as downstream tasks. We distill several multilingual students from a larger multilingual LM with varying proportions of generic and task-specific datasets, and report their performance after finetuning on DC and ICNER. We observe significant improvements across tasks and test sets when only task-specific corpora are used. We also report on how the impact of adding task-specific data to the transfer set correlates with the similarity between generic and task-specific data. Our results clearly indicate that, while distillation from a generic LM benefits downstream tasks, students learn better using target domain data even if it comes at the price of noisier teacher predictions. In other words, target domain data still trumps teacher knowledge.
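The central variable in the abstract is the composition of the distillation transfer set, ranging from fully generic to fully task-specific. The sketch below is a minimal illustration of that setup, not the authors' implementation: `distillation_loss` is standard temperature-softened soft-target distillation, while `mix_transfer_set` is a hypothetical helper whose name, mixing scheme, and temperature value are assumptions made here for illustration.

```python
# Minimal sketch of soft-target distillation over a mixed transfer set.
# NOT the paper's implementation: the mixing helper, ratio handling, and
# temperature are illustrative assumptions.
import random
import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between temperature-softened teacher and student
    distributions (standard soft-target KD; the temperature is assumed)."""
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2


def mix_transfer_set(generic_corpus, task_corpus, task_fraction):
    """Hypothetical helper: sample a transfer set whose composition ranges from
    fully generic (task_fraction=0.0) to fully task-specific (task_fraction=1.0)."""
    n_total = len(generic_corpus) + len(task_corpus)
    n_task = int(task_fraction * n_total)
    sample = (random.choices(task_corpus, k=n_task)
              + random.choices(generic_corpus, k=n_total - n_task))
    random.shuffle(sample)
    return sample


if __name__ == "__main__":
    # Toy tensors standing in for teacher/student outputs on one batch.
    teacher_logits = torch.randn(8, 100)
    student_logits = torch.randn(8, 100, requires_grad=True)
    print(distillation_loss(student_logits, teacher_logits).item())

    generic = ["the weather today is", "stock markets rose on"]   # pretraining-style text
    task = ["play jazz music", "set an alarm for seven"]          # assistant-style utterances
    print(mix_transfer_set(generic, task, task_fraction=0.75))
```

In this framing, the paper's headline result corresponds to pushing the task-specific fraction of the transfer set toward 1.0: the students finetune better on DC and ICNER even though the teacher's predictions on that in-domain data are noisier than on its generic pretraining-style corpora.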
- Anthology ID: 2022.emnlp-industry.12
- Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track
- Month: December
- Year: 2022
- Address: Abu Dhabi, UAE
- Editors: Yunyao Li, Angeliki Lazaridou
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 128–137
- URL: https://aclanthology.org/2022.emnlp-industry.12
- DOI: 10.18653/v1/2022.emnlp-industry.12
- Cite (ACL): Charith Peris, Lizhen Tan, Thomas Gueudre, Turan Gojayev, Pan Wei, and Gokmen Oz. 2022. Knowledge Distillation Transfer Sets and their Impact on Downstream NLU Tasks. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 128–137, Abu Dhabi, UAE. Association for Computational Linguistics.
- Cite (Informal): Knowledge Distillation Transfer Sets and their Impact on Downstream NLU Tasks (Peris et al., EMNLP 2022)
- PDF: https://preview.aclanthology.org/nschneid-patch-5/2022.emnlp-industry.12.pdf