AdapterDistillation: Non-Destructive Task Composition with Knowledge Distillation

Junjie Wang, Yicheng Chen, Wangshu Zhang, Sen Hu, Teng Xu, Jing Zheng


Abstract
Leveraging knowledge from multiple tasks by introducing a small number of task-specific parameters, known as adapters, into each transformer layer has received much attention recently. However, adding an extra fusion layer to implement knowledge composition not only increases the inference time but also is non-scalable for some applications. To avoid these issues, we propose a two-stage knowledge distillation algorithm called AdapterDistillation. In the first stage, we extract task-specific knowledge by using local data to train a student adapter. In the second stage, we distill the knowledge from the existing teacher adapters into the student adapter to help its inference. Extensive experiments on frequently asked question retrieval in task-oriented dialog systems validate the efficiency of AdapterDistillation. We show that AdapterDistillation outperforms existing algorithms in terms of accuracy, resource consumption and inference time.
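The abstract only outlines the two-stage recipe at a high level; the sketch below illustrates one way such a procedure could look in PyTorch. The adapter shape, optimizer, loss weight alpha, and the MSE distillation target on adapter outputs are illustrative assumptions, not the paper's actual implementation.

```python
# Hedged sketch of a two-stage adapter training/distillation loop (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F


class Adapter(nn.Module):
    """Bottleneck adapter applied to a transformer layer's hidden states (hypothetical sizes)."""
    def __init__(self, hidden=768, bottleneck=64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)

    def forward(self, h):
        # Residual bottleneck: h + up(relu(down(h)))
        return h + self.up(F.relu(self.down(h)))


def stage1_task_training(student, classifier, loader, epochs=3, lr=1e-4):
    """Stage 1: train the student adapter on local task data (task-specific knowledge)."""
    opt = torch.optim.AdamW(list(student.parameters()) + list(classifier.parameters()), lr=lr)
    for _ in range(epochs):
        for h, y in loader:  # h: frozen-backbone hidden states, y: task labels
            loss = F.cross_entropy(classifier(student(h)), y)
            opt.zero_grad(); loss.backward(); opt.step()


def stage2_distillation(student, classifier, teachers, loader, alpha=0.5, epochs=3, lr=1e-4):
    """Stage 2: distill knowledge from existing (frozen) teacher adapters into the student."""
    for t in teachers:
        t.requires_grad_(False)
    opt = torch.optim.AdamW(list(student.parameters()) + list(classifier.parameters()), lr=lr)
    for _ in range(epochs):
        for h, y in loader:
            s_out = student(h)
            task_loss = F.cross_entropy(classifier(s_out), y)
            # One plausible distillation signal: match the student's adapter output
            # to each teacher adapter's output on the same hidden states.
            distill_loss = sum(F.mse_loss(s_out, t(h)) for t in teachers) / len(teachers)
            loss = (1 - alpha) * task_loss + alpha * distill_loss
            opt.zero_grad(); loss.backward(); opt.step()
```

Note that no fusion layer is introduced here: at inference only the student adapter is kept, which is consistent with the abstract's claim about avoiding the extra fusion layer's inference cost.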
Anthology ID:
2023.emnlp-industry.20
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month:
December
Year:
2023
Address:
Singapore
Editors:
Mingxuan Wang, Imed Zitouni
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
194–201
URL:
https://aclanthology.org/2023.emnlp-industry.20
DOI:
10.18653/v1/2023.emnlp-industry.20
Cite (ACL):
Junjie Wang, Yicheng Chen, Wangshu Zhang, Sen Hu, Teng Xu, and Jing Zheng. 2023. AdapterDistillation: Non-Destructive Task Composition with Knowledge Distillation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 194–201, Singapore. Association for Computational Linguistics.
Cite (Informal):
AdapterDistillation: Non-Destructive Task Composition with Knowledge Distillation (Wang et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-industry.20.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/2023.emnlp-industry.20.mp4