Meta-Learning Adaptive Knowledge Distillation for Efficient Biomedical Natural Language Processing

Abiola Obamuyide, Blair Johnston


Abstract
An increasing number of large, high-performing models have been made available for biomedical natural language processing tasks. While these models achieve impressive performance, their training and run-time costs can be computationally prohibitive. This work investigates knowledge distillation, a common model compression method, as a means of reducing the size of large models for biomedical natural language processing. We further improve knowledge distillation for biomedical natural language processing by proposing a meta-learning approach which adaptively learns, from the distillation data, parameters that control the rate of knowledge exchange between the teacher and student models during distillation. Experiments on two biomedical natural language processing tasks demonstrate that our proposed adaptive meta-learning approach to knowledge distillation delivers improved predictive performance over previous and recent state-of-the-art knowledge distillation methods.
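As a rough illustration only, not the authors' algorithm: a standard distillation objective mixes cross-entropy on gold labels with a KL term toward the teacher's temperature-softened predictions, and the mixing weight can itself be treated as a meta-parameter updated against a held-out batch. The sketch below assumes a PyTorch setting with toy linear models, a sigmoid-parameterised scalar weight alpha, and a first-order one-step lookahead; all of these choices are illustrative assumptions, not details taken from the paper.

```python
# Minimal, generic sketch of knowledge distillation with a meta-learned mixing
# weight. Hypothetical: the scalar `alpha`, the one-step lookahead, and the toy
# linear teacher/student are illustrative assumptions, not the paper's method.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, alpha, T=2.0):
    # Weighted mix of gold-label cross-entropy and KL divergence to the
    # teacher's temperature-softened distribution.
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T ** 2)
    return alpha * kd + (1.0 - alpha) * ce

teacher = torch.nn.Linear(32, 3)                  # stands in for a large model
student = torch.nn.Linear(32, 3)                  # stands in for a small model
alpha_logit = torch.zeros(1, requires_grad=True)  # meta-parameter
student_opt = torch.optim.SGD(student.parameters(), lr=1e-2)
meta_opt = torch.optim.Adam([alpha_logit], lr=1e-2)

x, y = torch.randn(16, 32), torch.randint(0, 3, (16,))          # distillation batch
x_val, y_val = torch.randn(16, 32), torch.randint(0, 3, (16,))  # held-out meta batch

for step in range(100):
    alpha = torch.sigmoid(alpha_logit)
    with torch.no_grad():
        t_logits = teacher(x)

    # Differentiable one-step lookahead of the student under the current alpha.
    inner = distillation_loss(student(x), t_logits, y, alpha)
    g_w, g_b = torch.autograd.grad(inner, [student.weight, student.bias],
                                   create_graph=True)
    fast_w, fast_b = student.weight - 1e-2 * g_w, student.bias - 1e-2 * g_b

    # Meta step: adjust alpha so the looked-ahead student does well on held-out data.
    meta_loss = F.cross_entropy(F.linear(x_val, fast_w, fast_b), y_val)
    meta_opt.zero_grad()
    meta_loss.backward()
    meta_opt.step()

    # Actual student update under the (just updated) mixing weight.
    loss = distillation_loss(student(x), t_logits, y,
                             torch.sigmoid(alpha_logit).detach())
    student_opt.zero_grad()
    loss.backward()
    student_opt.step()
```

The sigmoid keeps the mixing weight in (0, 1), and the one-step lookahead is a common first-order shortcut for meta-learning hyperparameters; the paper's actual parameterisation and update rule may differ.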
Anthology ID: 2022.findings-aacl.12
Volume: Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022
Month: November
Year: 2022
Address: Online only
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 131–137
URL: https://aclanthology.org/2022.findings-aacl.12
Cite (ACL): Abiola Obamuyide and Blair Johnston. 2022. Meta-Learning Adaptive Knowledge Distillation for Efficient Biomedical Natural Language Processing. In Findings of the Association for Computational Linguistics: AACL-IJCNLP 2022, pages 131–137, Online only. Association for Computational Linguistics.
Cite (Informal): Meta-Learning Adaptive Knowledge Distillation for Efficient Biomedical Natural Language Processing (Obamuyide & Johnston, Findings 2022)
PDF: https://preview.aclanthology.org/auto-file-uploads/2022.findings-aacl.12.pdf