TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing

Ziqing Yang, Yiming Cui, Zhipeng Chen, Wanxiang Che, Ting Liu, Shijin Wang, Guoping Hu


Abstract
In this paper, we introduce TextBrewer, an open-source knowledge distillation toolkit designed for natural language processing. It works with different neural network models and supports various kinds of supervised learning tasks, such as text classification, reading comprehension, and sequence labeling. TextBrewer provides a simple and uniform workflow that enables quick setup of distillation experiments with highly flexible configurations. It offers a set of predefined distillation methods and can be extended with custom code. As a case study, we use TextBrewer to distill BERT on several typical NLP tasks. With simple configurations, we achieve results that are comparable to or even better than public distilled BERT models with similar numbers of parameters.
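At the core of the distillation methods the toolkit predefines is the soft-label objective: the student is trained to match the teacher's temperature-softened output distribution. The sketch below illustrates that loss in plain Python; it is a minimal illustration of the standard formulation (Hinton et al., 2015), not TextBrewer's actual implementation, and the function names are our own.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: a higher T flattens the distribution,
    exposing the teacher's 'dark knowledge' about non-target classes."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                       # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """Cross-entropy between the teacher's and student's softened outputs.

    The T**2 factor keeps gradient magnitudes roughly comparable
    across different temperature settings.
    """
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    ce = -sum(p * math.log(q) for p, q in zip(p_t, p_s))
    return temperature ** 2 * ce
```

In practice this soft-label term is combined with a weighted hard-label (ground-truth) cross-entropy; the loss is minimized exactly when the student reproduces the teacher's distribution.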
Anthology ID:
2020.acl-demos.2
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations
Month:
July
Year:
2020
Address:
Online
Editors:
Asli Celikyilmaz, Tsung-Hsien Wen
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
9–16
URL:
https://aclanthology.org/2020.acl-demos.2
DOI:
10.18653/v1/2020.acl-demos.2
Cite (ACL):
Ziqing Yang, Yiming Cui, Zhipeng Chen, Wanxiang Che, Ting Liu, Shijin Wang, and Guoping Hu. 2020. TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations, pages 9–16, Online. Association for Computational Linguistics.
Cite (Informal):
TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing (Yang et al., ACL 2020)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2020.acl-demos.2.pdf
Video:
 http://slideslive.com/38928594
Code
 airaria/TextBrewer
Data
CMRC, CMRC 2018, CoNLL 2003, DRCD, SQuAD, XNLI