Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method
Shicheng Tan, Weng Lam Tam, Yuanchun Wang, Wenwen Gong, Shu Zhao, Peng Zhang, Jie Tang
Abstract
The large scale of pre-trained language models poses a challenge for deploying them on various devices, and there is growing interest in methods for compressing these models, particularly knowledge distillation. However, current knowledge distillation methods rely on the model's intermediate-layer features and on golden labels (also called hard labels), which usually require aligned model architectures and sufficient labeled data, respectively. Moreover, the vocabulary parameters are usually neglected by existing methods. To address these problems, we propose a general language model distillation (GLMD) method that performs two-stage word prediction distillation and vocabulary compression, which is simple yet shows surprisingly strong performance. Specifically, GLMD supports more general application scenarios: by dispensing with intermediate layers and golden labels, it removes the constraints on dimension and structure between models and the need for labeled datasets. Meanwhile, exploiting the long-tailed distribution of word frequencies in the data, GLMD compresses the vocabulary by decreasing its size rather than its dimensionality. Experimental results show that our method outperforms 25 state-of-the-art methods on the SuperGLUE benchmark, achieving an average score that surpasses the best method by 3%.
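As a rough illustration of the two ideas in the abstract, the sketch below distills only the word-prediction (vocabulary) logits with a KL-divergence loss over the teacher's soft predictions, using no intermediate-layer features or golden labels, and restricts the loss to a frequency-pruned vocabulary. This is a minimal sketch, not the authors' implementation: the helper names (`keep_most_frequent_tokens`, `word_prediction_distill_loss`), the temperature value, and the tensor shapes are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code) of logit-only distillation
# plus vocabulary compression by pruning rare tokens.
import torch
import torch.nn.functional as F


def keep_most_frequent_tokens(token_counts: torch.Tensor, keep_size: int) -> torch.Tensor:
    """Return ids of the `keep_size` most frequent tokens.

    Word frequencies are long-tailed, so a small subset of the vocabulary
    covers most of the corpus; pruning rare tokens shrinks the vocabulary
    parameters without reducing the embedding dimensionality.
    """
    return torch.topk(token_counts, keep_size).indices


def word_prediction_distill_loss(student_logits: torch.Tensor,
                                 teacher_logits: torch.Tensor,
                                 keep_ids: torch.Tensor,
                                 temperature: float = 2.0) -> torch.Tensor:
    """KL divergence between teacher and student word-prediction distributions.

    Only the teacher's soft predictions are used (no hard labels), and both
    distributions are restricted to the compressed vocabulary `keep_ids`.
    Logits are [batch, seq_len, vocab_size].
    """
    s = student_logits[..., keep_ids] / temperature
    t = teacher_logits[..., keep_ids] / temperature
    return F.kl_div(F.log_softmax(s, dim=-1),
                    F.log_softmax(t, dim=-1),
                    log_target=True,
                    reduction="batchmean") * temperature ** 2


if __name__ == "__main__":
    vocab_size, keep_size = 30000, 8000
    token_counts = torch.randint(0, 1_000_000, (vocab_size,)).float()  # toy corpus counts
    keep_ids = keep_most_frequent_tokens(token_counts, keep_size)
    student_logits = torch.randn(2, 16, vocab_size)
    teacher_logits = torch.randn(2, 16, vocab_size)
    loss = word_prediction_distill_loss(student_logits, teacher_logits, keep_ids)
    print(loss.item())
```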
- Anthology ID: 2023.findings-acl.614
- Volume: Findings of the Association for Computational Linguistics: ACL 2023
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 9678–9696
- URL: https://aclanthology.org/2023.findings-acl.614
- DOI: 10.18653/v1/2023.findings-acl.614
- Cite (ACL): Shicheng Tan, Weng Lam Tam, Yuanchun Wang, Wenwen Gong, Shu Zhao, Peng Zhang, and Jie Tang. 2023. Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method. In Findings of the Association for Computational Linguistics: ACL 2023, pages 9678–9696, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Are Intermediate Layers and Labels Really Necessary? A General Language Model Distillation Method (Tan et al., Findings 2023)
- PDF: https://preview.aclanthology.org/naacl24-info/2023.findings-acl.614.pdf