Abstract
Conventional knowledge distillation (KD) methods require access to the internal information of the teacher, e.g., its logits. However, such information may not always be accessible for large pre-trained language models (PLMs). In this work, we focus on decision-based KD for PLMs, where only the teacher's decisions (i.e., top-1 labels) are accessible. Considering the information gap between logits and decisions, we propose a novel method to estimate logits from decision distributions. Specifically, the decision distribution can both be derived theoretically as a function of the logits and estimated empirically with test-time data augmentation. By combining the theoretical and empirical estimates of the decision distribution, the estimation of the logits reduces to a simple root-finding problem. Extensive experiments show that our method significantly outperforms strong baselines on both natural language understanding and machine reading comprehension datasets.
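To make the pipeline concrete, below is a minimal sketch of the two steps the abstract describes, under illustrative assumptions: `teacher_top1` and `augment` are hypothetical stand-ins for a decision-only teacher API and a test-time augmentation, and a softmax is used as a placeholder for the paper's theoretically derived mapping from logits to decision distributions.

```python
# Minimal sketch of decision-based logit estimation, NOT the paper's exact
# formulation: `teacher_top1`, `augment`, and the softmax mapping below are
# illustrative assumptions.
import numpy as np
from scipy.optimize import root

def empirical_decision_distribution(x, teacher_top1, augment, num_classes, n=512):
    # Query the teacher's top-1 decision on n augmented copies of x and
    # count how often each class wins (test-time data augmentation).
    counts = np.zeros(num_classes)
    for _ in range(n):
        counts[teacher_top1(augment(x))] += 1
    return counts / n

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def estimate_logits(p_hat):
    # Solve decision_dist(z) = p_hat for z. Logits are identifiable only up
    # to an additive constant, so pin z[0] = 0 and solve the remaining K-1
    # equations with a numerical root finder.
    k = len(p_hat)
    eps = 1e-3
    p_hat = (p_hat + eps) / (1.0 + eps * k)  # smooth zero counts so a root exists

    def residual(z_rest):
        z = np.concatenate(([0.0], z_rest))
        return softmax(z)[1:] - p_hat[1:]

    sol = root(residual, x0=np.zeros(k - 1), method="hybr")
    return np.concatenate(([0.0], sol.x))
```

For instance, with `p_hat = np.array([0.7, 0.2, 0.1])`, `estimate_logits` returns a logit vector (first entry pinned to zero) whose softmax reproduces the smoothed empirical decision distribution.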
- Anthology ID: 2023.acl-long.738
- Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: ACL
- Publisher: Association for Computational Linguistics
- Pages: 13234–13248
- URL: https://aclanthology.org/2023.acl-long.738
- DOI: 10.18653/v1/2023.acl-long.738
- Cite (ACL): Qinhong Zhou, Zonghan Yang, Peng Li, and Yang Liu. 2023. Bridging the Gap between Decision and Logits in Decision-based Knowledge Distillation for Pre-trained Language Models. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 13234–13248, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Bridging the Gap between Decision and Logits in Decision-based Knowledge Distillation for Pre-trained Language Models (Zhou et al., ACL 2023)
- PDF: https://preview.aclanthology.org/ingest-acl-2023-videos/2023.acl-long.738.pdf