Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching

Kunbo Ding, Weijie Liu, Yuejian Fang, Zhe Zhao, Qi Ju, Xuefeng Yang, Rong Tian, Zhu Tao, Haoyan Liu, Han Guo, Xingyu Bai, Weiquan Mao, Yudong Li, Weigang Guo, Taiqiang Wu, Ningyuan Sun


Abstract
Previous studies have shown that cross-lingual knowledge distillation can significantly improve the performance of pre-trained models on cross-lingual similarity matching tasks. However, the student model in this approach must be large; otherwise, its performance drops sharply, making it impractical to deploy on memory-limited devices. To address this issue, we delve into cross-lingual knowledge distillation and propose a multi-stage distillation framework for constructing a small but high-performing cross-lingual model. In our framework, contrastive learning, bottleneck, and parameter-recurrent strategies are carefully combined to prevent performance from being compromised during compression. Experimental results demonstrate that our method can compress XLM-R and MiniLM by more than 50% while reducing performance by only about 1%.
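The abstract names three ingredients: embedding-level distillation from a multilingual teacher, a bottleneck in the student, and a contrastive objective. The sketch below illustrates one plausible way these pieces fit together; the encoder, dimensions, loss weighting, and the specific in-batch contrastive formulation are illustrative assumptions, not the authors' exact method (see their code repository for that).

```python
# Minimal, illustrative sketch of a cross-lingual distillation objective
# combining MSE distillation with a contrastive term. All module names,
# sizes, and hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BottleneckStudent(nn.Module):
    """Small student encoder whose output passes through a narrow
    bottleneck before being projected into the (assumed) teacher space."""
    def __init__(self, vocab_size=30000, hidden=384, bottleneck=128, teacher_dim=768):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, hidden)  # stand-in for a small transformer
        self.down = nn.Linear(hidden, bottleneck)         # bottleneck: compress
        self.up = nn.Linear(bottleneck, teacher_dim)      # map into teacher space

    def forward(self, token_ids):
        return self.up(torch.tanh(self.down(self.embed(token_ids))))

def distill_loss(student_emb, teacher_emb, temperature=0.05, alpha=0.5):
    """MSE distillation plus an in-batch contrastive (InfoNCE-style) term,
    treating aligned teacher embeddings as positives -- one plausible
    reading of the abstract, not the paper's exact loss."""
    mse = F.mse_loss(student_emb, teacher_emb)
    s = F.normalize(student_emb, dim=-1)
    t = F.normalize(teacher_emb, dim=-1)
    logits = s @ t.T / temperature        # similarity of every student/teacher pair
    labels = torch.arange(s.size(0))      # i-th student should match i-th teacher
    contrastive = F.cross_entropy(logits, labels)
    return alpha * mse + (1 - alpha) * contrastive

# Toy usage with random data; in practice teacher_emb would come from a
# frozen multilingual teacher (e.g., XLM-R) run on parallel sentences.
student = BottleneckStudent()
tokens = torch.randint(0, 30000, (8, 16))   # batch of 8 "sentences", 16 tokens each
teacher_emb = torch.randn(8, 768)           # frozen teacher embeddings
loss = distill_loss(student(tokens), teacher_emb)
loss.backward()
```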
Anthology ID:
2022.findings-naacl.167
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2171–2181
URL:
https://aclanthology.org/2022.findings-naacl.167
DOI:
10.18653/v1/2022.findings-naacl.167
Cite (ACL):
Kunbo Ding, Weijie Liu, Yuejian Fang, Zhe Zhao, Qi Ju, Xuefeng Yang, Rong Tian, Zhu Tao, Haoyan Liu, Han Guo, Xingyu Bai, Weiquan Mao, Yudong Li, Weigang Guo, Taiqiang Wu, and Ningyuan Sun. 2022. Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2171–2181, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Multi-stage Distillation Framework for Cross-Lingual Semantic Similarity Matching (Ding et al., Findings 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2022.findings-naacl.167.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/2022.findings-naacl.167.mp4
Code:
KB-Ding/Multi-stage-Distillaton-Framework