MATE-KD: Masked Adversarial TExt, a Companion to Knowledge Distillation

Ahmad Rashid, Vasileios Lioutas, Mehdi Rezagholizadeh


Abstract
The advent of large pre-trained language models has given rise to rapid progress in the field of Natural Language Processing (NLP). While the performance of these models on standard benchmarks has scaled with size, compression techniques such as knowledge distillation have been key to making them practical. We present MATE-KD, a novel text-based adversarial training algorithm that improves the performance of knowledge distillation. MATE-KD first trains a masked language model-based generator to perturb text by maximizing the divergence between teacher and student logits. The student is then trained via knowledge distillation on both the original and the perturbed training samples. We evaluate our algorithm on the GLUE benchmark using BERT-based models and demonstrate that MATE-KD outperforms competitive adversarial learning and data augmentation baselines. On the GLUE test set, our 6-layer RoBERTa-based model outperforms BERT-large.
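The abstract describes a two-step minimax loop: a generator is updated to make teacher and student disagree on masked-and-resampled inputs, and the student is then distilled on both clean and perturbed data. The following is a minimal PyTorch-style sketch of that loop, not the authors' released code; the toy classifier and generator modules, the 30% masking rate, and the use of Gumbel-softmax to sample replacement tokens differentiably are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM, CLASSES, B, SEQ = 100, 32, 2, 8, 16

class ToyClassifier(nn.Module):
    """Stand-in for a pre-trained teacher or a compact student."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)
        self.head = nn.Linear(DIM, CLASSES)

    def forward(self, ids=None, embeds=None):
        x = self.emb(ids) if embeds is None else embeds
        return self.head(x.mean(dim=1))  # class logits

    def embed_soft(self, token_probs):
        # (B, SEQ, VOCAB) soft one-hots -> (B, SEQ, DIM) embeddings
        return token_probs @ self.emb.weight

def kl(p_logits, q_logits, T=1.0):
    """KL(teacher || student) at temperature T."""
    return F.kl_div(F.log_softmax(q_logits / T, dim=-1),
                    F.softmax(p_logits / T, dim=-1),
                    reduction="batchmean")

teacher, student = ToyClassifier(), ToyClassifier()
generator = nn.Sequential(nn.Embedding(VOCAB, DIM), nn.Linear(DIM, VOCAB))
g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
s_opt = torch.optim.Adam(student.parameters(), lr=1e-3)

ids = torch.randint(0, VOCAB, (B, SEQ))      # toy batch of token ids
labels = torch.randint(0, CLASSES, (B,))

# Step 1 (maximization): update the generator so that teacher and
# student disagree on the perturbed (masked-and-resampled) input.
mask = torch.rand(B, SEQ) < 0.3              # assumed masking rate
gen_logits = generator(ids)                  # (B, SEQ, VOCAB)
sampled = F.gumbel_softmax(gen_logits, tau=1.0, hard=True)
original = F.one_hot(ids, VOCAB).float()
mixed = torch.where(mask.unsqueeze(-1), sampled, original)
g_loss = -kl(teacher(embeds=teacher.embed_soft(mixed)),
             student(embeds=student.embed_soft(mixed)))
g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Step 2 (minimization): distill the student on both the original and
# the perturbed samples, plus the supervised loss on the gold labels.
pert_ids = mixed.argmax(dim=-1)
with torch.no_grad():
    t_orig, t_pert = teacher(ids), teacher(pert_ids)
kd_loss = (kl(t_orig, student(ids)) +
           kl(t_pert, student(pert_ids)) +
           F.cross_entropy(student(ids), labels))
s_opt.zero_grad(); kd_loss.backward(); s_opt.step()

The soft-embedding path in step 1 matters: sampling discrete tokens would block gradients to the generator, so the straight-through Gumbel-softmax keeps the maximization step differentiable end to end.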
Anthology ID:
2021.acl-long.86
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
1062–1071
URL:
https://aclanthology.org/2021.acl-long.86
DOI:
10.18653/v1/2021.acl-long.86
Cite (ACL):
Ahmad Rashid, Vasileios Lioutas, and Mehdi Rezagholizadeh. 2021. MATE-KD: Masked Adversarial TExt, a Companion to Knowledge Distillation. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 1062–1071, Online. Association for Computational Linguistics.
Cite (Informal):
MATE-KD: Masked Adversarial TExt, a Companion to Knowledge Distillation (Rashid et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2021.acl-long.86.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/2021.acl-long.86.mp4
Data:
GLUE | PAWS | QNLI