Consistency Regularization for Cross-Lingual Fine-Tuning

Bo Zheng, Li Dong, Shaohan Huang, Wenhui Wang, Zewen Chi, Saksham Singhal, Wanxiang Che, Ting Liu, Xia Song, Furu Wei

Abstract
Fine-tuning pre-trained cross-lingual language models can transfer task-specific supervision from one language to the others. In this work, we propose to improve cross-lingual fine-tuning with consistency regularization. Specifically, we use example consistency regularization to penalize the prediction sensitivity to four types of data augmentations, i.e., subword sampling, Gaussian noise, code-switch substitution, and machine translation. In addition, we employ model consistency to regularize the models trained with two augmented versions of the same training set. Experimental results on the XTREME benchmark show that our method significantly improves cross-lingual fine-tuning across various tasks, including text classification, question answering, and sequence labeling.
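
To make the example-consistency idea concrete, below is a minimal PyTorch sketch, assuming a symmetric KL penalty between a model's predictions on an original input and on an augmented version of it. The function names, the usage comment, and the weighting hyperparameter are illustrative assumptions; the paper's actual xTune objective, augmentation pipeline, and two-stage model-consistency training are specified in the paper and code repository, not here.

```python
# Hypothetical sketch of example consistency regularization as described
# in the abstract: penalize the sensitivity of the model's predictions
# to a data augmentation (e.g., subword sampling, Gaussian noise,
# code-switch substitution, or machine translation).
# Names are illustrative, not the paper's API.
import torch.nn.functional as F

def example_consistency_loss(logits, logits_aug):
    """Symmetric KL divergence between the predictive distributions
    for the original and the augmented inputs."""
    log_p = F.log_softmax(logits, dim=-1)
    log_q = F.log_softmax(logits_aug, dim=-1)
    # KL(p || q): target probabilities p, input log-probabilities log_q.
    kl_pq = F.kl_div(log_q, log_p.exp(), reduction="batchmean")
    # KL(q || p), the reverse direction.
    kl_qp = F.kl_div(log_p, log_q.exp(), reduction="batchmean")
    return 0.5 * (kl_pq + kl_qp)

# Assumed overall objective: task loss on labeled examples plus the
# consistency penalty, weighted by a hyperparameter (illustrative):
# loss = task_loss + lam * example_consistency_loss(
#     model(batch).logits, model(augment(batch)).logits)
```

Model consistency, the second regularizer in the abstract, would follow the same pattern, except that the two distributions come from two models trained on different augmented versions of the same training set rather than from one model on two inputs.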
Anthology ID:
2021.acl-long.264
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
3403–3417
URL:
https://aclanthology.org/2021.acl-long.264
DOI:
10.18653/v1/2021.acl-long.264
Cite (ACL):
Bo Zheng, Li Dong, Shaohan Huang, Wenhui Wang, Zewen Chi, Saksham Singhal, Wanxiang Che, Ting Liu, Xia Song, and Furu Wei. 2021. Consistency Regularization for Cross-Lingual Fine-Tuning. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 3403–3417, Online. Association for Computational Linguistics.
Cite (Informal):
Consistency Regularization for Cross-Lingual Fine-Tuning (Zheng et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/landing_page/2021.acl-long.264.pdf
Video:
 https://preview.aclanthology.org/landing_page/2021.acl-long.264.mp4
Code:
bozheng-hit/xTune
Data:
MLQA | PAWS-X | TyDiQA | TyDiQA-GoldP | XNLI | XQuAD