Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer
Ziqing Yang, Wentao Ma, Yiming Cui, Jiani Ye, Wanxiang Che, Shijin Wang
Abstract
Multilingual pre-trained models have achieved remarkable performance on cross-lingual transfer learning. Some multilingual models, such as mBERT, are pre-trained on unlabeled corpora; therefore, the embeddings of different languages in these models may not be aligned very well. In this paper, we aim to improve zero-shot cross-lingual transfer performance by proposing a pre-training task named Word-Exchange Aligning Model (WEAM), which uses statistical alignment information as prior knowledge to guide cross-lingual word prediction. We evaluate our model on the multilingual machine reading comprehension task MLQA and the natural language inference task XNLI. The results show that WEAM can significantly improve zero-shot performance.
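The core idea described in the abstract is to use statistical word alignments as prior knowledge for a cross-lingual word-prediction objective. The snippet below is a minimal, hypothetical sketch of how such a word-exchange training example could be built from a parallel sentence pair and aligner output (e.g., from a tool like fast_align); the function name, the toy sentence pair, and the (source index, target index) alignment format are illustrative assumptions, not the authors' implementation.

```python
import random

def make_weam_style_example(src_tokens, tgt_tokens, alignment,
                            swap_prob=0.15, mask_token="[MASK]"):
    """Build a word-exchange style masked-prediction example (illustrative sketch).

    alignment: list of (src_idx, tgt_idx) pairs from a statistical word aligner,
               used here as the prior knowledge for cross-lingual prediction.
    Returns (masked_tokens, labels), where masked positions should be predicted
    as the aligned target-language word.
    """
    masked = list(src_tokens)
    labels = [None] * len(src_tokens)
    for s_idx, t_idx in alignment:
        if random.random() < swap_prob:
            masked[s_idx] = mask_token
            labels[s_idx] = tgt_tokens[t_idx]  # target word guides the prediction
    return masked, labels

# Toy usage with a hypothetical English-German pair and aligner output.
src = ["the", "cat", "sleeps"]
tgt = ["die", "Katze", "schläft"]
align = [(0, 0), (1, 1), (2, 2)]
print(make_weam_style_example(src, tgt, align, swap_prob=1.0))
```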
- Anthology ID:
- 2021.mrqa-1.10
- Volume:
- Proceedings of the 3rd Workshop on Machine Reading for Question Answering
- Month:
- November
- Year:
- 2021
- Address:
- Punta Cana, Dominican Republic
- Editors:
- Adam Fisch, Alon Talmor, Danqi Chen, Eunsol Choi, Minjoon Seo, Patrick Lewis, Robin Jia, Sewon Min
- Venue:
- MRQA
- Publisher:
- Association for Computational Linguistics
- Pages:
- 100–105
- URL:
- https://aclanthology.org/2021.mrqa-1.10
- DOI:
- 10.18653/v1/2021.mrqa-1.10
- Cite (ACL):
- Ziqing Yang, Wentao Ma, Yiming Cui, Jiani Ye, Wanxiang Che, and Shijin Wang. 2021. Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer. In Proceedings of the 3rd Workshop on Machine Reading for Question Answering, pages 100–105, Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal):
- Bilingual Alignment Pre-Training for Zero-Shot Cross-Lingual Transfer (Yang et al., MRQA 2021)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-1/2021.mrqa-1.10.pdf