Abstract
Multilingual pretrained language models (such as multilingual BERT) have achieved impressive results for cross-lingual transfer. However, because model capacity is held constant while covering many languages, multilingual pretraining usually lags behind its monolingual counterparts. In this work, we present two approaches that improve zero-shot cross-lingual classification by transferring knowledge from monolingual pretrained models to multilingual ones. Experimental results on two cross-lingual classification benchmarks show that our methods outperform vanilla multilingual fine-tuning.
- Anthology ID:
- 2020.aacl-main.2
- Volume:
- Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
- Month:
- December
- Year:
- 2020
- Address:
- Suzhou, China
- Venue:
- AACL
- Publisher:
- Association for Computational Linguistics
- Pages:
- 12–17
- URL:
- https://aclanthology.org/2020.aacl-main.2
- Cite (ACL):
- Zewen Chi, Li Dong, Furu Wei, Xianling Mao, and Heyan Huang. 2020. Can Monolingual Pretrained Models Help Cross-Lingual Classification? In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 12–17, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- Can Monolingual Pretrained Models Help Cross-Lingual Classification? (Chi et al., AACL 2020)
- PDF:
- https://preview.aclanthology.org/auto-file-uploads/2020.aacl-main.2.pdf
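The abstract mentions transferring knowledge from monolingual pretrained models to multilingual ones for zero-shot cross-lingual classification, but this page does not describe the two methods themselves. The sketch below is a minimal, hypothetical illustration of one such transfer setup, assuming a standard teacher-student distillation arrangement with Hugging Face `transformers`: a fine-tuned English teacher (the path `path/to/english-finetuned-teacher`, label count, temperature, and mixing weight are placeholders) guides a multilingual BERT student on English task data, after which the student is applied zero-shot to other languages. It is not the paper's actual method.

```python
# Hypothetical sketch of monolingual-to-multilingual knowledge transfer via
# distillation; NOT the paper's exact approach.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModelForSequenceClassification

NUM_LABELS = 3  # illustrative, e.g. XNLI-style labels

# Teacher: an English model already fine-tuned on the task (placeholder path).
teacher_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
teacher = AutoModelForSequenceClassification.from_pretrained(
    "path/to/english-finetuned-teacher", num_labels=NUM_LABELS).eval()

# Student: a multilingual encoder fine-tuned with distillation on English data.
student_tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
student = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-multilingual-cased", num_labels=NUM_LABELS)

optimizer = torch.optim.AdamW(student.parameters(), lr=2e-5)
T, alpha = 2.0, 0.5  # illustrative distillation temperature and mixing weight

def train_step(texts, labels):
    """One distillation step on an English mini-batch (texts: list of strings)."""
    labels = torch.tensor(labels)
    with torch.no_grad():
        t_inputs = teacher_tok(texts, padding=True, truncation=True, return_tensors="pt")
        t_logits = teacher(**t_inputs).logits
    s_inputs = student_tok(texts, padding=True, truncation=True, return_tensors="pt")
    s_logits = student(**s_inputs).logits
    # Soft targets from the teacher plus hard cross-entropy on gold labels.
    kd = F.kl_div(F.log_softmax(s_logits / T, dim=-1),
                  F.softmax(t_logits / T, dim=-1),
                  reduction="batchmean") * T * T
    ce = F.cross_entropy(s_logits, labels)
    loss = alpha * kd + (1 - alpha) * ce
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

# After training on English data, the multilingual student is evaluated
# directly (zero-shot) on the same task in other languages.
```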