Cross-lingual Transfer of Monolingual Models
Evangelia Gogoulou, Ariel Ekgren, Tim Isbister, Magnus Sahlgren
Abstract
Recent studies in cross-lingual learning using multilingual models have cast doubt on the previous hypothesis that shared vocabulary and joint pre-training are the keys to cross-lingual generalization. We introduce a method for transferring monolingual models to other languages through continuous pre-training and study the effects of such transfer from four different languages to English. Our experimental results on GLUE show that the transferred models outperform an English model trained from scratch, independently of the source language. After probing the model representations, we find that model knowledge from the source language enhances the learning of syntactic and semantic knowledge in English.
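The abstract describes the method only at a high level: a monolingual source-language model is transferred to English by continued (masked language model) pre-training on English data, then evaluated on GLUE. The sketch below shows what such a continued pre-training step could look like with HuggingFace Transformers; the source checkpoint and English corpus are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of continued MLM pre-training to transfer a monolingual
# model to English. Assumes HuggingFace Transformers and Datasets; the
# checkpoint and corpus names are illustrative placeholders, not the
# authors' exact setup.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

SOURCE_CKPT = "bert-base-german-cased"  # hypothetical source-language model

model = AutoModelForMaskedLM.from_pretrained(SOURCE_CKPT)
tokenizer = AutoTokenizer.from_pretrained(SOURCE_CKPT)

# English target-language corpus (placeholder choice).
raw = load_dataset("wikitext", "wikitext-103-raw-v1", split="train")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = raw.map(tokenize, batched=True, remove_columns=["text"])

# Standard 15% token masking for the MLM objective.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="transferred-to-english",
    per_device_train_batch_size=32,
    num_train_epochs=1,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()
```

In the paper's setting, this transfer step is carried out from four different source languages before the resulting models are fine-tuned and evaluated on GLUE.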
- Anthology ID: 2022.lrec-1.100
- Volume: Proceedings of the Thirteenth Language Resources and Evaluation Conference
- Month: June
- Year: 2022
- Address: Marseille, France
- Editors: Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
- Venue: LREC
- Publisher: European Language Resources Association
- Pages: 948–955
- URL: https://aclanthology.org/2022.lrec-1.100
- Cite (ACL): Evangelia Gogoulou, Ariel Ekgren, Tim Isbister, and Magnus Sahlgren. 2022. Cross-lingual Transfer of Monolingual Models. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 948–955, Marseille, France. European Language Resources Association.
- Cite (Informal): Cross-lingual Transfer of Monolingual Models (Gogoulou et al., LREC 2022)
- PDF: https://aclanthology.org/2022.lrec-1.100.pdf
- Data: GLUE