Abstract
Pretrained multilingual Transformers have achieved great success in cross-lingual transfer learning. Current methods typically activate the cross-lingual transferability of multilingual Transformers by fine-tuning them on end-task data; however, these methods cannot perform cross-lingual transfer when end-task data are unavailable. In this work, we explore whether cross-lingual transferability can be activated without end-task data. We propose a cross-lingual transfer method named PlugIn-X. PlugIn-X disassembles monolingual and multilingual Transformers into sub-modules and reassembles them into a multilingual end-task model. After representation adaptation, PlugIn-X performs cross-lingual transfer in a plug-and-play style. Experimental results show that PlugIn-X successfully activates the cross-lingual transferability of multilingual Transformers without accessing end-task data. Moreover, we analyze how cross-model representation alignment affects cross-lingual transferability.
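To make the plug-and-play idea concrete, the following PyTorch-style sketch shows one way sub-modules from a multilingual model and a fine-tuned monolingual task model could be reassembled into a multilingual end-task model. The layer-wise split, the single linear adapter, and all module names here are assumptions for illustration only; the abstract does not specify PlugIn-X's actual decomposition or adaptation procedure.

```python
# A minimal sketch of the plug-and-play assembly described in the abstract.
# The layer split and single-linear adapter are illustrative assumptions,
# not the paper's actual design.
import torch
import torch.nn as nn


class PluggedEndTaskModel(nn.Module):
    """Reassembled end-task model: a multilingual lower encoder supplies
    cross-lingual representations; a monolingual upper encoder and task
    head (already fine-tuned on the end task) consume them via an adapter."""

    def __init__(self, multilingual_lower, adapter, monolingual_upper, task_head):
        super().__init__()
        self.multilingual_lower = multilingual_lower  # frozen multilingual sub-module
        self.adapter = adapter                        # trained to align representation spaces
        self.monolingual_upper = monolingual_upper    # frozen monolingual sub-module
        self.task_head = task_head                    # frozen end-task classifier head

    def forward(self, input_ids, attention_mask):
        h = self.multilingual_lower(input_ids, attention_mask)  # multilingual features
        h = self.adapter(h)                                     # representation adaptation
        h = self.monolingual_upper(h, attention_mask)           # task-specialized layers
        return self.task_head(h[:, 0])                          # predict from first token


# The adapter could be as simple as a linear map between hidden spaces,
# trained on raw parallel or monolingual text rather than end-task labels.
adapter = nn.Linear(768, 768)
```

Under this reading, only the adapter would require training, and that training would need no end-task labels, which matches the no-end-task-data setting the abstract describes.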
- Anthology ID: 2023.findings-acl.796
- Volume: Findings of the Association for Computational Linguistics: ACL 2023
- Month: July
- Year: 2023
- Address: Toronto, Canada
- Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 12572–12584
- URL: https://aclanthology.org/2023.findings-acl.796
- DOI: 10.18653/v1/2023.findings-acl.796
- Cite (ACL): Zewen Chi, Heyan Huang, and Xian-Ling Mao. 2023. Can Cross-Lingual Transferability of Multilingual Transformers Be Activated Without End-Task Data?. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12572–12584, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal): Can Cross-Lingual Transferability of Multilingual Transformers Be Activated Without End-Task Data? (Chi et al., Findings 2023)
- PDF: https://preview.aclanthology.org/nschneid-patch-1/2023.findings-acl.796.pdf