Abstract
Cross-lingual transfer has recently been tackled through modular, parameter-efficient fine-tuning methods that allow arbitrary combinations of language and task modules for transfer of any task to any language. Concurrently, task arithmetic has emerged as a powerful and modular tool for editing pretrained models using multiple full fine-tunings. In this work, we connect the paradigms of task arithmetic and cross-lingual transfer, demonstrating that modularity for cross-lingual transfer can be achieved even with full model fine-tuning. Our approach displays strong performance on a range of multilingual benchmarks encompassing both high-resource and low-resource languages.
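For readers unfamiliar with task arithmetic, the sketch below illustrates the standard task-vector recipe on which such approaches build: a "vector" is the element-wise difference between a fully fine-tuned checkpoint and the pretrained base, and vectors can be added back to the base with scaling coefficients. The checkpoints, language/task names, and coefficients are illustrative placeholders, not the exact setup used in the paper.

```python
# Minimal sketch of task arithmetic for cross-lingual transfer.
# Assumption: the generic task-vector recipe (difference of full fine-tuning
# and the pretrained base), with random tensors standing in for real checkpoints.
import torch


def diff_vector(finetuned: dict, base: dict) -> dict:
    """Task/language vector: fine-tuned weights minus pretrained weights."""
    return {k: finetuned[k] - base[k] for k in base}


def apply_vectors(base: dict, vectors: list, coeffs: list) -> dict:
    """Add scaled vectors to the pretrained base to obtain the transfer model."""
    merged = {k: v.clone() for k, v in base.items()}
    for vec, lam in zip(vectors, coeffs):
        for k in merged:
            merged[k] += lam * vec[k]
    return merged


# Stand-in state dicts (random tensors) in place of real model checkpoints.
base = {"w": torch.randn(4, 4)}
task_en = {"w": base["w"] + 0.1 * torch.randn(4, 4)}  # task fine-tuned in English
lang_xx = {"w": base["w"] + 0.1 * torch.randn(4, 4)}  # fine-tuned on target-language text

task_vec = diff_vector(task_en, base)  # "what the task adds"
lang_vec = diff_vector(lang_xx, base)  # "what the target language adds"

# Combine both vectors on top of the base; the coefficients are tunable hyperparameters.
transfer_model = apply_vectors(base, [task_vec, lang_vec], coeffs=[1.0, 1.0])
```

In this view, modular cross-lingual transfer amounts to composing independently obtained language and task vectors at the level of full model weights rather than dedicated adapter modules.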
- Anthology ID: 2024.eacl-short.12
- Volume: Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
- Month: March
- Year: 2024
- Address: St. Julian’s, Malta
- Editors: Yvette Graham, Matthew Purver
- Venue: EACL
- Publisher: Association for Computational Linguistics
- Pages: 124–137
- URL: https://aclanthology.org/2024.eacl-short.12
- Cite (ACL): Marinela Parović, Ivan Vulić, and Anna Korhonen. 2024. Investigating the Potential of Task Arithmetic for Cross-Lingual Transfer. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 124–137, St. Julian’s, Malta. Association for Computational Linguistics.
- Cite (Informal): Investigating the Potential of Task Arithmetic for Cross-Lingual Transfer (Parović et al., EACL 2024)
- PDF: https://preview.aclanthology.org/naacl-24-ws-corrections/2024.eacl-short.12.pdf