Abstract
Large pretrained multilingual models, trained on dozens of languages, have delivered promising results on a variety of language tasks thanks to their cross-lingual learning capabilities. Further adapting these models to specific languages, especially ones unseen during pre-training, is an important goal toward expanding the coverage of language technologies. In this study, we show how language phylogenetic information can be used to improve cross-lingual transfer, leveraging closely related languages in a structured, linguistically-informed manner. We perform adapter-based training on languages from diverse language families (Germanic, Uralic, Tupian, Uto-Aztecan) and evaluate on both syntactic and semantic tasks, obtaining more than 20% relative performance improvements over strong, commonly used baselines, especially on languages unseen during pre-training.
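The abstract describes adapter-based adaptation guided by a language phylogeny. A minimal sketch of how such a setup could look with the AdapterHub `adapters` library is shown below, stacking one adapter per node on the path from a language family down to the target language; the base model, adapter names, and config are illustrative assumptions, not the paper's exact recipe.

```python
# Sketch: phylogeny-style adapter stacking with the AdapterHub `adapters`
# library. Adapter names ("uralic", "finnic", "est") and xlm-roberta-base
# are hypothetical choices for illustration.
from transformers import AutoModelForMaskedLM
import adapters
from adapters.composition import Stack

model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")
adapters.init(model)  # make the base model adapter-capable

# One bottleneck adapter per node on the family -> genus -> language path.
for name in ["uralic", "finnic", "est"]:
    model.add_adapter(name, config="seq_bn")

# Train each level on unlabeled text of the corresponding language group,
# coarse to fine; train_adapter() freezes all weights except the named adapter.
model.train_adapter("uralic")
# ... run masked-LM training on Uralic-family text here ...

# For the target language, activate the whole hierarchy as a stack so the
# language adapter composes with its ancestors' adapters.
model.set_active_adapters(Stack("uralic", "finnic", "est"))
```

Under these assumptions, a downstream task head would then be trained on a high-resource relative with the same stack active, and evaluated zero-shot on the unseen target language.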
- Anthology ID:
- 2022.aacl-main.34
- Volume:
- Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
- Month:
- November
- Year:
- 2022
- Address:
- Online only
- Editors:
- Yulan He, Heng Ji, Sujian Li, Yang Liu, Chia-Hui Chang
- Venues:
- AACL | IJCNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 434–452
- URL:
- https://preview.aclanthology.org/icon-24-ingestion/2022.aacl-main.34/
- DOI:
- 10.18653/v1/2022.aacl-main.34
- Cite (ACL):
- Fahim Faisal and Antonios Anastasopoulos. 2022. Phylogeny-Inspired Adaptation of Multilingual Models to New Languages. In Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 434–452, Online only. Association for Computational Linguistics.
- Cite (Informal):
- Phylogeny-Inspired Adaptation of Multilingual Models to New Languages (Faisal & Anastasopoulos, AACL-IJCNLP 2022)
- PDF:
- https://preview.aclanthology.org/icon-24-ingestion/2022.aacl-main.34.pdf