Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages

Wietse de Vries, Martijn Wieling, Malvina Nissim

Abstract
Cross-lingual transfer learning with large multilingual pre-trained models can be an effective approach for low-resource languages with no labeled training data. Existing evaluations of the zero-shot cross-lingual generalisability of large pre-trained models use datasets with English training data and test data in a selection of target languages. We explore a more extensive transfer learning setup with 65 different source languages and 105 target languages for part-of-speech tagging. Through our analysis, we show that cross-lingual performance is significantly affected by whether the source and target languages were seen during pre-training, whether they share a language family, writing system, and word order system, and by their lexical-phonetic distance. The findings described in this paper can be used as indicators of which factors are important for effective zero-shot cross-lingual transfer to zero- and low-resource languages.
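The setup described above is the standard zero-shot transfer recipe: fine-tune a multilingual encoder on POS-labelled data in a single source language, then evaluate it directly on a target language whose labels were never seen during training. The following is a minimal sketch of that recipe, not the authors' pipeline (their code is in the wietsedv/xpos repository linked below). It assumes HuggingFace transformers and a datasets version that still ships the script-based universal_dependencies loader; the model name (xlm-roberta-base), treebank IDs (nl_alpino as source, de_gsd as target), and the encode/accuracy helpers are all illustrative choices, not the paper's exact configuration.

from datasets import load_dataset
from transformers import (
    AutoModelForTokenClassification,
    AutoTokenizer,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

MODEL = "xlm-roberta-base"  # assumed multilingual encoder
SOURCE = "nl_alpino"        # illustrative source treebank (Dutch)
TARGET = "de_gsd"           # illustrative target treebank (German)

tokenizer = AutoTokenizer.from_pretrained(MODEL)

def encode(batch):
    # Tokenize pre-split words and align UPOS labels with subword tokens;
    # special tokens and word continuations get the ignore index -100.
    enc = tokenizer(batch["tokens"], is_split_into_words=True, truncation=True)
    enc["labels"] = []
    for i, upos in enumerate(batch["upos"]):
        prev, row = None, []
        for wid in enc.word_ids(batch_index=i):
            row.append(-100 if wid is None or wid == prev else upos[wid])
            prev = wid
        enc["labels"].append(row)
    return enc

source = load_dataset("universal_dependencies", SOURCE, trust_remote_code=True)
target = load_dataset("universal_dependencies", TARGET, trust_remote_code=True)
upos = source["train"].features["upos"].feature  # UD's shared UPOS tag set

model = AutoModelForTokenClassification.from_pretrained(
    MODEL, num_labels=upos.num_classes
)

def accuracy(eval_pred):
    # Token-level accuracy over positions that carry a real label.
    logits, gold = eval_pred
    mask = gold != -100
    return {"accuracy": float((logits.argmax(-1)[mask] == gold[mask]).mean())}

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="pos-transfer", num_train_epochs=1,
                           per_device_train_batch_size=16, report_to="none"),
    train_dataset=source["train"].map(encode, batched=True),
    data_collator=DataCollatorForTokenClassification(tokenizer),
    compute_metrics=accuracy,
)
trainer.train()  # fine-tune on the source language only

# Zero-shot step: evaluate on the target language; no target labels were used.
print(trainer.evaluate(target["test"].map(encode, batched=True)))

With Dutch as source and German as target, both languages are covered by the encoder's pre-training data and share a language family and writing system, so transfer should be comparatively strong; substituting a typologically distant or unseen target language reproduces the zero- and low-resource condition the paper analyses.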
Anthology ID:
2022.acl-long.529
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7676–7685
URL:
https://aclanthology.org/2022.acl-long.529
DOI:
10.18653/v1/2022.acl-long.529
Cite (ACL):
Wietse de Vries, Martijn Wieling, and Malvina Nissim. 2022. Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7676–7685, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages (de Vries et al., ACL 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-1/2022.acl-long.529.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-1/2022.acl-long.529.mp4
Code:
wietsedv/xpos