Meeting the Needs of Low-Resource Languages: The Value of Automatic Alignments via Pretrained Models

Abteen Ebrahimi, Arya D. McCarthy, Arturo Oncevay, John E. Ortega, Luis Chiruzzo, Gustavo Giménez-Lugo, Rolando Coto-Solano, Katharina Kann


Abstract
Large multilingual models have inspired a new class of word alignment methods, which work well for the model’s pretraining languages. However, the languages most in need of automatic alignment are low-resource and, thus, not typically included in the pretraining data. In this work, we ask: How do modern aligners perform on unseen languages, and are they better than traditional methods? We contribute gold-standard alignments for Bribri–Spanish, Guarani–Spanish, Quechua–Spanish, and Shipibo-Konibo–Spanish. With these, we evaluate state-of-the-art aligners with and without model adaptation to the target language. Finally, we also evaluate the resulting alignments extrinsically through two downstream tasks: named entity recognition and part-of-speech tagging. We find that although transformer-based methods generally outperform traditional models, the two classes of approach remain competitive with each other.
Anthology ID:
2023.eacl-main.280
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3912–3926
URL:
https://aclanthology.org/2023.eacl-main.280
DOI:
10.18653/v1/2023.eacl-main.280
Cite (ACL):
Abteen Ebrahimi, Arya D. McCarthy, Arturo Oncevay, John E. Ortega, Luis Chiruzzo, Gustavo Giménez-Lugo, Rolando Coto-Solano, and Katharina Kann. 2023. Meeting the Needs of Low-Resource Languages: The Value of Automatic Alignments via Pretrained Models. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 3912–3926, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Meeting the Needs of Low-Resource Languages: The Value of Automatic Alignments via Pretrained Models (Ebrahimi et al., EACL 2023)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2023.eacl-main.280.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/2023.eacl-main.280.mp4