Pointer-Generator Networks for Low-Resource Machine Translation: Don’t Copy That!

Niyati Bafna, Philipp Koehn, David Yarowsky


Abstract
While Transformer-based neural machine translation (NMT) is very effective in high-resource settings, many languages lack the large parallel corpora necessary to benefit from it. In the context of low-resource (LR) MT between two closely related languages, a natural intuition is to seek benefits from structural “shortcuts”, such as copying subwords from the source to the target, given that such language pairs often share a considerable number of identical words, cognates, and borrowings. We test Pointer-Generator Networks for this purpose on six language pairs over a variety of resource ranges, and find weak improvements for most settings. However, analysis shows that the models do not improve more for closely related than for more distant language pairs, nor for lower resource ranges, and that they do not exhibit the expected use of the copy mechanism for shared subwords. Our discussion of the reasons for this behaviour highlights several general challenges for LR NMT, such as modern tokenization strategies, noisy real-world conditions, and linguistic complexities. We call for better scrutiny of linguistically motivated improvements to NMT given the black-box nature of Transformer models, as well as for a focus on the above problems in the field.
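The copying mechanism the abstract refers to mixes the decoder's vocabulary distribution with an attention-induced copy distribution over source subwords. The following minimal sketch (not the paper's implementation; the function name, toy vocabulary, and values are illustrative assumptions) shows how the final output distribution of a pointer-generator network is typically formed:

```python
import numpy as np

def pointer_generator_distribution(p_vocab, attention, src_token_ids, p_gen):
    """Mix the generator's vocabulary distribution with a copy
    distribution induced by attention over source subwords.

    p_vocab:       (V,) generation distribution over the target vocabulary
    attention:     (S,) attention weights over the S source positions
    src_token_ids: (S,) vocabulary id of each source subword
    p_gen:         scalar in [0, 1], probability of generating vs. copying
    """
    p_final = p_gen * np.asarray(p_vocab, dtype=float)
    # Scatter-add copy probability mass onto the vocabulary ids of the
    # source subwords; a subword shared between source and target
    # receives mass from both the generate and the copy path.
    np.add.at(p_final, src_token_ids,
              (1.0 - p_gen) * np.asarray(attention, dtype=float))
    return p_final

# Toy example: vocabulary of 5 subwords, source sentence of 3 subwords,
# two of which are the same subword (vocab id 2).
p_vocab = np.array([0.1, 0.2, 0.3, 0.25, 0.15])
attention = np.array([0.5, 0.3, 0.2])
src_ids = np.array([2, 2, 4])
p = pointer_generator_distribution(p_vocab, attention, src_ids, p_gen=0.7)
# p sums to 1; ids 2 and 4 gain probability from the copy path.
```

Since both input distributions sum to one, the mixture is itself a valid distribution; the intuition tested in the paper is that for related language pairs the copy path should be used heavily for shared subwords, cognates, and borrowings.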
Anthology ID:
2024.insights-1.9
Volume:
Proceedings of the Fifth Workshop on Insights from Negative Results in NLP
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Shabnam Tafreshi, Arjun Akula, João Sedoc, Aleksandr Drozd, Anna Rogers, Anna Rumshisky
Venues:
insights | WS
Publisher:
Association for Computational Linguistics
Pages:
60–72
URL:
https://aclanthology.org/2024.insights-1.9
DOI:
10.18653/v1/2024.insights-1.9
Cite (ACL):
Niyati Bafna, Philipp Koehn, and David Yarowsky. 2024. Pointer-Generator Networks for Low-Resource Machine Translation: Don’t Copy That!. In Proceedings of the Fifth Workshop on Insights from Negative Results in NLP, pages 60–72, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Pointer-Generator Networks for Low-Resource Machine Translation: Don’t Copy That! (Bafna et al., insights-WS 2024)
PDF:
https://aclanthology.org/2024.insights-1.9.pdf