Impacts of Approaches for Agglutinative-LRL Neural Machine Translation (NMT): A Case Study on Manipuri-English Pair

Gourashyam Moirangthem, Lavinia Nongbri, Samarendra Singh Salam, Kishorjit Nongmeikapam


Abstract
Neural Machine Translation (NMT) is known to be extremely challenging for Low-Resource Languages (LRLs) with complex morphology. This work deals with the NMT of a specific LRL, Manipuri/Meeteilon, a highly agglutinative language in which words undergo extensive suffixation with limited prefixation. The work studies and discusses the impacts of approaches for mitigating the issues of NMT involving an agglutinative LRL in a strictly low-resource setting. It experiments with several methods and techniques, including subword tokenization, tuning of the self-attention-based NMT model, utilization of a monolingual corpus through iterative back-translation, and embedding-based sentence filtering for back-translation. In a strictly low-resource setting of only 21,204 training sentences, the work achieves a remarkable BLEU score of 28.17 for Manipuri-to-English translation.
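The abstract names its techniques without showing them; as an illustration of one, below is a minimal Python sketch of embedding-based sentence filtering for back-translated pairs. The choice of LaBSE as the cross-lingual encoder and the 0.7 similarity threshold are assumptions for illustration only; the abstract does not specify which embedding model or cutoff the authors used.

```python
# A minimal sketch of embedding-based filtering for back-translation.
# Assumptions (not from the paper): LaBSE as the cross-lingual sentence
# encoder and a 0.7 cosine-similarity cutoff are illustrative choices.
from sentence_transformers import SentenceTransformer, util


def filter_backtranslated_pairs(src_sents, synth_tgt_sents, threshold=0.7):
    """Keep synthetic (source, back-translated) pairs whose cross-lingual
    sentence embeddings are sufficiently similar."""
    model = SentenceTransformer("sentence-transformers/LaBSE")
    src_emb = model.encode(src_sents, convert_to_tensor=True)
    tgt_emb = model.encode(synth_tgt_sents, convert_to_tensor=True)
    # Cosine similarity between each sentence and its own back-translation
    # sits on the diagonal of the pairwise similarity matrix.
    sims = util.cos_sim(src_emb, tgt_emb).diagonal()
    return [
        (s, t)
        for s, t, sim in zip(src_sents, synth_tgt_sents, sims)
        if sim.item() >= threshold
    ]
```

Dropping low-similarity synthetic pairs trades corpus size for quality, which matters most in an iterative back-translation loop, where noisy pairs would otherwise compound across rounds.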
Anthology ID:
2023.icon-1.19
Volume:
Proceedings of the 20th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2023
Address:
Goa University, Goa, India
Editors:
Jyoti D. Pawar, Sobha Lalitha Devi
Venue:
ICON
SIG:
SIGLEX
Publisher:
NLP Association of India (NLPAI)
Pages:
191–201
URL:
https://aclanthology.org/2023.icon-1.19
Cite (ACL):
Gourashyam Moirangthem, Lavinia Nongbri, Samarendra Singh Salam, and Kishorjit Nongmeikapam. 2023. Impacts of Approaches for Agglutinative-LRL Neural Machine Translation (NMT): A Case Study on Manipuri-English Pair. In Proceedings of the 20th International Conference on Natural Language Processing (ICON), pages 191–201, Goa University, Goa, India. NLP Association of India (NLPAI).
Cite (Informal):
Impacts of Approaches for Agglutinative-LRL Neural Machine Translation (NMT): A Case Study on Manipuri-English Pair (Moirangthem et al., ICON 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2023.icon-1.19.pdf