Towards Inducing Long-Context Abilities in Multilingual Neural Machine Translation Models

Varun Gumma, Pranjal A Chitale, Kalika Bali


Abstract
Neural Machine Translation (NMT) models have traditionally used Sinusoidal Positional Embeddings (PEs), which often struggle to capture long-range dependencies and are inefficient for handling extended context or document-level translation tasks. This work addresses the challenge of transitioning pre-trained NMT models from absolute Sinusoidal PEs to Relative PEs, such as RoPE and ALiBi, without compromising performance. We demonstrate that parameter-efficient fine-tuning, using only a small amount of high-quality data, can successfully facilitate this transition. Experimental results indicate that switching from Sinusoidal to Relative PEs results in competitive translation quality on sentence-level evaluation benchmarks. Additionally, models trained with RoPE consistently outperform those using ALiBi and Sinusoidal PEs on document-level benchmarks across both string-based metrics and qualitative evaluations. Moreover, we find that a small amount of long-context data in a few languages is sufficient for cross-lingual length generalization, thereby inducing long-context capabilities.
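For readers unfamiliar with the two relative schemes named in the abstract, the sketch below illustrates the standard formulations from the original RoPE and ALiBi papers: RoPE rotates each even/odd channel pair of the queries and keys by a position-dependent angle before the attention dot product, while ALiBi adds a per-head linear distance penalty to the pre-softmax attention logits. This is a minimal illustrative sketch assuming common defaults (base 10000, a geometric slope schedule, symmetric distances for a bidirectional encoder); it is not the authors' implementation.

```python
# Minimal sketch of RoPE and ALiBi (illustrative only, not the paper's code).
import torch

def rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply Rotary Positional Embeddings to x of shape (batch, seq_len, dim).

    Each (even, odd) channel pair is rotated by an angle that grows with the
    token position, so relative offsets are encoded in the q·k dot product.
    dim must be even; base=10000 is the conventional default.
    """
    b, t, d = x.shape
    inv_freq = base ** (-torch.arange(0, d, 2, dtype=torch.float32) / d)  # (d/2,)
    angles = torch.arange(t, dtype=torch.float32)[:, None] * inv_freq     # (t, d/2)
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[..., 0::2], x[..., 1::2]          # split into channel pairs
    out = torch.empty_like(x)
    out[..., 0::2] = x1 * cos - x2 * sin         # 2-D rotation of each pair
    out[..., 1::2] = x1 * sin + x2 * cos
    return out

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    """Return the (n_heads, seq_len, seq_len) ALiBi bias added to logits.

    Uses the geometric slope schedule 2^(-8k/n_heads) and symmetric |i - j|
    distances (a causal decoder would instead mask and use j - i).
    """
    slopes = 2.0 ** (-8.0 * torch.arange(1, n_heads + 1, dtype=torch.float32) / n_heads)
    pos = torch.arange(seq_len)
    dist = (pos[None, :] - pos[:, None]).abs().float()                    # (t, t)
    return -slopes[:, None, None] * dist                                  # (h, t, t)
```

In use, `rope` would be applied to the query and key tensors just before computing attention scores, whereas `alibi_bias` would be added to the score matrix before the softmax; swapping a pre-trained model from sinusoidal inputs to either scheme is what the paper's fine-tuning recipe addresses.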
Anthology ID:
2025.naacl-long.366
Volume:
Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
April
Year:
2025
Address:
Albuquerque, New Mexico
Editors:
Luis Chiruzzo, Alan Ritter, Lu Wang
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
7158–7170
URL:
https://preview.aclanthology.org/fix-sig-urls/2025.naacl-long.366/
Cite (ACL):
Varun Gumma, Pranjal A Chitale, and Kalika Bali. 2025. Towards Inducing Long-Context Abilities in Multilingual Neural Machine Translation Models. In Proceedings of the 2025 Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 7158–7170, Albuquerque, New Mexico. Association for Computational Linguistics.
Cite (Informal):
Towards Inducing Long-Context Abilities in Multilingual Neural Machine Translation Models (Gumma et al., NAACL 2025)
PDF:
https://preview.aclanthology.org/fix-sig-urls/2025.naacl-long.366.pdf