Sajay Raj


2025
Challenge Track: LoRAs in All Directions: Directional Adapters and Noisy-Channel Reranking for Indic MT
Sajay Raj
Proceedings of the 1st Workshop on Multimodal Models for Low-Resource Contexts and Social Impact (MMLoSo 2025)

Low-resource machine translation for Indic languages remains challenging, especially when high-resource languages such as Hindi and English must be translated to and from very low-resource, grammatically rich languages like Bhili, Mundari, Santali, and Gondi. We describe our winning system for the MMLoSo 2025 Shared Task in this setting. We start from a strong pretrained Indic MT backbone, IndicTrans2, and fine-tune it jointly on all translation directions, pushing the model close to memorization under strict data constraints. On top of this backbone, we add direction-specific low-rank adapters (LoRA) that allow each language pair to specialize while still sharing most parameters. At inference time, we further couple these directional adapters through a noisy-channel objective, in which forward and reverse models jointly score a set of candidate translations, encouraging outputs that are both fluent in the target language and informative about the source. This combination of shared pretraining, directional parameter-efficient adaptation, and noisy-channel reranking substantially improves over a strong fine-tuned baseline and achieves the top overall score on the shared-task leaderboard. We release our codebase at https://github.com/SajayR/LoRA-in-All-Directions.
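The noisy-channel reranking step described in the abstract can be made concrete with a short sketch. The snippet below is a minimal illustration, not the released code: `fwd_logprob`, `rev_logprob`, and `lambda_rev` are assumed names, standing in for the directional LoRA-adapted forward and reverse models and an interpolation weight on the channel term.

```python
# Minimal sketch of noisy-channel reranking (illustrative only; see the
# released codebase for the actual implementation).
from typing import Callable, List

# Convention assumed here: a scorer(input_text, output_text) returns a
# (length-normalized) log p(output_text | input_text) under some MT model.
Scorer = Callable[[str, str], float]

def noisy_channel_rerank(
    source: str,
    candidates: List[str],
    fwd_logprob: Scorer,      # forward model: log p(candidate | source)
    rev_logprob: Scorer,      # reverse model: log p(source | candidate)
    lambda_rev: float = 1.0,  # assumed weight on the reverse (channel) term
) -> str:
    # Score each candidate y by log p_fwd(y|x) + lambda_rev * log p_rev(x|y):
    # the forward term rewards fluent target-side output, while the reverse
    # term rewards candidates that remain informative about the source.
    def score(cand: str) -> float:
        return fwd_logprob(source, cand) + lambda_rev * rev_logprob(cand, source)
    return max(candidates, key=score)

if __name__ == "__main__":
    # Toy stand-in scorers; a real system would query the forward and
    # reverse directional-LoRA models here.
    fwd = lambda src, tgt: -0.1 * len(tgt)
    rev = lambda tgt, src: -abs(len(src) - len(tgt))
    cands = ["short guess", "a candidate of comparable length to the source"]
    print(noisy_channel_rerank("an example source sentence", cands, fwd, rev))
```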