Completely Modular Fine-tuning for Dynamic Language Adaptation

Zhe Cao, Yusuke Oda, Qianying Liu, Akiko Aizawa, Taro Watanabe


Abstract
Multilingual fine-tuning of Large Language Models (LLMs) has driven great advances in machine translation. However, existing research focuses on the traditional fine-tuning setting with a fixed set of languages and lacks dynamic adaptability to new ones: introducing a new language requires retraining and often causes catastrophic forgetting. In this study, we propose a completely modular fine-tuning pipeline that enables dynamic language adaptation for LLMs. Instead of fine-tuning directly on all languages, our approach first trains English-centric input and output LoRA adapters for each language separately, and then merges the corresponding adapters to translate in arbitrary directions without any additional training. Experiments on 12 translation directions across four low-resource, less-supported languages show that modular fine-tuning achieves up to 86% of the performance of traditional multi-parallel full-parameter fine-tuning, while training only 0.1% of the parameters, relying solely on English-centric data, and incurring no catastrophic forgetting. Furthermore, we present a comprehensive analysis of the merging ratio, of when to merge, and of the rationale for using English as a bridge language, using Bayesian optimization and the logit lens.
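
To make the merging step concrete, the following is a minimal sketch, assuming the per-language adapters were trained with the Hugging Face PEFT library. The base model name, adapter paths, adapter names, and the 0.5/0.5 merging ratio are illustrative assumptions, not the authors' released code or hyperparameters.

# A minimal sketch of the training-free merging step described in the abstract,
# assuming the English-centric adapters were trained with Hugging Face PEFT.
# All names, paths, and the 0.5/0.5 ratio below are hypothetical.
from transformers import AutoModelForCausalLM
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# Load two independently trained English-centric adapters:
# an input adapter that reads language X ("x2en") and an output
# adapter that writes language Y ("en2y").
model = PeftModel.from_pretrained(base, "adapters/x2en", adapter_name="x2en")
model.load_adapter("adapters/en2y", adapter_name="en2y")

# Merge the two adapters into a new X-to-Y adapter without any further
# training. The merging ratio is a tunable hyperparameter (the paper
# selects it with Bayesian optimization).
model.add_weighted_adapter(
    adapters=["x2en", "en2y"],
    weights=[0.5, 0.5],
    adapter_name="x2y",
    combination_type="linear",
)
model.set_adapter("x2y")  # the model now translates X -> Y directly

Because each language contributes one input and one output adapter trained only against English, adding a new language never touches the existing adapters, which is what removes the need for retraining and avoids catastrophic forgetting.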
Anthology ID:
2026.findings-eacl.252
Volume:
Findings of the Association for Computational Linguistics: EACL 2026
Month:
March
Year:
2026
Address:
Rabat, Morocco
Editors:
Vera Demberg, Kentaro Inui, Lluís Màrquez
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4828–4845
URL:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.252/
Cite (ACL):
Zhe Cao, Yusuke Oda, Qianying Liu, Akiko Aizawa, and Taro Watanabe. 2026. Completely Modular Fine-tuning for Dynamic Language Adaptation. In Findings of the Association for Computational Linguistics: EACL 2026, pages 4828–4845, Rabat, Morocco. Association for Computational Linguistics.
Cite (Informal):
Completely Modular Fine-tuning for Dynamic Language Adaptation (Cao et al., Findings 2026)
PDF:
https://preview.aclanthology.org/ingest-eacl/2026.findings-eacl.252.pdf
Checklist:
2026.findings-eacl.252.checklist.pdf