Improving Language Transfer Capability of Decoder-only Architecture in Multilingual Neural Machine Translation
Zhi Qu, Yiran Wang, Chenchen Ding, Hideki Tanaka, Masao Utiyama, Taro Watanabe
Abstract
Existing multilingual neural machine translation (MNMT) approaches mainly focus on improving models with the encoder-decoder architecture to translate multiple languages. However, the decoder-only architecture has been explored less in MNMT due to its underperformance when trained solely on parallel data. In this work, we attribute this issue of the decoder-only architecture to its lack of language transfer capability. Specifically, the decoder-only architecture is insufficient in encoding source tokens with the target language features. We propose dividing the decoding process into two stages so that target tokens are explicitly excluded in the first stage, implicitly boosting the transfer capability across languages. Additionally, we impose contrastive learning on translation instructions, resulting in improved performance in zero-shot translation. We conduct experiments on the TED-19 and OPUS-100 datasets, considering both training-from-scratch and fine-tuning scenarios. Results show that, compared to the encoder-decoder architecture, our methods not only perform competitively in supervised translation but also achieve improvements of up to 3.39 BLEU, 6.99 chrF++, 3.22 BERTScore, and 4.81 COMET in zero-shot translation. We release our code at https://github.com/zhiqu22/PhasedDecoder.
- Anthology ID:
- 2025.mrl-main.13
- Volume:
- Proceedings of the 5th Workshop on Multilingual Representation Learning (MRL 2025)
- Month:
- November
- Year:
- 2025
- Address:
- Suzhou, China
- Editors:
- David Ifeoluwa Adelani, Catherine Arnett, Duygu Ataman, Tyler A. Chang, Hila Gonen, Rahul Raja, Fabian Schmidt, David Stap, Jiayi Wang
- Venues:
- MRL | WS
- SIG:
- Publisher:
- Association for Computational Linguistics
- Note:
- Pages:
- 178–195
- Language:
- URL:
- https://preview.aclanthology.org/name-variant-enfa-fane/2025.mrl-main.13/
- DOI:
- 10.18653/v1/2025.mrl-main.13
- Cite (ACL):
- Zhi Qu, Yiran Wang, Chenchen Ding, Hideki Tanaka, Masao Utiyama, and Taro Watanabe. 2025. Improving Language Transfer Capability of Decoder-only Architecture in Multilingual Neural Machine Translation. In Proceedings of the 5th Workshop on Multilingual Representation Learning (MRL 2025), pages 178–195, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- Improving Language Transfer Capability of Decoder-only Architecture in Multilingual Neural Machine Translation (Qu et al., MRL 2025)
- PDF:
- https://preview.aclanthology.org/name-variant-enfa-fane/2025.mrl-main.13.pdf