Abstract
Knowledge transfer across languages is crucial for multilingual neural machine translation. In this paper, we propose language branch (LB) gated multilingual neural machine translation, which encourages knowledge transfer within the same language branch via an LB-gated module integrated into both the encoder and decoder. The LB-gated module distinguishes LB-specific parameters from global parameters shared by all languages and routes languages from the same LB to the corresponding LB-specific network. Comprehensive experiments on the OPUS-100 dataset show that the proposed approach substantially improves translation quality for both medium- and low-resource languages over previous methods. Further analysis demonstrates its ability to learn similarities between language branches.
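The sketch below illustrates one way such an LB-gated sublayer could be realized: a feed-forward network shared by all languages, one feed-forward network per language branch, and a learned gate that interpolates between the two paths for the branch the current batch is routed to. This is a minimal illustration under assumed names (`LBGatedFFN`, `branch_ffns`, `gate`) and is not the authors' released implementation.

```python
import torch
import torch.nn as nn


class LBGatedFFN(nn.Module):
    """Mixes a globally shared FFN with one FFN per language branch.

    Each batch is routed to the FFN of its language branch; a learned gate
    decides, per position, how much to rely on the branch-specific path
    versus the shared path.
    """

    def __init__(self, d_model: int, d_ff: int, num_branches: int):
        super().__init__()
        # Parameters shared by all languages.
        self.shared_ffn = nn.Sequential(
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
        )
        # One LB-specific FFN per language branch (e.g. Germanic, Slavic, ...).
        self.branch_ffns = nn.ModuleList(
            [
                nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
                for _ in range(num_branches)
            ]
        )
        # Gate producing a scalar in [0, 1] per position.
        self.gate = nn.Linear(d_model, 1)

    def forward(self, x: torch.Tensor, branch_id: int) -> torch.Tensor:
        # x: (batch, seq_len, d_model); branch_id selects the LB-specific FFN.
        shared_out = self.shared_ffn(x)
        branch_out = self.branch_ffns[branch_id](x)
        g = torch.sigmoid(self.gate(x))  # (batch, seq_len, 1)
        return g * branch_out + (1.0 - g) * shared_out


# Usage: route a batch through branch 0, assuming branch ids were assigned
# when grouping the OPUS-100 languages by language family.
layer = LBGatedFFN(d_model=512, d_ff=2048, num_branches=4)
hidden = torch.randn(8, 20, 512)
out = layer(hidden, branch_id=0)
```

In this reading, the shared FFN carries knowledge common to all languages, while the gate lets each language branch decide how strongly to draw on its branch-specific parameters; how the paper instantiates the gate and the routing is described in the full text.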
- Anthology ID:
- 2022.coling-1.447
- Volume:
- Proceedings of the 29th International Conference on Computational Linguistics
- Month:
- October
- Year:
- 2022
- Address:
- Gyeongju, Republic of Korea
- Editors:
- Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
- Venue:
- COLING
- Publisher:
- International Committee on Computational Linguistics
- Pages:
- 5046–5053
- URL:
- https://aclanthology.org/2022.coling-1.447
- Cite (ACL):
- Haoran Sun and Deyi Xiong. 2022. Language Branch Gated Multilingual Neural Machine Translation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5046–5053, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
- Cite (Informal):
- Language Branch Gated Multilingual Neural Machine Translation (Sun & Xiong, COLING 2022)
- PDF:
- https://preview.aclanthology.org/ingest-acl-2023-videos/2022.coling-1.447.pdf
- Data
- OPUS-100