Counter-Interference Adapter for Multilingual Machine Translation

Yaoming Zhu, Jiangtao Feng, Chengqi Zhao, Mingxuan Wang, Lei Li


Abstract
Developing a unified multilingual model has been a long-pursued goal for machine translation. However, existing approaches suffer from performance degradation: a single multilingual model is inferior to separately trained bilingual models on rich-resource languages. We conjecture that this phenomenon is caused by interference from joint training with multiple languages. To address the issue, we propose CIAT, an adapted Transformer model with a small parameter overhead for multilingual machine translation. We evaluate CIAT on multiple benchmark datasets, including IWSLT, OPUS-100, and WMT. Experiments show that CIAT consistently outperforms strong multilingual baselines on 64 of the 66 language directions, 42 of which see an improvement of more than 0.5 BLEU.
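The abstract does not spell out the adapter design; as a rough illustration of the general adapter idea that CIAT builds on, below is a minimal, hypothetical PyTorch sketch of a per-language bottleneck adapter inserted into a Transformer. The class names, dimensions, and placement are illustrative assumptions, not the paper's exact architecture; see the linked repository (yaoming95/ciat) for the authors' actual implementation.

```python
import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Generic bottleneck adapter: layer norm, down-projection, ReLU,
    up-projection, and a residual connection. Hypothetical sketch,
    not the exact CIAT design."""

    def __init__(self, d_model: int, bottleneck_dim: int = 64):
        super().__init__()
        self.layer_norm = nn.LayerNorm(d_model)
        self.down = nn.Linear(d_model, bottleneck_dim)
        self.up = nn.Linear(bottleneck_dim, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The residual path preserves the shared multilingual representation;
        # the adapter learns only a small language-specific correction,
        # which is one way to counter cross-lingual interference.
        return x + self.up(torch.relu(self.down(self.layer_norm(x))))


class PerLanguageAdapters(nn.Module):
    """One adapter per language, selected by language code at run time."""

    def __init__(self, d_model: int, languages: list[str], bottleneck_dim: int = 64):
        super().__init__()
        self.adapters = nn.ModuleDict(
            {lang: BottleneckAdapter(d_model, bottleneck_dim) for lang in languages}
        )

    def forward(self, x: torch.Tensor, lang: str) -> torch.Tensor:
        return self.adapters[lang](x)


# Usage: apply after a shared Transformer layer's output.
adapters = PerLanguageAdapters(d_model=512, languages=["de", "fr", "zh"])
hidden = torch.randn(8, 20, 512)   # (batch, sequence length, d_model)
out = adapters(hidden, lang="de")  # language-specific forward pass
```

The parameter overhead is small because each adapter adds only two thin projection matrices (d_model × bottleneck_dim) per language, rather than a full per-language model.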
Anthology ID:
2021.findings-emnlp.240
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2812–2823
URL:
https://aclanthology.org/2021.findings-emnlp.240
DOI:
10.18653/v1/2021.findings-emnlp.240
Cite (ACL):
Yaoming Zhu, Jiangtao Feng, Chengqi Zhao, Mingxuan Wang, and Lei Li. 2021. Counter-Interference Adapter for Multilingual Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 2812–2823, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Counter-Interference Adapter for Multilingual Machine Translation (Zhu et al., Findings 2021)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2021.findings-emnlp.240.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/2021.findings-emnlp.240.mp4
Code:
yaoming95/ciat
Data:
OPUS-100