Meet Changes with Constancy: Learning Invariance in Multi-Source Translation
Jianfeng Liu, Ling Luo, Xiang Ao, Yan Song, Haoran Xu, Jian Ye
Abstract
Multi-source neural machine translation aims to translate from parallel sources of information (e.g., languages, images) into a single target language, and has shown better performance than most one-to-one systems. Despite the remarkable success of existing models, they usually neglect the fact that multiple source inputs may be inconsistent with one another. Such differences can introduce noise into the task and limit the performance of existing multi-source NMT approaches, which use input sources indiscriminately for target word prediction. In this paper, we attempt to leverage the potential complementary information among distinct sources while alleviating the occasional conflicts between them. To this end, we propose a source invariance network that learns the invariant information shared by parallel sources. The network can be easily integrated with multi-encoder based multi-source NMT methods (e.g., multi-encoder RNN and Transformer) to enhance translation results. Extensive experiments on two multi-source translation tasks demonstrate that the proposed approach not only achieves clear gains in translation quality but also captures implicit invariance between different sources.
- Anthology ID: 2020.coling-main.97
- Volume: Proceedings of the 28th International Conference on Computational Linguistics
- Month: December
- Year: 2020
- Address: Barcelona, Spain (Online)
- Editors: Donia Scott, Nuria Bel, Chengqing Zong
- Venue: COLING
- Publisher: International Committee on Computational Linguistics
- Pages: 1122–1132
- URL: https://aclanthology.org/2020.coling-main.97
- DOI: 10.18653/v1/2020.coling-main.97
- Cite (ACL): Jianfeng Liu, Ling Luo, Xiang Ao, Yan Song, Haoran Xu, and Jian Ye. 2020. Meet Changes with Constancy: Learning Invariance in Multi-Source Translation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 1122–1132, Barcelona, Spain (Online). International Committee on Computational Linguistics.
- Cite (Informal): Meet Changes with Constancy: Learning Invariance in Multi-Source Translation (Liu et al., COLING 2020)
- PDF: https://preview.aclanthology.org/nschneid-patch-4/2020.coling-main.97.pdf
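The core intuition in the abstract (extract what parallel sources agree on, and penalize their disagreement) can be illustrated with a minimal sketch. This is not the paper's actual source invariance network; it only assumes that each source encoder yields a fixed-size sentence vector, takes their mean as a stand-in "invariant" representation, and uses the mean squared deviation from it as a toy conflict penalty:

```python
import numpy as np

def invariance_sketch(src_reprs):
    """Toy illustration (not the paper's model): given per-source sentence
    representations (each of shape (d,)), return a shared representation
    and a penalty that is zero when sources agree and grows with conflict."""
    stacked = np.stack(src_reprs)        # (n_sources, d)
    invariant = stacked.mean(axis=0)     # shared "invariant" vector
    # mean squared deviation of each source from the shared vector
    penalty = float(((stacked - invariant) ** 2).mean())
    return invariant, penalty

# Two hypothetical encoder outputs: agreement gives zero penalty,
# conflicting sources give a positive penalty.
a = np.array([1.0, 2.0, 3.0])
_, pen_agree = invariance_sketch([a, a.copy()])
_, pen_conflict = invariance_sketch([a, -a])
```

In the actual model such a penalty would be one term of the training objective, encouraging the encoders of different sources to map parallel inputs to nearby representations.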