Abstract
Neural machine translation (NMT) has achieved great success due to its ability to generate high-quality sentences. Compared with human translations, one drawback of current NMT is that translations are not always faithful to the input, e.g., omitting information or generating unrelated fragments, which inevitably decreases the overall quality, especially for human readers. In this paper, we propose a novel training strategy with a multi-task learning paradigm to build a faithfulness-enhanced NMT model (named FEnmt). During NMT training, we sample a subset of the training set and translate it to obtain fragments that have been mistranslated. Afterward, the proposed multi-task learning paradigm is employed on both the encoder and the decoder to guide the NMT model to translate these fragments correctly. Both automatic and human evaluations verify that our FEnmt improves translation quality by effectively reducing unfaithful translations.
- Anthology ID: 2020.emnlp-main.212
- Volume: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
- Month: November
- Year: 2020
- Address: Online
- Editors: Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 2675–2684
- URL: https://aclanthology.org/2020.emnlp-main.212
- DOI: 10.18653/v1/2020.emnlp-main.212
- Cite (ACL): Rongxiang Weng, Heng Yu, Xiangpeng Wei, and Weihua Luo. 2020. Towards Enhancing Faithfulness for Neural Machine Translation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 2675–2684, Online. Association for Computational Linguistics.
- Cite (Informal): Towards Enhancing Faithfulness for Neural Machine Translation (Weng et al., EMNLP 2020)
- PDF: https://aclanthology.org/2020.emnlp-main.212.pdf
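The abstract describes the training strategy only at a high level, so the sketch below illustrates the general shape of such a multi-task objective in PyTorch. It is not the authors' implementation: the helper names (`mistranslated_fragments`, `multitask_loss`), the set-difference fragment detector, and the weighted auxiliary cross-entropy are all illustrative assumptions; the paper's actual detection procedure and encoder/decoder objectives differ in detail.

```python
import torch
import torch.nn.functional as F

def mistranslated_fragments(reference, hypothesis):
    """Hypothetical stand-in for fragment detection: reference positions
    whose token never appears in the model's hypothesis are treated as
    omitted, i.e. mistranslated, fragments."""
    hyp_tokens = set(hypothesis)
    return [i for i, tok in enumerate(reference) if tok not in hyp_tokens]

def multitask_loss(logits, target, fragment_positions, aux_weight=1.0):
    """Main token-level cross-entropy plus an auxiliary term computed only
    over previously mistranslated positions, pushing the decoder to render
    those fragments correctly.

    logits: (T, V) decoder outputs; target: (T,) gold token ids."""
    ce = F.cross_entropy(logits, target, reduction="none")  # per-token loss, shape (T,)
    main = ce.mean()
    if not fragment_positions:
        return main
    aux = ce[torch.tensor(fragment_positions)].mean()
    return main + aux_weight * aux

# Dummy round trip: a 6-token target over a 10-word vocabulary, where the
# model's earlier hypothesis [1, 3, 5] missed some reference tokens.
logits = torch.randn(6, 10, requires_grad=True)  # stands in for decoder output
target = torch.randint(0, 10, (6,))
frags = mistranslated_fragments(target.tolist(), [1, 3, 5])
loss = multitask_loss(logits, target, frags, aux_weight=0.5)
loss.backward()
```

In a full training loop one would periodically re-sample a subset of the training data, decode it with the current model, recompute the fragment positions, and fold the auxiliary term into the loss, mirroring the sampling step described in the abstract.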