Context Gates for Neural Machine Translation

Zhaopeng Tu, Yang Liu, Zhengdong Lu, Xiaohua Liu, Hang Li


Abstract
In neural machine translation (NMT), generation of a target word depends on both source and target contexts. We find that source contexts have a direct impact on the adequacy of a translation while target contexts affect the fluency. Intuitively, generation of a content word should rely more on the source context and generation of a functional word should rely more on the target context. Due to the lack of effective control over the influence from source and target contexts, conventional NMT tends to yield fluent but inadequate translations. To address this problem, we propose context gates which dynamically control the ratios at which source and target contexts contribute to the generation of target words. In this way, we can enhance both the adequacy and fluency of NMT with more careful control of the information flow from contexts. Experiments show that our approach significantly improves upon a standard attention-based NMT system by +2.3 BLEU points.
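For intuition, a minimal PyTorch sketch of one context-gated decoder step is given below. This is not the authors' released implementation (tuzhaopeng/nmt, linked under Code): the gate z_t is computed from the previous target-word embedding, the previous decoder state, and the attention-derived source context c_t; z_t then scales the source side of the state update and (1 - z_t) the target side. The tanh update, module names, and dimensions are illustrative simplifications of the paper's GRU decoder.

import torch
import torch.nn as nn

class ContextGatedDecoderStep(nn.Module):
    """One simplified decoder step with a context gate.

    z_t = sigmoid(W_z e(y_{t-1}) + U_z s_{t-1} + C_z c_t) scales the source
    context; (1 - z_t) scales the target-side inputs, so each hidden
    dimension trades off adequacy (source) against fluency (target).
    """
    def __init__(self, emb_dim, hid_dim, ctx_dim):
        super().__init__()
        self.gate = nn.Linear(emb_dim + hid_dim + ctx_dim, hid_dim)
        self.src_proj = nn.Linear(ctx_dim, hid_dim)            # stands in for C c_t
        self.tgt_proj = nn.Linear(emb_dim + hid_dim, hid_dim)  # W e(y_{t-1}) + U s_{t-1}

    def forward(self, y_prev_emb, s_prev, c_t):
        # Gate computed from previous target embedding, previous state, and source context.
        z = torch.sigmoid(self.gate(torch.cat([y_prev_emb, s_prev, c_t], dim=-1)))
        # Simplified state update (tanh stands in for the paper's GRU transition).
        s_t = torch.tanh(z * self.src_proj(c_t)
                         + (1 - z) * self.tgt_proj(torch.cat([y_prev_emb, s_prev], dim=-1)))
        return s_t

# Example with batch size 32 and illustrative dimensions:
step = ContextGatedDecoderStep(emb_dim=620, hid_dim=1000, ctx_dim=2000)
s_t = step(torch.randn(32, 620), torch.randn(32, 1000), torch.randn(32, 2000))
print(s_t.shape)  # torch.Size([32, 1000])

Because z_t is a vector rather than a scalar, the trade-off between source and target contexts is learned per hidden dimension at every decoding step.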
Anthology ID:
Q17-1007
Volume:
Transactions of the Association for Computational Linguistics, Volume 5
Year:
2017
Address:
Cambridge, MA
Editors:
Lillian Lee, Mark Johnson, Kristina Toutanova
Venue:
TACL
Publisher:
MIT Press
Pages:
87–99
URL:
https://aclanthology.org/Q17-1007
DOI:
10.1162/tacl_a_00048
Cite (ACL):
Zhaopeng Tu, Yang Liu, Zhengdong Lu, Xiaohua Liu, and Hang Li. 2017. Context Gates for Neural Machine Translation. Transactions of the Association for Computational Linguistics, 5:87–99.
Cite (Informal):
Context Gates for Neural Machine Translation (Tu et al., TACL 2017)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/Q17-1007.pdf
Video:
https://preview.aclanthology.org/nschneid-patch-4/Q17-1007.mp4
Code:
tuzhaopeng/nmt (+ additional community code)