Exploiting Cross-Sentence Context for Neural Machine Translation

Longyue Wang, Zhaopeng Tu, Andy Way, Qun Liu


Abstract
In translation, considering the document as a whole can help to resolve ambiguities and inconsistencies. In this paper, we propose a cross-sentence context-aware approach and investigate the influence of historical contextual information on the performance of neural machine translation (NMT). First, this history is summarized in a hierarchical way. We then integrate the historical representation into NMT via two strategies: 1) as a warm start for encoder and decoder states, and 2) as an auxiliary context source for updating decoder states. Experimental results on a large Chinese-English translation task show that our approach significantly improves upon a strong attention-based NMT system by up to +2.1 BLEU points.
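The hierarchical summarization described in the abstract can be sketched as follows: a word-level RNN encodes each previous source sentence into a vector, and a sentence-level RNN then summarizes those vectors into a single history representation, which can warm-start the decoder. This is a minimal NumPy sketch with plain tanh RNN cells and toy dimensions; the paper's actual model, cell type, and parameterization may differ, and all names and sizes here are illustrative assumptions.

```python
import numpy as np

# Toy dimensions (illustrative, not the paper's settings).
rng = np.random.default_rng(0)
d_word, d_sent, d_doc = 8, 16, 16  # word embedding / sentence RNN / document RNN

# Randomly initialized parameters for the two RNN levels.
W_s = rng.normal(scale=0.1, size=(d_sent, d_word + d_sent))
W_d = rng.normal(scale=0.1, size=(d_doc, d_sent + d_doc))

def rnn_last_state(inputs, W, d_hidden):
    """Run a simple tanh RNN over `inputs` and return the final hidden state."""
    h = np.zeros(d_hidden)
    for x in inputs:
        h = np.tanh(W @ np.concatenate([x, h]))
    return h

def summarize_history(prev_sentences):
    """Hierarchical summary: word-level RNN per sentence,
    then a sentence-level RNN over the resulting sentence vectors."""
    sent_vecs = [rnn_last_state(s, W_s, d_sent) for s in prev_sentences]
    return rnn_last_state(sent_vecs, W_d, d_doc)

# Two previous sentences of 5 and 3 words; random vectors stand in
# for learned word embeddings.
history = [rng.normal(size=(5, d_word)), rng.normal(size=(3, d_word))]
D = summarize_history(history)

# Strategy 1 (warm start): project the history vector into an initial
# decoder state. Strategy 2 would instead feed D into every decoder step
# as an auxiliary context source.
W_init = rng.normal(scale=0.1, size=(d_doc, d_doc))
s0 = np.tanh(W_init @ D)
print(s0.shape)  # (16,)
```

The same summary vector D can serve both strategies; only where it enters the decoder differs.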
Anthology ID:
D17-1301
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2826–2831
URL:
https://aclanthology.org/D17-1301
DOI:
10.18653/v1/D17-1301
Cite (ACL):
Longyue Wang, Zhaopeng Tu, Andy Way, and Qun Liu. 2017. Exploiting Cross-Sentence Context for Neural Machine Translation. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 2826–2831, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Exploiting Cross-Sentence Context for Neural Machine Translation (Wang et al., EMNLP 2017)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/D17-1301.pdf
Video:
https://preview.aclanthology.org/emnlp-22-attachments/D17-1301.mp4
Code:
tuzhaopeng/LC-NMT