Exploiting Sentential Context for Neural Machine Translation

Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi


Abstract
In this work, we present novel approaches to exploiting sentential context for neural machine translation (NMT). Specifically, we show that a shallow sentential context, extracted from the top encoder layer only, can improve translation performance by contextualizing the encoding representations of individual words. Next, we introduce a deep sentential context, which aggregates the sentential context representations from all internal layers of the encoder to form a more comprehensive context representation. Experimental results on the WMT14 English-German and English-French benchmarks show that our model consistently improves performance over the strong Transformer baseline, demonstrating the necessity and effectiveness of exploiting sentential context for NMT.
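To make the two mechanisms concrete, below is a minimal PyTorch sketch of how a sentence-level context vector might be extracted and fused back into per-word representations. The class names (ShallowSententialContext, DeepSententialContext), the mean-pooling summarizer, the sigmoid-gated fusion, and the learned per-layer softmax weights are all illustrative assumptions, not the paper's exact aggregation and fusion functions.

```python
import torch
import torch.nn as nn


class ShallowSententialContext(nn.Module):
    """Summarize the TOP encoder layer into a single sentence vector
    (mean pooling is an assumption) and fuse it back into each word
    representation with a learned sigmoid gate."""

    def __init__(self, d_model: int):
        super().__init__()
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, top_layer: torch.Tensor) -> torch.Tensor:
        # top_layer: [batch, seq_len, d_model]
        context = top_layer.mean(dim=1, keepdim=True)   # [batch, 1, d_model]
        context = context.expand_as(top_layer)          # broadcast to each word
        g = torch.sigmoid(self.gate(torch.cat([top_layer, context], dim=-1)))
        return g * top_layer + (1.0 - g) * context      # gated fusion


class DeepSententialContext(nn.Module):
    """Aggregate sentence vectors from ALL encoder layers into one
    comprehensive context (a learned softmax-weighted sum here; the
    paper may use a different aggregator), then fuse as above."""

    def __init__(self, d_model: int, num_layers: int):
        super().__init__()
        self.layer_weights = nn.Parameter(torch.zeros(num_layers))
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, all_layers: list) -> torch.Tensor:
        # all_layers: list of [batch, seq_len, d_model], one per encoder layer
        sent_vecs = torch.stack([h.mean(dim=1) for h in all_layers], dim=1)
        w = torch.softmax(self.layer_weights, dim=0)             # [num_layers]
        context = (w.view(1, -1, 1) * sent_vecs).sum(dim=1, keepdim=True)
        top = all_layers[-1]
        context = context.expand_as(top)
        g = torch.sigmoid(self.gate(torch.cat([top, context], dim=-1)))
        return g * top + (1.0 - g) * context
```

In a standard Transformer encoder, the shallow variant would consume only the final layer's output, while the deep variant takes the full list of per-layer hidden states. Padding masks are omitted for brevity, so the mean pooling here averages over all positions, including padded ones.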
Anthology ID: P19-1624
Volume: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2019
Address: Florence, Italy
Editors: Anna Korhonen, David Traum, Lluís Màrquez
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 6197–6203
URL: https://aclanthology.org/P19-1624
DOI: 10.18653/v1/P19-1624
Cite (ACL):
Xing Wang, Zhaopeng Tu, Longyue Wang, and Shuming Shi. 2019. Exploiting Sentential Context for Neural Machine Translation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 6197–6203, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Exploiting Sentential Context for Neural Machine Translation (Wang et al., ACL 2019)
PDF: https://aclanthology.org/P19-1624.pdf