Exploiting Deep Representations for Neural Machine Translation

Zi-Yi Dou, Zhaopeng Tu, Xing Wang, Shuming Shi, Tong Zhang


Abstract
Advanced neural machine translation (NMT) models generally implement encoder and decoder as multiple layers, which allows systems to model complex functions and capture complicated linguistic structures. However, only the top layers of encoder and decoder are leveraged in the subsequent process, which misses the opportunity to exploit the useful information embedded in other layers. In this work, we propose to simultaneously expose all of these signals with layer aggregation and multi-layer attention mechanisms. In addition, we introduce an auxiliary regularization term to encourage different layers to capture diverse information. Experimental results on widely-used WMT14 English-German and WMT17 Chinese-English translation data demonstrate the effectiveness and universality of the proposed approach.
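The abstract describes two ideas: aggregating the representations produced by every encoder/decoder layer instead of only the top one, and a regularization term that pushes layers toward diverse representations. The sketch below is a minimal, hypothetical illustration of those two ideas in PyTorch, not the paper's exact formulation (the paper explores several aggregation and multi-layer attention variants); the module and function names, the softmax-weighted aggregation, and the cosine-similarity penalty are assumptions made for clarity.

```python
import torch
import torch.nn as nn


class LayerAggregation(nn.Module):
    """One simple form of layer aggregation: combine the outputs of all
    layers with learned softmax weights (hypothetical sketch)."""

    def __init__(self, num_layers: int):
        super().__init__()
        self.weights = nn.Parameter(torch.zeros(num_layers))

    def forward(self, layer_outputs):
        # layer_outputs: list of [batch, seq_len, d_model] tensors, one per layer
        stacked = torch.stack(layer_outputs, dim=0)       # [L, B, T, D]
        probs = torch.softmax(self.weights, dim=0)        # [L]
        return (probs.view(-1, 1, 1, 1) * stacked).sum(dim=0)


def diversity_penalty(layer_outputs):
    """Auxiliary regularizer (assumed form): discourage adjacent layers from
    producing near-identical representations by penalizing their mean
    cosine similarity."""
    penalty = 0.0
    for lower, upper in zip(layer_outputs[:-1], layer_outputs[1:]):
        penalty = penalty + torch.cosine_similarity(lower, upper, dim=-1).mean()
    return penalty / (len(layer_outputs) - 1)
```

In use, the aggregated representation would replace the top-layer output fed to the subsequent attention or output layers, and the penalty would be added to the translation loss with a small weight.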
Anthology ID:
D18-1457
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4253–4262
URL:
https://aclanthology.org/D18-1457
DOI:
10.18653/v1/D18-1457
Cite (ACL):
Zi-Yi Dou, Zhaopeng Tu, Xing Wang, Shuming Shi, and Tong Zhang. 2018. Exploiting Deep Representations for Neural Machine Translation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4253–4262, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Exploiting Deep Representations for Neural Machine Translation (Dou et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/add_acl24_videos/D18-1457.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/D18-1457.mp4