Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks

Biao Zhang, Deyi Xiong, Jinsong Su, Qian Lin, Huiji Zhang



Abstract
In this paper, we propose an addition-subtraction twin-gated recurrent network (ATR) to simplify neural machine translation. The recurrent units of ATR are heavily simplified to have the smallest number of weight matrices among the units of all existing gated RNNs. Using simple addition and subtraction operations, we introduce a twin-gated mechanism to build input and forget gates that are highly correlated. Despite this simplification, the essential non-linearities and the capability of modeling long-distance dependencies are preserved. Additionally, the proposed ATR is more transparent than LSTM/GRU due to this simplification: forward self-attention can be easily established in ATR, which makes the proposed network interpretable. Experiments on WMT14 translation tasks demonstrate that ATR-based neural machine translation yields competitive performance on English-German and English-French language pairs in terms of both translation quality and speed. Further experiments on NIST Chinese-English translation, natural language inference and Chinese word segmentation verify the generality and applicability of ATR to different natural language processing tasks.
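To illustrate the twin-gated mechanism sketched in the abstract, below is a minimal NumPy sketch of a recurrent step with only two weight matrices (an input projection W and a recurrent projection U), where the input and forget gates are derived from the same two projections by addition and subtraction. The exact gate equations and all names here are an illustrative reconstruction from the abstract's description, not the authors' implementation; see the linked bzhangGo/zero repository for the official code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TwinGatedCell:
    """Sketch of an addition-subtraction twin-gated recurrent step.

    Only two weight matrices are used; the two (highly correlated) gates
    are built from the same projections via addition and subtraction.
    """

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hidden_size)
        self.W = rng.uniform(-scale, scale, (hidden_size, input_size))   # input projection
        self.U = rng.uniform(-scale, scale, (hidden_size, hidden_size))  # recurrent projection

    def step(self, x_t, h_prev):
        p_t = self.W @ x_t            # projected input
        q_t = self.U @ h_prev         # projected previous state
        i_t = sigmoid(p_t + q_t)      # input gate  (twin gate via addition)
        f_t = sigmoid(p_t - q_t)      # forget gate (twin gate via subtraction)
        h_t = i_t * p_t + f_t * h_prev
        return h_t

# Toy usage on a random sequence of length 5.
cell = TwinGatedCell(input_size=8, hidden_size=16)
h = np.zeros(16)
for x in np.random.default_rng(1).normal(size=(5, 8)):
    h = cell.step(x, h)
print(h.shape)  # (16,)
```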
Anthology ID:
D18-1459
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4273–4283
URL:
https://aclanthology.org/D18-1459
DOI:
10.18653/v1/D18-1459
Cite (ACL):
Biao Zhang, Deyi Xiong, Jinsong Su, Qian Lin, and Huiji Zhang. 2018. Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4273–4283, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Simplifying Neural Machine Translation with Addition-Subtraction Twin-Gated Recurrent Networks (Zhang et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/teach-a-man-to-fish/D18-1459.pdf
Attachment:
D18-1459.Attachment.zip
Video:
https://preview.aclanthology.org/teach-a-man-to-fish/D18-1459.mp4
Code:
bzhangGo/zero + additional community code