A Tree-based Decoder for Neural Machine Translation

Xinyi Wang, Hieu Pham, Pengcheng Yin, Graham Neubig


Abstract
Recent advances in Neural Machine Translation (NMT) show that adding syntactic information to NMT systems can improve the quality of their translations. Most existing work utilizes specific types of linguistically inspired tree structures, such as constituency or dependency parse trees, typically via a standard RNN decoder that operates on a linearized representation of the target tree. However, it remains an open question which linguistic formalism, if any, provides the best structural representation for NMT. In this paper, we (1) propose an NMT model that can naturally generate the topology of an arbitrary tree structure on the target side, and (2) experiment with various target tree structures. Our experiments show the surprising result that our model delivers the best improvements with balanced binary trees constructed without any linguistic knowledge; this model outperforms standard seq2seq models by up to 2.1 BLEU points, and other methods for incorporating target-side syntax by up to 0.7 BLEU.
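The "balanced binary trees constructed without any linguistic knowledge" mentioned in the abstract can be illustrated with a small sketch: recursively split the target token sequence at its midpoint until each leaf holds one token. This is a hypothetical illustration of that tree construction only, not the authors' actual TrDec decoder implementation.

```python
def balanced_tree(tokens):
    """Build a balanced binary tree over a token sequence.

    Leaves are tokens; internal nodes are (left, right) tuples.
    Sketch of the linguistically uninformed balanced-binary-tree
    target structure described in the abstract (assumed shape,
    not the paper's code).
    """
    if len(tokens) == 1:
        return tokens[0]
    mid = len(tokens) // 2  # split the span at its midpoint
    return (balanced_tree(tokens[:mid]), balanced_tree(tokens[mid:]))
```

For example, `balanced_tree(["a", "b", "c", "d"])` yields `(("a", "b"), ("c", "d"))`; a tree-based decoder would then generate this topology node by node instead of emitting the tokens left to right.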
Anthology ID:
D18-1509
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4772–4777
URL:
https://aclanthology.org/D18-1509
DOI:
10.18653/v1/D18-1509
Cite (ACL):
Xinyi Wang, Hieu Pham, Pengcheng Yin, and Graham Neubig. 2018. A Tree-based Decoder for Neural Machine Translation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4772–4777, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
A Tree-based Decoder for Neural Machine Translation (Wang et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/ingest-bitext-workshop/D18-1509.pdf
Attachment:
 D18-1509.Attachment.pdf
Video:
 https://preview.aclanthology.org/ingest-bitext-workshop/D18-1509.mp4
Code
 cindyxinyiwang/TrDec_pytorch