Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing

Jetic Gū, Hassan S. Shavarani, Anoop Sarkar


Abstract
The addition of syntax-aware decoding in Neural Machine Translation (NMT) systems requires an effective tree-structured neural network, a syntax-aware attention model, and a language generation model that is sensitive to sentence structure. Recent approaches resort to sequential decoding, either adding neural network units to capture bottom-up structural information or serialising structured data into sequences. We exploit a top-down tree-structured model called DRNN (Doubly-Recurrent Neural Networks), first proposed by Alvarez-Melis and Jaakkola (2017), to create an NMT model called Seq2DRNN that combines a sequential encoder with tree-structured decoding augmented with a syntax-aware attention model. Unlike previous approaches to syntax-based NMT, which use dependency parsing models, our method uses constituency parsing, which we argue provides useful information for translation. In addition, we use the syntactic structure of the sentence to add new connections to the tree-structured decoder neural network (Seq2DRNN+SynC). We compare our NMT model with sequential and state-of-the-art syntax-based NMT models and show that our model produces more fluent translations with better reordering. Since our model is capable of doing translation and constituency parsing at the same time, we also compare our parsing accuracy against other neural parsing models.
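To make the top-down, doubly-recurrent decoding idea concrete, below is a minimal PyTorch sketch of a single DRNN decoder step in the spirit of Alvarez-Melis and Jaakkola (2017): each tree node receives an "ancestral" state from its parent and a "fraternal" state from its previous sibling, and the merged node state drives label prediction plus stop decisions for children and siblings. All names, dimensions, and the GRU choice are illustrative assumptions, not the paper's exact implementation, which additionally uses a syntax-aware attention context and the SynC connections described in the abstract.

```python
# Minimal sketch of one doubly-recurrent (DRNN) decoder step; illustrative only.
import torch
import torch.nn as nn


class DRNNCellSketch(nn.Module):
    def __init__(self, emb_size: int, hidden_size: int, vocab_size: int):
        super().__init__()
        self.ancestral = nn.GRUCell(emb_size, hidden_size)   # parent -> child recurrence
        self.fraternal = nn.GRUCell(emb_size, hidden_size)   # sibling -> sibling recurrence
        self.combine = nn.Linear(2 * hidden_size, hidden_size)
        self.out = nn.Linear(hidden_size, vocab_size)        # node label / word prediction
        self.stop_child = nn.Linear(hidden_size, 1)          # "expand children?" gate
        self.stop_sibling = nn.Linear(hidden_size, 1)        # "emit another sibling?" gate

    def forward(self, parent_emb, sibling_emb, h_ancestral, h_fraternal):
        # Update the two recurrences independently, then merge them into the
        # node state used for all predictions at this tree position.
        h_a = self.ancestral(parent_emb, h_ancestral)
        h_f = self.fraternal(sibling_emb, h_fraternal)
        h_node = torch.tanh(self.combine(torch.cat([h_a, h_f], dim=-1)))
        token_logits = self.out(h_node)
        p_child = torch.sigmoid(self.stop_child(h_node))
        p_sibling = torch.sigmoid(self.stop_sibling(h_node))
        return h_a, h_f, h_node, token_logits, p_child, p_sibling


if __name__ == "__main__":
    cell = DRNNCellSketch(emb_size=32, hidden_size=64, vocab_size=100)
    batch = 2
    parent_emb = torch.zeros(batch, 32)   # e.g. embedding of the parent's label
    sibling_emb = torch.zeros(batch, 32)  # e.g. embedding of the previous sibling
    h_a0 = torch.zeros(batch, 64)         # root node: ancestral state from the encoder
    h_f0 = torch.zeros(batch, 64)         # first child: no previous sibling yet
    _, _, h_node, logits, p_child, p_sib = cell(parent_emb, sibling_emb, h_a0, h_f0)
    print(h_node.shape, logits.shape, p_child.shape, p_sib.shape)
```

In a full decoder this cell would be applied recursively over the constituency tree, with the gates deciding when to stop expanding a node downward or sideways; the encoder's final state would seed the root's ancestral input.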
Anthology ID:
D18-1037
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
401–413
URL:
https://aclanthology.org/D18-1037
DOI:
10.18653/v1/D18-1037
Cite (ACL):
Jetic Gū, Hassan S. Shavarani, and Anoop Sarkar. 2018. Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 401–413, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing (Gū et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/emnlp22-frontmatter/D18-1037.pdf