Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations

Meishan Zhang, Zhenghua Li, Guohong Fu, Min Zhang


Abstract
Syntax has been demonstrated to be highly effective in neural machine translation (NMT). Previous NMT models integrate syntax by representing the 1-best tree outputs of a well-trained parsing system, e.g., the representative Tree-RNN and Tree-Linearization methods, and may therefore suffer from error propagation. In this work, we propose a novel method to integrate source-side syntax implicitly into NMT. The basic idea is to use the intermediate hidden representations of a well-trained end-to-end dependency parser, which we refer to as syntax-aware word representations (SAWRs). We then simply concatenate such SAWRs with ordinary word embeddings to enhance basic NMT models. The method can be straightforwardly integrated into the widely used sequence-to-sequence (Seq2Seq) NMT models. We start with a representative RNN-based Seq2Seq baseline system and test the effectiveness of the proposed method on two benchmark datasets, for the Chinese-English and English-Vietnamese translation tasks, respectively. Experimental results show that the proposed approach brings significant BLEU score improvements over the baseline on both datasets, 1.74 points for Chinese-English translation and 0.80 points for English-Vietnamese translation. In addition, the approach also outperforms the explicit Tree-RNN and Tree-Linearization methods.
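To make the SAWR idea from the abstract concrete, the minimal sketch below shows one plausible way to concatenate frozen hidden states from a pre-trained dependency parser with source word embeddings before an RNN-based Seq2Seq encoder. This is not the authors' released code; the module name SAWREncoder, all dimensions, and the assumption that the parser's states are precomputed and kept fixed are illustrative assumptions only.

```python
import torch
import torch.nn as nn


class SAWREncoder(nn.Module):
    """Sketch of an NMT encoder that consumes [word embedding ; SAWR] inputs."""

    def __init__(self, vocab_size, emb_dim=256, sawr_dim=400, hidden_dim=512):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # BiLSTM standing in for the RNN-based Seq2Seq baseline encoder;
        # its input size grows by sawr_dim because of the concatenation.
        self.rnn = nn.LSTM(emb_dim + sawr_dim, hidden_dim // 2,
                           batch_first=True, bidirectional=True)

    def forward(self, src_ids, sawr_states):
        # src_ids:     (batch, src_len) source word indices
        # sawr_states: (batch, src_len, sawr_dim) intermediate hidden states
        #              taken from a well-trained end-to-end dependency parser
        emb = self.embedding(src_ids)
        enc_in = torch.cat([emb, sawr_states], dim=-1)  # simple concatenation
        outputs, final_state = self.rnn(enc_in)
        return outputs, final_state


# Usage sketch: sawr_states would come from running the parser's encoder on
# the same source sentence; whether those parameters are frozen or fine-tuned
# is an assumption here, not a claim about the paper's exact setup.
encoder = SAWREncoder(vocab_size=30000)
src = torch.randint(0, 30000, (2, 7))      # toy batch of 2 sentences, length 7
sawr = torch.randn(2, 7, 400)              # placeholder parser hidden states
ctx, state = encoder(src, sawr)
```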
Anthology ID:
N19-1118
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1151–1161
URL:
https://aclanthology.org/N19-1118
DOI:
10.18653/v1/N19-1118
Cite (ACL):
Meishan Zhang, Zhenghua Li, Guohong Fu, and Min Zhang. 2019. Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 1151–1161, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations (Zhang et al., NAACL 2019)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/N19-1118.pdf
Video:
https://preview.aclanthology.org/dois-2013-emnlp/N19-1118.mp4