Incorporating Word Reordering Knowledge into Attention-based Neural Machine Translation

Jinchao Zhang, Mingxuan Wang, Qun Liu, Jie Zhou


Abstract
This paper proposes three distortion models that explicitly incorporate word reordering knowledge into attention-based Neural Machine Translation (NMT) to further improve translation performance. The proposed models enable the attention mechanism to attend to source words based on both semantic content and a word reordering penalty. Experiments on Chinese-English translation show that the approaches improve word alignment quality and yield significant gains over a baseline attention-based NMT system. Compared with previous work on the same corpora, our system achieves state-of-the-art translation quality.
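To make the idea concrete, below is a minimal sketch of attention weights combined with a reordering penalty. It assumes the distortion term is a simple function of the jump distance from the previously attended source position; the function name `reordering_aware_attention`, the distance-based penalty, and the weight `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def reordering_aware_attention(energies, prev_pos, lam=1.0):
    """Combine content-based attention energies with a distance-based
    distortion penalty (an illustrative stand-in for the paper's models).

    energies : (src_len,) content-based scores from the decoder state
    prev_pos : source position attended at the previous decoding step
    lam      : interpolation weight for the reordering penalty
    """
    positions = np.arange(len(energies))
    # Penalize large jumps from the previously attended position,
    # in the spirit of distortion models from phrase-based SMT.
    penalty = -np.abs(positions - prev_pos)
    return softmax(energies + lam * penalty)

# Example: content scores alone favor position 4, but the penalty
# pulls attention toward positions near the previous one (position 1).
weights = reordering_aware_attention(np.array([0.1, 0.2, 0.0, 0.3, 1.5]),
                                     prev_pos=1)
print(weights)
```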
Anthology ID:
P17-1140
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1524–1534
URL:
https://aclanthology.org/P17-1140
DOI:
10.18653/v1/P17-1140
Cite (ACL):
Jinchao Zhang, Mingxuan Wang, Qun Liu, and Jie Zhou. 2017. Incorporating Word Reordering Knowledge into Attention-based Neural Machine Translation. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1524–1534, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Incorporating Word Reordering Knowledge into Attention-based Neural Machine Translation (Zhang et al., ACL 2017)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/P17-1140.pdf