Improving Neural Machine Translation with Soft Template Prediction

Jian Yang, Shuming Ma, Dongdong Zhang, Zhoujun Li, Ming Zhou


Abstract
Although neural machine translation (NMT) has achieved significant progress in recent years, most previous NMT models depend only on the source text to generate translations. Inspired by the success of template-based and syntax-based approaches in other fields, we propose to use templates extracted from tree structures as soft target templates to guide the translation procedure. In order to learn the syntactic structure of the target sentences, we adopt constituency-based parse trees to generate candidate templates. We incorporate the template information into the encoder-decoder framework to jointly utilize the templates and the source text. Experiments show that our model significantly outperforms the baseline models on four benchmarks and demonstrate the effectiveness of soft target templates.
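For illustration, below is a minimal sketch of how a soft target template might be read off a constituency parse: words in shallow subtrees are kept, and deeper subtrees are abstracted to their constituent labels. The NLTK Tree utilities, the depth threshold, the example parse, and the function name extract_template are assumptions made for this sketch, not the paper's actual extraction procedure.

# A minimal sketch (not the paper's exact extraction rule): derive a soft target
# template from a constituency parse by keeping words in shallow subtrees and
# replacing deeper subtrees with their constituent labels.
from nltk import Tree


def extract_template(parse_str, max_depth=3):
    """Flatten a constituency parse into a mixed sequence of words and labels."""
    tree = Tree.fromstring(parse_str)

    def walk(node, depth):
        if isinstance(node, str):       # leaf: an actual target-side word
            return [node]
        if depth >= max_depth:          # too deep: abstract away as a label
            return [node.label()]
        tokens = []
        for child in node:
            tokens.extend(walk(child, depth + 1))
        return tokens

    return walk(tree, 0)


if __name__ == "__main__":
    parse = ("(S (NP (DT the) (NN cat)) "
             "(VP (VBD sat) (PP (IN on) (NP (DT the) (NN mat)))))")
    print(extract_template(parse))
    # -> ['the', 'cat', 'sat', 'IN', 'NP']

In such a scheme, the mixed word/label sequence serves as a coarse sketch of the target sentence that a decoder can condition on alongside the source text.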
Anthology ID:
2020.acl-main.531
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5979–5989
URL:
https://aclanthology.org/2020.acl-main.531
DOI:
10.18653/v1/2020.acl-main.531
Cite (ACL):
Jian Yang, Shuming Ma, Dongdong Zhang, Zhoujun Li, and Ming Zhou. 2020. Improving Neural Machine Translation with Soft Template Prediction. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 5979–5989, Online. Association for Computational Linguistics.
Cite (Informal):
Improving Neural Machine Translation with Soft Template Prediction (Yang et al., ACL 2020)
PDF:
https://preview.aclanthology.org/update-css-js/2020.acl-main.531.pdf
Video:
http://slideslive.com/38929072