Tencent Neural Machine Translation Systems for WMT18

Mingxuan Wang, Li Gong, Wenhuan Zhu, Jun Xie, Chao Bian


Abstract
We participated in the WMT 2018 shared news translation task on the English↔Chinese language pair. Our systems are based on attentional sequence-to-sequence models with some form of recursion and self-attention. Some data augmentation methods are also introduced to improve the translation performance. The best translation result is obtained with ensemble and reranking techniques. Our Chinese→English system achieved the highest cased BLEU score among all 16 submitted systems, and our English→Chinese system ranked third out of 18 submitted systems.
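The abstract mentions that the best result comes from combining an ensemble with reranking. As a rough illustration only, the sketch below shows generic n-best reranking by a weighted combination of feature scores; the feature names, weights, and function are hypothetical and are not taken from the paper.

# Hypothetical sketch of n-best reranking: candidates (e.g. from an ensemble)
# are rescored with a weighted sum of feature scores. Feature names and
# weights are illustrative assumptions, not the authors' configuration.
from typing import Dict, List


def rerank(nbest: List[Dict[str, float]], weights: Dict[str, float]) -> List[Dict[str, float]]:
    """Sort candidate translations by a weighted sum of their feature scores."""
    def combined_score(candidate: Dict[str, float]) -> float:
        return sum(weights[name] * candidate.get(name, 0.0) for name in weights)

    return sorted(nbest, key=combined_score, reverse=True)


if __name__ == "__main__":
    # Each candidate carries per-feature scores (log-probabilities, etc.).
    nbest = [
        {"forward_nmt": -2.1, "right_to_left": -2.4, "language_model": -3.0},
        {"forward_nmt": -2.3, "right_to_left": -2.0, "language_model": -2.5},
    ]
    weights = {"forward_nmt": 1.0, "right_to_left": 0.5, "language_model": 0.3}
    print(rerank(nbest, weights)[0])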
Anthology ID:
W18-6429
Volume:
Proceedings of the Third Conference on Machine Translation: Shared Task Papers
Month:
October
Year:
2018
Address:
Brussels, Belgium
Editors:
Ondřej Bojar, Rajen Chatterjee, Christian Federmann, Mark Fishel, Yvette Graham, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Christof Monz, Matteo Negri, Aurélie Névéol, Mariana Neves, Matt Post, Lucia Specia, Marco Turchi, Karin Verspoor
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
522–527
URL:
https://aclanthology.org/W18-6429
DOI:
10.18653/v1/W18-6429
Cite (ACL):
Mingxuan Wang, Li Gong, Wenhuan Zhu, Jun Xie, and Chao Bian. 2018. Tencent Neural Machine Translation Systems for WMT18. In Proceedings of the Third Conference on Machine Translation: Shared Task Papers, pages 522–527, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Tencent Neural Machine Translation Systems for WMT18 (Wang et al., WMT 2018)
PDF:
https://preview.aclanthology.org/ml4al-ingestion/W18-6429.pdf