Our Neural Machine Translation Systems for WAT 2019

Wei Yang, Jun Ogata


Abstract
In this paper, we describe our Neural Machine Translation (NMT) systems for the WAT 2019 translation tasks we focus on. This year we participate in the scientific paper tasks and focus on the English–Japanese language pair. Throughout this work we use the Transformer model to explore the power of an architecture that relies entirely on self-attention. We train the Transformer with different NMT toolkits/libraries and, for word segmentation, apply different subword segmentation strategies depending on the toolkit/library used. We report not only the translation accuracy obtained with the absolute position encodings introduced in the original Transformer, but also the improvements in translation accuracy obtained by replacing absolute position encodings with relative position representations. We also ensemble several independently trained Transformer models to further improve translation accuracy.
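The abstract contrasts the Transformer's absolute (sinusoidal) position encodings with relative position representations (Shaw et al., 2018). Below is a minimal, hypothetical NumPy sketch of that contrast; it is not the authors' code, and the function names and the max_distance parameter are illustrative assumptions.

# Hypothetical sketch, not the authors' implementation.
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    # Absolute encodings from the original Transformer (assumes even d_model):
    # PE[pos, 2i] = sin(pos / 10000^(2i/d)), PE[pos, 2i+1] = cos(pos / 10000^(2i/d)).
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model // 2)[None, :]
    angles = pos / np.power(10000.0, 2 * i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

def relative_position_index(seq_len, max_distance):
    # Relative representations: each query-key pair (i, j) looks up a learned
    # embedding by its clipped relative distance j - i, shared across positions.
    q = np.arange(seq_len)[:, None]
    k = np.arange(seq_len)[None, :]
    rel = np.clip(k - q, -max_distance, max_distance)
    return rel + max_distance  # shift to non-negative indices for an embedding table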
Anthology ID:
D19-5220
Volume:
Proceedings of the 6th Workshop on Asian Translation
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Toshiaki Nakazawa, Chenchen Ding, Raj Dabre, Anoop Kunchukuttan, Nobushige Doi, Yusuke Oda, Ondřej Bojar, Shantipriya Parida, Isao Goto, Hideya Mino
Venue:
WAT
Publisher:
Association for Computational Linguistics
Pages:
159–164
URL:
https://aclanthology.org/D19-5220
DOI:
10.18653/v1/D19-5220
Cite (ACL):
Wei Yang and Jun Ogata. 2019. Our Neural Machine Translation Systems for WAT 2019. In Proceedings of the 6th Workshop on Asian Translation, pages 159–164, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Our Neural Machine Translation Systems for WAT 2019 (Yang & Ogata, WAT 2019)
PDF:
https://preview.aclanthology.org/naacl-24-ws-corrections/D19-5220.pdf