Abstract
This paper presents the machine translation system we developed for the WAT2016 evaluation tasks ja-en, ja-zh, en-ja, zh-ja, JPCja-en, JPCja-zh, JPCen-ja, and JPCzh-ja. We build our system on the encoder–decoder framework, integrating a recurrent neural network (RNN) with gated recurrent units (GRUs), and we adopt an attention mechanism to mitigate information loss. Additionally, we propose a simple translation-specific approach to the unknown word translation problem. Experimental results show that our system outperforms the baseline statistical machine translation (SMT) systems on each task, and that our proposed unknown word translation approach effectively improves translation quality.
- Anthology ID: W16-4608
- Volume: Proceedings of the 3rd Workshop on Asian Translation (WAT2016)
- Month: December
- Year: 2016
- Address: Osaka, Japan
- Venue: WAT
- Publisher: The COLING 2016 Organizing Committee
- Pages: 104–110
- URL: https://aclanthology.org/W16-4608
- Cite (ACL): Shaotong Li, JinAn Xu, Yufeng Chen, and Yujie Zhang. 2016. System Description of bjtu_nlp Neural Machine Translation System. In Proceedings of the 3rd Workshop on Asian Translation (WAT2016), pages 104–110, Osaka, Japan. The COLING 2016 Organizing Committee.
- Cite (Informal): System Description of bjtu_nlp Neural Machine Translation System (Li et al., WAT 2016)
- PDF: https://preview.aclanthology.org/ingestion-script-update/W16-4608.pdf
- Data: ASPEC
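The abstract's architecture (a GRU-based encoder–decoder with an attention mechanism) can be sketched minimally as below. This is not the authors' implementation: the hidden size, random toy inputs, and the use of simple dot-product attention (rather than whatever attention variant the paper actually uses) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # hypothetical toy hidden size; the paper's model sizes are not given here


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class GRUCell:
    """One GRU step: h_t = (1 - z) * h_{t-1} + z * tanh(W x + U (r * h_{t-1}))."""

    def __init__(self, d_in, d_h):
        std = 0.1
        self.Wz = rng.normal(0, std, (d_h, d_in)); self.Uz = rng.normal(0, std, (d_h, d_h))
        self.Wr = rng.normal(0, std, (d_h, d_in)); self.Ur = rng.normal(0, std, (d_h, d_h))
        self.Wh = rng.normal(0, std, (d_h, d_in)); self.Uh = rng.normal(0, std, (d_h, d_h))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)        # update gate
        r = sigmoid(self.Wr @ x + self.Ur @ h)        # reset gate
        h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))
        return (1 - z) * h + z * h_tilde


def attention(query, enc_states):
    """Dot-product attention over encoder states: softmax scores -> context vector."""
    scores = enc_states @ query                        # one score per source position
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ enc_states                        # weighted sum of encoder states


# Encode a toy source sequence of 5 random "embeddings".
enc = GRUCell(d, d)
h = np.zeros(d)
enc_states = []
for x in rng.normal(0, 1, (5, d)):
    h = enc.step(x, h)
    enc_states.append(h)
enc_states = np.stack(enc_states)

# One decoder step: attend over encoder states, feed [embedding; context] to the GRU.
dec = GRUCell(2 * d, d)
s = enc_states[-1]                                     # init from last encoder state
context = attention(s, enc_states)
s = dec.step(np.concatenate([rng.normal(0, 1, d), context]), s)
print(s.shape)  # (4,)
```

The attention step is what addresses the information-loss problem the abstract mentions: instead of compressing the whole source sentence into one fixed vector, the decoder re-weights all encoder states at every step.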