Abstract
This paper describes the NICT neural machine translation system submitted to the NMT-2018 shared task. A characteristic of our approach is the introduction of self-training. Since our self-training does not change the model structure, it does not affect the efficiency of translation, such as translation speed. The experimental results showed that translation quality improved not only in the sequence-to-sequence (seq-to-seq) models but also in the Transformer models.
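The abstract does not spell out the self-training procedure itself, only that the model architecture is left unchanged. The following is a minimal, generic sketch of such a self-training loop, not the paper's actual recipe: the model translates extra source sentences, and the resulting synthetic pairs are mixed into retraining. All names (`train`, `translate`, `self_train`) and the data flow are hypothetical illustrations.

```python
def train(model, parallel_data):
    """Placeholder: run one training pass over (source, target) pairs."""
    return model  # stand-in for an actual optimizer loop

def translate(model, source_sentences):
    """Placeholder: decode target hypotheses with the current model."""
    return [f"<hypothesis for: {s}>" for s in source_sentences]

def self_train(model, parallel_data, extra_sources, rounds=1):
    # 1) Train an initial model on the human-translated parallel data.
    model = train(model, parallel_data)
    for _ in range(rounds):
        # 2) Let the model translate additional source sentences,
        #    producing synthetic (source, hypothesis) pairs.
        synthetic = list(zip(extra_sources, translate(model, extra_sources)))
        # 3) Retrain on original + synthetic data. The architecture is
        #    untouched, so decoding speed at test time is unaffected.
        model = train(model, parallel_data + synthetic)
    return model

# Usage sketch with toy data:
model = self_train(None, [("src sent", "tgt sent")], ["unlabeled src sent"])
```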
- Anthology ID: W18-2713
- Volume: Proceedings of the 2nd Workshop on Neural Machine Translation and Generation
- Month: July
- Year: 2018
- Address: Melbourne, Australia
- Editors: Alexandra Birch, Andrew Finch, Thang Luong, Graham Neubig, Yusuke Oda
- Venue: NGT
- Publisher: Association for Computational Linguistics
- Pages: 110–115
- URL: https://aclanthology.org/W18-2713
- DOI: 10.18653/v1/W18-2713
- Cite (ACL): Kenji Imamura and Eiichiro Sumita. 2018. NICT Self-Training Approach to Neural Machine Translation at NMT-2018. In Proceedings of the 2nd Workshop on Neural Machine Translation and Generation, pages 110–115, Melbourne, Australia. Association for Computational Linguistics.
- Cite (Informal): NICT Self-Training Approach to Neural Machine Translation at NMT-2018 (Imamura & Sumita, NGT 2018)
- PDF: https://aclanthology.org/W18-2713.pdf