Abstract
This paper presents this year’s CUNI submissions to the WAT 2017 Translation Task, focusing on Japanese-to-English translation, namely the Scientific Papers, Patents, and Newswire subtasks. We compare two neural network architectures, the standard sequence-to-sequence model with attention (Seq2Seq) and an architecture using a convolutional sentence encoder (FBConv2Seq), both implemented in the NMT framework Neural Monkey, whose development we currently participate in. We also compare various types of preprocessing of the source Japanese sentences and their impact on the overall results. Furthermore, we include the results of our experiments with out-of-domain data obtained by combining the corpora provided for each subtask.
- Anthology ID:
- W17-5715
- Volume:
- Proceedings of the 4th Workshop on Asian Translation (WAT2017)
- Month:
- November
- Year:
- 2017
- Address:
- Taipei, Taiwan
- Editors:
- Toshiaki Nakazawa, Isao Goto
- Venue:
- WAT
- Publisher:
- Asian Federation of Natural Language Processing
- Pages:
- 154–159
- URL:
- https://aclanthology.org/W17-5715
- Cite (ACL):
- Tom Kocmi, Dušan Variš, and Ondřej Bojar. 2017. CUNI NMT System for WAT 2017 Translation Tasks. In Proceedings of the 4th Workshop on Asian Translation (WAT2017), pages 154–159, Taipei, Taiwan. Asian Federation of Natural Language Processing.
- Cite (Informal):
- CUNI NMT System for WAT 2017 Translation Tasks (Kocmi et al., WAT 2017)
- PDF:
- https://preview.aclanthology.org/ingest-acl-2023-videos/W17-5715.pdf