Japanese-Russian TMU Neural Machine Translation System using Multilingual Model for WAT 2019

Aizhan Imankulova, Masahiro Kaneko, Mamoru Komachi


Abstract
We introduce our system submitted to the News Commentary task (Japanese<->Russian) of the 6th Workshop on Asian Translation. The goal of this shared task is to study extremely low-resource settings for distant language pairs. Using parallel corpora of other language pairs as training data is known to be effective for multilingual neural machine translation (NMT) models in extremely low-resource scenarios. Therefore, to improve the translation quality of the Japanese<->Russian language pair, our method leverages in-domain Japanese-English and English-Russian parallel corpora as additional training data for our multilingual NMT model.
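The abstract describes mixing auxiliary Japanese-English and English-Russian parallel data with the primary Japanese-Russian data to train one multilingual model. A common way to do this (the target-language-token scheme of Johnson et al., 2017; the paper's exact data-mixing details are not given here, so this is an assumed sketch with placeholder sentences) is to prepend a token indicating the desired target language to each source sentence before pooling the corpora:

```python
# Hypothetical sketch: pooling parallel corpora for multilingual NMT by
# prepending a target-language token (e.g. "<2ru>") to each source sentence.
# The corpus contents below are toy placeholders, not the task's actual data.

def tag_corpus(pairs, tgt_lang):
    """Prepend a target-language token to every source sentence."""
    return [(f"<2{tgt_lang}> {src}", tgt) for src, tgt in pairs]

# Toy stand-ins for the in-domain parallel corpora.
ja_ru = [("こんにちは", "привет")]   # primary Japanese-Russian pairs
ja_en = [("こんにちは", "hello")]    # auxiliary Japanese-English pairs
en_ru = [("hello", "привет")]        # auxiliary English-Russian pairs

# One pooled training set; the model learns the output language from the tag.
train = (
    tag_corpus(ja_ru, "ru")
    + tag_corpus(ja_en, "en")
    + tag_corpus(en_ru, "ru")
)
```

Sharing one encoder-decoder across all three directions lets the low-resource Japanese-Russian pair benefit from the larger auxiliary corpora through the pivot language, English.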
Anthology ID:
D19-5221
Volume:
Proceedings of the 6th Workshop on Asian Translation
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Toshiaki Nakazawa, Chenchen Ding, Raj Dabre, Anoop Kunchukuttan, Nobushige Doi, Yusuke Oda, Ondřej Bojar, Shantipriya Parida, Isao Goto, Hideya Mino
Venue:
WAT
Publisher:
Association for Computational Linguistics
Pages:
165–170
URL:
https://aclanthology.org/D19-5221
DOI:
10.18653/v1/D19-5221
Cite (ACL):
Aizhan Imankulova, Masahiro Kaneko, and Mamoru Komachi. 2019. Japanese-Russian TMU Neural Machine Translation System using Multilingual Model for WAT 2019. In Proceedings of the 6th Workshop on Asian Translation, pages 165–170, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Japanese-Russian TMU Neural Machine Translation System using Multilingual Model for WAT 2019 (Imankulova et al., WAT 2019)
PDF:
https://preview.aclanthology.org/naacl24-info/D19-5221.pdf