Effective Strategies in Zero-Shot Neural Machine Translation

Thanh-Le Ha, Jan Niehues, Alexander Waibel


Abstract
In this paper, we propose two strategies that can be applied to a multilingual neural machine translation system to better handle zero-shot scenarios despite having no parallel corpus for those directions. Experiments show that both strategies are effective in terms of performance and computing resources, especially for multilingual translation of unbalanced data under real zero-resourced conditions, where they alleviate the language bias problem.
Anthology ID:
2017.iwslt-1.15
Volume:
Proceedings of the 14th International Conference on Spoken Language Translation
Month:
December 14–15
Year:
2017
Address:
Tokyo, Japan
Editors:
Sakriani Sakti, Masao Utiyama
Venue:
IWSLT
SIG:
SIGSLT
Publisher:
International Workshop on Spoken Language Translation
Pages:
105–112
URL:
https://aclanthology.org/2017.iwslt-1.15
Cite (ACL):
Thanh-Le Ha, Jan Niehues, and Alexander Waibel. 2017. Effective Strategies in Zero-Shot Neural Machine Translation. In Proceedings of the 14th International Conference on Spoken Language Translation, pages 105–112, Tokyo, Japan. International Workshop on Spoken Language Translation.
Cite (Informal):
Effective Strategies in Zero-Shot Neural Machine Translation (Ha et al., IWSLT 2017)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2017.iwslt-1.15.pdf