KIT’s Multilingual Neural Machine Translation systems for IWSLT 2017

Ngoc-Quan Pham, Matthias Sperber, Elizabeth Salesky, Thanh-Le Ha, Jan Niehues, Alexander Waibel


Abstract
In this paper, we present KIT’s multilingual neural machine translation (NMT) systems for the machine translation (MT) and spoken language translation (SLT) tasks of the IWSLT 2017 evaluation campaign. For our MT task submissions, we used our multi-task system, modified from a standard attentional neural machine translation framework, instead of building 20 individual NMT systems. We investigated different architectures as well as different data corpora for training such a multilingual system. We also proposed an effective adaptation scheme for multilingual systems that brings substantial improvements over monolingual systems. For the SLT track, in addition to a monolingual neural translation system used to restore correct punctuation and true casing of the data prior to training our multilingual system, we introduced a noise model to make our system more robust. Results show that our novel modifications improved our systems considerably on all tasks.
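The paper itself does not include code, but the single-model multilingual setup described above is commonly realized by prepending an artificial target-language token to each source sentence, so that one shared encoder-decoder serves many language pairs. A minimal preprocessing sketch (the function name and `<2xx>` token format are illustrative assumptions, not taken from the paper):

```python
def add_target_token(source_tokens, target_lang):
    """Prepend an artificial target-language token to the source
    sentence so a single shared model can translate into many
    target languages (token format is illustrative)."""
    return ["<2{}>".format(target_lang)] + source_tokens

# Example: the same English source routed to two target languages.
pairs = [
    (["hello", "world"], "de"),
    (["hello", "world"], "ja"),
]
for src, tgt in pairs:
    print(" ".join(add_target_token(src, tgt)))
```

With this scheme, the parallel corpora of all language pairs can simply be concatenated into one training set, since the prepended token tells the decoder which language to produce.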
Anthology ID:
2017.iwslt-1.6
Volume:
Proceedings of the 14th International Conference on Spoken Language Translation
Month:
December 14-15
Year:
2017
Address:
Tokyo, Japan
Editors:
Sakriani Sakti, Masao Utiyama
Venue:
IWSLT
SIG:
SIGSLT
Publisher:
International Workshop on Spoken Language Translation
Pages:
42–47
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2017.iwslt-1.6/
Cite (ACL):
Ngoc-Quan Pham, Matthias Sperber, Elizabeth Salesky, Thanh-Le Ha, Jan Niehues, and Alexander Waibel. 2017. KIT’s Multilingual Neural Machine Translation systems for IWSLT 2017. In Proceedings of the 14th International Conference on Spoken Language Translation, pages 42–47, Tokyo, Japan. International Workshop on Spoken Language Translation.
Cite (Informal):
KIT’s Multilingual Neural Machine Translation systems for IWSLT 2017 (Pham et al., IWSLT 2017)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2017.iwslt-1.6.pdf