KIT’s Submission to the IWSLT 2019 Shared Task on Text Translation

Felix Schneider, Alex Waibel


Abstract
In this paper, we describe KIT’s submission for the IWSLT 2019 shared task on text translation. Our system is based on the transformer model [1] using our in-house implementation. We augment the available training data using back-translation and employ fine-tuning for the final model. For our best results, we used a 12-layer transformer-big configuration, achieving state-of-the-art results on the WMT2018 test set. We also experiment with student-teacher models to improve the performance of smaller models.
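The back-translation step mentioned above can be sketched as follows. This is a minimal illustration of the general technique, not KIT’s actual pipeline: `reverse_translate` is a hypothetical placeholder for a trained target-to-source NMT model, and the data is toy data.

```python
# Sketch of back-translation data augmentation: monolingual target-side
# text is translated back into the source language by a reverse-direction
# model, and the resulting synthetic (source, target) pairs are added to
# the genuine parallel training data.

def reverse_translate(sentence: str) -> str:
    """Hypothetical stand-in for a trained target->source NMT model."""
    # A real system would decode with a reverse-direction model here.
    return "<synthetic> " + sentence

def back_translate(monolingual_target: list[str]) -> list[tuple[str, str]]:
    """Pair synthetic source sentences with genuine target sentences."""
    return [(reverse_translate(t), t) for t in monolingual_target]

parallel_data = [("ein Haus", "a house")]   # genuine parallel corpus (toy)
mono_target = ["a garden", "a tree"]        # monolingual target-side text
augmented = parallel_data + back_translate(mono_target)
```

The augmented corpus keeps the real target sentences as references, so the synthetic noise is confined to the source side, which is what makes the technique effective for training.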
Anthology ID:
2019.iwslt-1.13
Volume:
Proceedings of the 16th International Conference on Spoken Language Translation
Month:
November 2-3
Year:
2019
Address:
Hong Kong
Venue:
IWSLT
SIG:
SIGSLT
Publisher:
Association for Computational Linguistics
URL:
https://aclanthology.org/2019.iwslt-1.13
Cite (ACL):
Felix Schneider and Alex Waibel. 2019. KIT’s Submission to the IWSLT 2019 Shared Task on Text Translation. In Proceedings of the 16th International Conference on Spoken Language Translation, Hong Kong. Association for Computational Linguistics.
Cite (Informal):
KIT’s Submission to the IWSLT 2019 Shared Task on Text Translation (Schneider & Waibel, IWSLT 2019)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2019.iwslt-1.13.pdf