Abstract
In this paper, we describe KIT’s submission for the IWSLT 2019 shared task on text translation. Our system is based on the transformer model [1], using our in-house implementation. We augment the available training data using back-translation and employ fine-tuning for the final model. For our best results, we used a 12-layer transformer-big configuration, achieving state-of-the-art results on the WMT2018 test set. We also experiment with student-teacher models to improve the performance of smaller models.
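The student-teacher experiments mentioned in the abstract refer to knowledge distillation, where a small student model is trained to match a larger teacher's output distribution. As a point of reference only, a minimal word-level distillation loss might look like the sketch below (PyTorch assumed; the function and parameter names are illustrative, and the paper does not specify these details):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      alpha=0.5, temperature=1.0):
    """Word-level knowledge distillation (illustrative sketch):
    mix cross-entropy on the reference targets with KL divergence
    toward the teacher's softened output distribution."""
    vocab_size = student_logits.size(-1)
    # Standard cross-entropy against the reference translation.
    ce = F.cross_entropy(student_logits.view(-1, vocab_size),
                         targets.view(-1))
    # KL divergence between softened student and teacher distributions,
    # scaled by T^2 to keep gradient magnitudes comparable.
    t = temperature
    kl = F.kl_div(F.log_softmax(student_logits / t, dim=-1),
                  F.softmax(teacher_logits / t, dim=-1),
                  reduction="batchmean") * (t * t)
    return alpha * ce + (1.0 - alpha) * kl
```

Sequence-level variants instead train the student directly on the teacher's beam-search outputs; which variant the authors used is not stated in the abstract.

- Anthology ID: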
- 2019.iwslt-1.13
- Volume:
- Proceedings of the 16th International Conference on Spoken Language Translation
- Month:
- November 2-3
- Year:
- 2019
- Address:
- Hong Kong
- Editors:
- Jan Niehues, Rolando Cattoni, Sebastian Stüker, Matteo Negri, Marco Turchi, Thanh-Le Ha, Elizabeth Salesky, Ramon Sanabria, Loic Barrault, Lucia Specia, Marcello Federico
- Venue:
- IWSLT
- SIG:
- SIGSLT
- Publisher:
- Association for Computational Linguistics
- URL:
- https://aclanthology.org/2019.iwslt-1.13
- Cite (ACL):
- Felix Schneider and Alex Waibel. 2019. KIT’s Submission to the IWSLT 2019 Shared Task on Text Translation. In Proceedings of the 16th International Conference on Spoken Language Translation, Hong Kong. Association for Computational Linguistics.
- Cite (Informal):
- KIT’s Submission to the IWSLT 2019 Shared Task on Text Translation (Schneider & Waibel, IWSLT 2019)
- PDF:
- https://preview.aclanthology.org/landing_page/2019.iwslt-1.13.pdf