Neural Machine Translation with the Transformer and Multi-Source Romance Languages for the Biomedical WMT 2018 task

Brian Tubay, Marta R. Costa-jussà


Abstract
The Transformer architecture has become the state of the art in Machine Translation. This model, which relies on attention-based mechanisms, has outperformed previous neural machine translation architectures in several tasks. In this system description paper, we report details of training neural machine translation with multi-source Romance languages using the Transformer model, within the evaluation framework of the WMT 2018 biomedical task. Using multiple source languages from the same family yields improvements of over 6 BLEU points.
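The abstract describes combining several Romance source languages into one translation input. A common way to realize multi-source NMT is to concatenate the source sentences, each prefixed with a language tag; the sketch below illustrates this idea only, since the abstract does not specify the paper's exact input format (the `build_multisource_input` helper and the `<es>`/`<fr>` tag scheme are assumptions).

```python
def build_multisource_input(sources):
    """Concatenate several source-language sentences into a single
    input sequence for a multi-source NMT model.

    Each sentence is prefixed with a language tag (hypothetical
    scheme; the paper's actual preprocessing is not given here).

    sources: list of (language_code, sentence) pairs.
    """
    parts = []
    for lang, sentence in sources:
        # Mark each segment with its source language so the encoder
        # can distinguish the concatenated inputs.
        parts.append(f"<{lang}> {sentence}")
    return " ".join(parts)


# Toy biomedical-style example with Spanish and French sources.
example = build_multisource_input([
    ("es", "El paciente presenta fiebre ."),
    ("fr", "Le patient présente de la fièvre ."),
])
print(example)
# → <es> El paciente presenta fiebre . <fr> Le patient présente de la fièvre .
```

The concatenated string would then be tokenized and fed to a standard Transformer encoder, letting attention draw on all source languages at once.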
Anthology ID:
W18-6449
Volume:
Proceedings of the Third Conference on Machine Translation: Shared Task Papers
Month:
October
Year:
2018
Address:
Brussels, Belgium
Editors:
Ondřej Bojar, Rajen Chatterjee, Christian Federmann, Mark Fishel, Yvette Graham, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Christof Monz, Matteo Negri, Aurélie Névéol, Mariana Neves, Matt Post, Lucia Specia, Marco Turchi, Karin Verspoor
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
667–670
URL:
https://aclanthology.org/W18-6449
DOI:
10.18653/v1/W18-6449
Cite (ACL):
Brian Tubay and Marta R. Costa-jussà. 2018. Neural Machine Translation with the Transformer and Multi-Source Romance Languages for the Biomedical WMT 2018 task. In Proceedings of the Third Conference on Machine Translation: Shared Task Papers, pages 667–670, Belgium, Brussels. Association for Computational Linguistics.
Cite (Informal):
Neural Machine Translation with the Transformer and Multi-Source Romance Languages for the Biomedical WMT 2018 task (Tubay & Costa-jussà, WMT 2018)
PDF:
https://preview.aclanthology.org/dois-2013-emnlp/W18-6449.pdf