Multilingual Neural Machine Translation with Task-Specific Attention

Graeme Blackwood, Miguel Ballesteros, Todd Ward


Abstract
Multilingual machine translation addresses the task of translating between multiple source and target languages. We propose task-specific attention models, a simple but effective technique for improving the quality of sequence-to-sequence neural multilingual translation. Our approach seeks to retain as much of the parameter sharing generalization of NMT models as possible, while still allowing for language-specific specialization of the attention model to a particular language-pair or task. Our experiments on four languages of the Europarl corpus show that using a target-specific model of attention provides consistent gains in translation quality for all possible translation directions, compared to a model in which all parameters are shared. We observe improved translation quality even in the (extreme) low-resource zero-shot translation directions for which the model never saw explicitly paired parallel data.
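To make the idea concrete, the sketch below shows one way target-specific attention could be wired into an otherwise fully shared sequence-to-sequence model: every target language selects its own attention scoring parameters, while the encoder and decoder weights remain shared across all language pairs. This is a minimal illustration assuming PyTorch and a bilinear (Luong-style) attention score; the module, class, and parameter names are hypothetical and not taken from the paper's implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetSpecificAttention(nn.Module):
    # One bilinear scoring matrix per target language (task); the rest of
    # the encoder/decoder can stay fully shared across all language pairs.
    def __init__(self, hidden_size, target_langs):
        super().__init__()
        # A separate (hypothetical) scoring matrix W_tgt for each target language.
        self.score = nn.ModuleDict({
            lang: nn.Linear(hidden_size, hidden_size, bias=False)
            for lang in target_langs
        })

    def forward(self, decoder_state, encoder_states, target_lang):
        # decoder_state: (batch, hidden); encoder_states: (batch, src_len, hidden)
        proj = self.score[target_lang](decoder_state)          # task-specific projection
        scores = torch.bmm(encoder_states, proj.unsqueeze(2))  # (batch, src_len, 1)
        weights = F.softmax(scores.squeeze(2), dim=1)          # distribution over source positions
        context = torch.bmm(weights.unsqueeze(1), encoder_states).squeeze(1)
        return context, weights

# Example: shared 512-dim states, four Europarl languages as targets.
attn = TargetSpecificAttention(512, ["de", "en", "es", "fr"])
context, weights = attn(torch.randn(8, 512), torch.randn(8, 20, 512), target_lang="de")

At decoding time the model knows which target language it is producing, and that tag selects the attention parameters; because everything else stays shared, the multilingual transfer that supports the zero-shot directions is preserved.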
Anthology ID:
C18-1263
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
3112–3122
URL:
https://aclanthology.org/C18-1263
Cite (ACL):
Graeme Blackwood, Miguel Ballesteros, and Todd Ward. 2018. Multilingual Neural Machine Translation with Task-Specific Attention. In Proceedings of the 27th International Conference on Computational Linguistics, pages 3112–3122, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Multilingual Neural Machine Translation with Task-Specific Attention (Blackwood et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1263.pdf