Optimizing Transformer for Low-Resource Neural Machine Translation

Ali Araabi, Christof Monz


Abstract
Language pairs with limited amounts of parallel data, also known as low-resource language pairs, remain a challenge for neural machine translation. While the Transformer model has achieved significant improvements for many language pairs and has become the de facto mainstream architecture, its capability under low-resource conditions has not yet been fully investigated. Our experiments on different subsets of the IWSLT14 training data show that the effectiveness of the Transformer under low-resource conditions is highly dependent on the hyper-parameter settings: a Transformer optimized for low-resource conditions improves translation quality by up to 7.3 BLEU points compared to the Transformer default settings.
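The default settings the abstract refers to are the Transformer-base hyper-parameters of Vaswani et al. (2017). As a loose illustration of the kind of scaling-down and regularization such tuning involves, here is a minimal PyTorch sketch contrasting the defaults with a smaller, more heavily regularized variant; the specific values in the second configuration are illustrative assumptions, not the optimized settings reported in the paper itself.

    import torch.nn as nn

    # Transformer-base defaults (Vaswani et al., 2017): the baseline the
    # abstract calls "the Transformer default settings".
    default_model = nn.Transformer(
        d_model=512,
        nhead=8,
        num_encoder_layers=6,
        num_decoder_layers=6,
        dim_feedforward=2048,
        dropout=0.1,
    )

    # Hypothetical low-resource variant: reduced capacity and stronger
    # dropout, the general direction that tends to help when parallel
    # data is scarce. These values are assumptions for illustration.
    low_resource_model = nn.Transformer(
        d_model=256,
        nhead=2,
        num_encoder_layers=5,
        num_decoder_layers=5,
        dim_feedforward=1024,
        dropout=0.3,
    )

The paper's contribution is identifying which of these knobs matter under low-resource conditions and by how much; the full text reports the tuned values for each IWSLT14 training-data subset.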
Anthology ID: 2020.coling-main.304
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 3429–3435
URL: https://aclanthology.org/2020.coling-main.304
DOI: 10.18653/v1/2020.coling-main.304
Cite (ACL): Ali Araabi and Christof Monz. 2020. Optimizing Transformer for Low-Resource Neural Machine Translation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3429–3435, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Optimizing Transformer for Low-Resource Neural Machine Translation (Araabi & Monz, COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.304.pdf