Trivial Transfer Learning for Low-Resource Neural Machine Translation

Tom Kocmi, Ondřej Bojar


Abstract
Transfer learning has proven to be an effective technique for neural machine translation under low-resource conditions. Existing methods require a common target language, language relatedness, or specific training tricks and regimes. We present a simple transfer learning method: we first train a “parent” model on a high-resource language pair and then continue training on a low-resource pair, changing only the training corpus. This “child” model performs significantly better than a baseline trained on the low-resource pair alone. We are the first to show this for parent and child pairs with different target languages, and we observe improvements even for unrelated languages with different alphabets.
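To make the training regime concrete, here is a minimal sketch in PyTorch. Everything in it is an illustrative assumption: the toy corpora, the character-level vocabulary, and the tiny GRU seq2seq model stand in for a real NMT system and real parallel data. The one detail carried over from the abstract is that the vocabulary is built over both language pairs in advance, so the child phase can reuse the parent's parameters with no remapping; the transfer itself is nothing more than swapping the corpus and continuing the same training run.

```python
# Sketch of "trivial" transfer learning for NMT: train a parent model on a
# high-resource pair, then continue the exact same training run on the
# low-resource child pair, changing nothing but the corpus.
# The model, data, and helpers below are toy stand-ins, not the paper's setup.
import torch
import torch.nn as nn

# Toy parallel corpora (assumed stand-ins for a high-resource parent pair
# and a low-resource child pair).
parent_pairs = [("ahoj svete", "hello world"), ("dobry den", "good day")]
child_pairs = [("tere maailm", "hello world"), ("head aega", "goodbye")]

# Shared vocabulary built over BOTH corpora up front, so child training can
# reuse the parent's embeddings directly.
PAD, BOS, EOS = 0, 1, 2
chars = sorted({c for s, t in parent_pairs + child_pairs for c in s + t})
stoi = {c: i + 3 for i, c in enumerate(chars)}

def encode(s):
    return [BOS] + [stoi[c] for c in s] + [EOS]

def batch(pairs):
    srcs = [torch.tensor(encode(s)) for s, _ in pairs]
    tgts = [torch.tensor(encode(t)) for _, t in pairs]
    pad = lambda seqs: nn.utils.rnn.pad_sequence(
        seqs, batch_first=True, padding_value=PAD)
    return pad(srcs), pad(tgts)

class TinySeq2Seq(nn.Module):
    """A toy encoder-decoder; the paper uses a full NMT architecture."""
    def __init__(self, vocab, dim=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, dim, padding_idx=PAD)
        self.enc = nn.GRU(dim, dim, batch_first=True)
        self.dec = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab)

    def forward(self, src, tgt):
        _, h = self.enc(self.emb(src))             # encode the source
        y, _ = self.dec(self.emb(tgt[:, :-1]), h)  # decode with teacher forcing
        return self.out(y)

def train(model, opt, loss_fn, pairs, steps):
    src, tgt = batch(pairs)
    for _ in range(steps):
        opt.zero_grad()
        logits = model(src, tgt)
        loss = loss_fn(logits.reshape(-1, logits.size(-1)),
                       tgt[:, 1:].reshape(-1))
        loss.backward()
        opt.step()
    return loss.item()

model = TinySeq2Seq(vocab=len(stoi) + 3)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

# Phase 1: parent training on the high-resource pair.
print("parent loss:", train(model, opt, loss_fn, parent_pairs, steps=200))

# Phase 2: the "trivial" transfer -- swap the corpus and keep training.
# No parameter resets, no vocabulary changes, no new hyperparameters.
print("child loss: ", train(model, opt, loss_fn, child_pairs, steps=200))
```

The design point the sketch tries to isolate is that phase 2 calls the same training function on the same model and optimizer; only the data argument changes, which is what makes the method "trivial" compared to approaches that need shared target languages or special regimes.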
Anthology ID:
W18-6325
Volume:
Proceedings of the Third Conference on Machine Translation: Research Papers
Month:
October
Year:
2018
Address:
Brussels, Belgium
Venues:
EMNLP | WMT | WS
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
244–252
URL:
https://aclanthology.org/W18-6325
DOI:
10.18653/v1/W18-6325
Cite (ACL):
Tom Kocmi and Ondřej Bojar. 2018. Trivial Transfer Learning for Low-Resource Neural Machine Translation. In Proceedings of the Third Conference on Machine Translation: Research Papers, pages 244–252, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Trivial Transfer Learning for Low-Resource Neural Machine Translation (Kocmi & Bojar, 2018)
PDF:
https://preview.aclanthology.org/update-css-js/W18-6325.pdf