Pre-training via Leveraging Assisting Languages for Neural Machine Translation

Haiyue Song, Raj Dabre, Zhuoyuan Mao, Fei Cheng, Sadao Kurohashi, Eiichiro Sumita


Abstract
Sequence-to-sequence (S2S) pre-training on large monolingual data is known to improve performance on various S2S NLP tasks. However, large monolingual corpora might not always be available for the languages of interest (LOI). We therefore propose to exploit monolingual corpora of other languages to compensate for the scarcity of monolingual corpora for the LOI. We use script mapping (Chinese to Japanese) to increase the similarity (number of cognates) between the monolingual corpora of the assisting languages and the LOI. An empirical case study of low-resource Japanese-English neural machine translation (NMT) shows that leveraging large Chinese and French monolingual corpora can help overcome the shortage of Japanese and English monolingual corpora, respectively, for S2S pre-training. Using only Chinese and French monolingual corpora, we improve Japanese-English translation quality by up to 8.5 BLEU in low-resource scenarios.
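The script-mapping step described in the abstract can be illustrated with a minimal Python sketch: Simplified Chinese characters are rewritten as their Japanese kanji variants so that a Chinese corpus shares more surface forms (cognates) with Japanese text. The tiny ZH_TO_JA table below is a hypothetical toy fragment for illustration only; the paper's actual Chinese-to-Japanese character mapping table is not reproduced here.

# Minimal sketch of character-level script mapping (Chinese -> Japanese).
# ZH_TO_JA is a toy, hand-picked fragment; a real system would use a full
# Simplified-Chinese-to-Japanese-kanji mapping table (an assumption here).
ZH_TO_JA = {
    "国": "国",  # identical in both scripts: kept as-is
    "学": "学",
    "汉": "漢",  # simplified form -> Japanese kanji form
    "语": "語",
    "读": "読",
}

def map_script(text: str) -> str:
    """Replace each character with its Japanese variant when one is known;
    characters without a table entry pass through unchanged."""
    return "".join(ZH_TO_JA.get(ch, ch) for ch in text)

if __name__ == "__main__":
    print(map_script("汉语"))  # -> "漢語", matching the Japanese spelling

Applied over a whole Chinese monolingual corpus before pre-training, a mapping of this kind increases the token overlap with Japanese, which is the sense in which the assisting-language data "helps" the LOI.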
Anthology ID: 2020.acl-srw.37
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month: July
Year: 2020
Address: Online
Editors: Shruti Rijhwani, Jiangming Liu, Yizhong Wang, Rotem Dror
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 279–285
URL: https://aclanthology.org/2020.acl-srw.37
DOI: 10.18653/v1/2020.acl-srw.37
Cite (ACL): Haiyue Song, Raj Dabre, Zhuoyuan Mao, Fei Cheng, Sadao Kurohashi, and Eiichiro Sumita. 2020. Pre-training via Leveraging Assisting Languages for Neural Machine Translation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 279–285, Online. Association for Computational Linguistics.
Cite (Informal): Pre-training via Leveraging Assisting Languages for Neural Machine Translation (Song et al., ACL 2020)
PDF: https://preview.aclanthology.org/nschneid-patch-4/2020.acl-srw.37.pdf
Software: 2020.acl-srw.37.Software.zip
Video: http://slideslive.com/38928648