A Retrieve-and-Rewrite Initialization Method for Unsupervised Machine Translation

Shuo Ren, Yu Wu, Shujie Liu, Ming Zhou, Shuai Ma


Abstract
The commonly used framework for unsupervised machine translation first builds initial translation models in both directions and then performs iterative back-translation to jointly boost their performance. The initialization stage is critical: a poor initialization may improperly narrow the search space, and noise introduced at this stage can hurt the final performance. In this paper, we propose a novel retrieve-and-rewrite method to better initialize unsupervised translation models. We first retrieve semantically comparable sentences from the monolingual corpora of the two languages, and then rewrite the target side with a dedicated rewriting model to close the semantic gap between the source sentences and the retrieved targets. The rewritten sentence pairs are used to initialize SMT models, which in turn generate pseudo data for two NMT models, followed by iterative back-translation. Experiments show that our method builds better initial unsupervised translation models and improves the final translation performance by over 4 BLEU points. Our code is released at https://github.com/Imagist-Shuo/RRforUNMT.git.
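To make the retrieval step concrete, below is a minimal sketch of pairing sentences from two monolingual corpora by nearest-neighbour search in a shared cross-lingual embedding space. The `embed` function, the exhaustive similarity search, and the `threshold` value are illustrative assumptions for this sketch, not the paper's exact retrieval method.

```python
# Sketch: retrieve semantically comparable sentence pairs from two
# monolingual corpora via cross-lingual sentence embeddings.
import numpy as np

def retrieve_comparable_pairs(src_sents, tgt_sents, embed, threshold=0.5):
    """Return (src, tgt, score) triples for sufficiently similar sentences.

    `embed` is assumed to map a list of sentences to a 2-D array of
    unit-norm cross-lingual sentence embeddings (e.g., from a
    multilingual encoder of the reader's choice).
    """
    src_vecs = embed(src_sents)      # shape: (n_src, d)
    tgt_vecs = embed(tgt_sents)      # shape: (n_tgt, d)
    sims = src_vecs @ tgt_vecs.T     # cosine similarity (vectors are unit-norm)
    pairs = []
    for i, row in enumerate(sims):
        j = int(np.argmax(row))      # nearest target-language sentence
        if row[j] >= threshold:      # keep only sufficiently close pairs
            pairs.append((src_sents[i], tgt_sents[j], float(row[j])))
    return pairs
```

Note that the retrieved pairs are only semantically comparable, not parallel; per the abstract, the rewriting model then edits the target side to close the remaining semantic gap before the pairs are used to initialize the SMT models.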
Anthology ID:
2020.acl-main.320
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3498–3504
URL:
https://aclanthology.org/2020.acl-main.320
DOI:
10.18653/v1/2020.acl-main.320
Cite (ACL):
Shuo Ren, Yu Wu, Shujie Liu, Ming Zhou, and Shuai Ma. 2020. A Retrieve-and-Rewrite Initialization Method for Unsupervised Machine Translation. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 3498–3504, Online. Association for Computational Linguistics.
Cite (Informal):
A Retrieve-and-Rewrite Initialization Method for Unsupervised Machine Translation (Ren et al., ACL 2020)
PDF:
https://preview.aclanthology.org/starsem-semeval-split/2020.acl-main.320.pdf
Video:
http://slideslive.com/38928942
Code:
Imagist-Shuo/RRforUNMT