Reference Language based Unsupervised Neural Machine Translation

Zuchao Li, Hai Zhao, Rui Wang, Masao Utiyama, Eiichiro Sumita


Abstract
Exploiting a common language as an auxiliary has a long tradition in machine translation: supervised systems have long benefited from a well-chosen pivot language when no source-to-target parallel corpus is available. The rise of unsupervised neural machine translation (UNMT) almost completely lifts the parallel-corpus curse, yet UNMT still suffers from unsatisfactory performance because the clues available to its core back-translation training are vague. Further enriching the idea of pivot translation by extending the use of parallel corpora beyond the source-target paradigm, we propose a new reference language-based framework for UNMT, RUNMT, in which the reference language shares a parallel corpus only with the source; this corpus nevertheless provides a signal clear enough to aid the reconstruction training of UNMT through a proposed reference agreement mechanism. Experimental results show that our methods improve UNMT quality over a strong baseline that uses only one auxiliary language, demonstrating the usefulness of the proposed reference language-based UNMT and establishing a good starting point for the community.
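The back-translation training the abstract refers to can be illustrated with a minimal sketch. This is not the paper's implementation: the two "models" below are hypothetical word-reversal placeholders standing in for real src→tgt and tgt→src networks, and the function names are invented for illustration. The sketch only shows the data flow that makes UNMT possible without parallel data: a forward model turns monolingual source text into synthetic targets, which then serve as inputs for training the reverse model to reconstruct the original source.

```python
# Hedged sketch of the back-translation loop at the core of UNMT.
# The "models" here are toy placeholders (word-order reversal), not the
# paper's networks; the point is the data flow, not the learning itself.

def toy_src_to_tgt(sentence: str) -> str:
    """Stand-in for a src->tgt translation model (hypothetical)."""
    return " ".join(reversed(sentence.split()))


def back_translate(monolingual_src):
    """Build synthetic (tgt, src) training pairs from source-only text.

    Each pair couples a machine-generated target sentence with the real
    source sentence it came from, so a tgt->src model can be trained to
    reconstruct the original source without any parallel corpus.
    """
    pairs = []
    for src in monolingual_src:
        synthetic_tgt = toy_src_to_tgt(src)  # forward pass, no reference used
        pairs.append((synthetic_tgt, src))   # supervision for the reverse model
    return pairs


pairs = back_translate(["the cat sat", "hello world"])
```

The paper's contribution, RUNMT, adds a reference agreement signal on top of this loop: a source-reference parallel corpus constrains the reconstruction even though no source-target parallel data exists.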
Anthology ID:
2020.findings-emnlp.371
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4151–4162
URL:
https://aclanthology.org/2020.findings-emnlp.371
DOI:
10.18653/v1/2020.findings-emnlp.371
Cite (ACL):
Zuchao Li, Hai Zhao, Rui Wang, Masao Utiyama, and Eiichiro Sumita. 2020. Reference Language based Unsupervised Neural Machine Translation. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4151–4162, Online. Association for Computational Linguistics.
Cite (Informal):
Reference Language based Unsupervised Neural Machine Translation (Li et al., Findings 2020)
PDF:
https://preview.aclanthology.org/ingest-2024-clasp/2020.findings-emnlp.371.pdf
Optional supplementary material:
 2020.findings-emnlp.371.OptionalSupplementaryMaterial.zip
Code
 bcmi220/runmt