A Teacher-Student Framework for Zero-Resource Neural Machine Translation

Yun Chen, Yang Liu, Yong Cheng, Victor O.K. Li

Abstract
While end-to-end neural machine translation (NMT) has made remarkable progress recently, it still suffers from the data scarcity problem for low-resource language pairs and domains. In this paper, we propose a method for zero-resource NMT by assuming that parallel sentences have close probabilities of generating a sentence in a third language. Based on this assumption, our method is able to train a source-to-target NMT model (“student”) without parallel corpora, guided by an existing pivot-to-target NMT model (“teacher”) on a source-pivot parallel corpus. Experimental results show that the proposed method significantly improves over a baseline pivot-based model by +3.0 BLEU points across various language pairs.
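The training signal described in the abstract can be pictured as a knowledge-distillation-style objective: for each source-pivot sentence pair (x, z), a frozen pivot-to-target teacher scores target words from z, and the student learns to match those distributions given x. The sketch below is a minimal, hypothetical illustration of such a word-level teaching loss in PyTorch; the function name, tensor shapes, and the word-level variant are assumptions for illustration, not the paper's exact implementation.

```python
import torch
import torch.nn.functional as F

def teacher_student_loss(student_logits, teacher_logits):
    """Word-level teaching loss (hypothetical sketch).

    For each target position t, push the student's distribution
    P(y_t | y_<t, x) toward the frozen teacher's P(y_t | y_<t, z),
    where (x, z) is a source-pivot sentence pair.

    Assumed shapes: [batch, tgt_len, vocab].
    """
    # Teacher is fixed during student training: detach so no gradients flow.
    teacher_probs = F.softmax(teacher_logits, dim=-1).detach()
    student_log_probs = F.log_softmax(student_logits, dim=-1)
    # KL(teacher || student) per position, summed over the vocabulary.
    kl = (teacher_probs
          * (teacher_probs.clamp_min(1e-9).log() - student_log_probs)).sum(-1)
    return kl.mean()

# Hypothetical usage with random logits standing in for real model outputs.
batch, tgt_len, vocab = 2, 5, 100
teacher_logits = torch.randn(batch, tgt_len, vocab)   # pivot-to-target teacher
student_logits = torch.randn(batch, tgt_len, vocab, requires_grad=True)
loss = teacher_student_loss(student_logits, teacher_logits)
loss.backward()  # gradients flow only into the student
```

Minimizing this KL term is equivalent (up to a constant) to cross-entropy against the teacher's distributions, which is why no source-target parallel data is needed: the teacher supplies the target-side supervision.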
Anthology ID:
P17-1176
Volume:
Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2017
Address:
Vancouver, Canada
Editors:
Regina Barzilay, Min-Yen Kan
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1925–1935
URL:
https://aclanthology.org/P17-1176
DOI:
10.18653/v1/P17-1176
Cite (ACL):
Yun Chen, Yang Liu, Yong Cheng, and Victor O.K. Li. 2017. A Teacher-Student Framework for Zero-Resource Neural Machine Translation. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1925–1935, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
A Teacher-Student Framework for Zero-Resource Neural Machine Translation (Chen et al., ACL 2017)
PDF:
https://preview.aclanthology.org/teach-a-man-to-fish/P17-1176.pdf