A Study of Reinforcement Learning for Neural Machine Translation

Lijun Wu, Fei Tian, Tao Qin, Jianhuang Lai, Tie-Yan Liu


Abstract
Recent studies have shown that reinforcement learning (RL) is an effective approach for improving the performance of neural machine translation (NMT) systems. However, due to its instability, successful RL training is challenging, especially in real-world systems where deep models and large datasets are leveraged. In this paper, taking several large-scale translation tasks as testbeds, we conduct a systematic study on how to train better NMT models using reinforcement learning. We provide a comprehensive comparison of several important factors (e.g., baseline reward, reward shaping) in RL training. Furthermore, since it remains unclear whether RL is still beneficial when monolingual data is used, we propose a new method that leverages RL to further boost the performance of NMT systems trained with source/target monolingual data. By integrating all our findings, we obtain competitive results on the WMT14 English-German, WMT17 English-Chinese, and WMT17 Chinese-English translation tasks, and in particular set state-of-the-art performance on the WMT17 Chinese-English translation task.
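
As a rough illustration of the RL training factors compared in the abstract (baseline reward, reward shaping), the sketch below shows a REINFORCE-style policy-gradient loss in which a baseline is subtracted from a sentence-level reward, with optional per-step shaped rewards. This is a minimal PyTorch sketch and not the authors' implementation (see the released code, apeterswu/RL4NMT, for that); the function name reinforce_loss, its signature, and the toy reward values in the usage snippet are assumptions made for illustration only.

# Minimal policy-gradient (REINFORCE) loss with a baseline reward and optional
# per-step reward shaping -- an illustrative sketch, not the paper's code.
import torch
import torch.nn.functional as F

def reinforce_loss(logits, sampled_ids, sentence_reward, baseline, shaped_rewards=None):
    """Return a policy-gradient loss for one sampled translation.

    logits:          (T, V) decoder outputs along the sampled sequence
    sampled_ids:     (T,)   token ids sampled from the model
    sentence_reward: scalar sequence-level reward, e.g. sentence-level BLEU
    baseline:        scalar baseline subtracted from the reward to reduce
                     gradient variance (e.g. the reward of a greedy decode)
    shaped_rewards:  optional (T,) per-step shaped rewards; if given, each
                     token is weighted by its own advantage (reward shaping)
    """
    log_probs = F.log_softmax(logits, dim=-1)                                    # (T, V)
    token_log_probs = log_probs.gather(1, sampled_ids.unsqueeze(1)).squeeze(1)   # (T,)
    if shaped_rewards is not None:
        advantage = shaped_rewards - baseline        # per-step advantages
    else:
        advantage = sentence_reward - baseline       # one scalar advantage
    # Negative of (r - b) * sum_t log p(y_t | y_<t, x): the REINFORCE estimator.
    return -(advantage * token_log_probs).sum()

# Toy usage with random logits (hypothetical sizes and reward values).
T, V = 6, 32000
logits = torch.randn(T, V, requires_grad=True)
sampled_ids = torch.randint(0, V, (T,))
loss = reinforce_loss(logits, sampled_ids, sentence_reward=0.42, baseline=0.30)
loss.backward()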
Anthology ID:
D18-1397
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3612–3621
URL:
https://aclanthology.org/D18-1397
DOI:
10.18653/v1/D18-1397
Cite (ACL):
Lijun Wu, Fei Tian, Tao Qin, Jianhuang Lai, and Tie-Yan Liu. 2018. A Study of Reinforcement Learning for Neural Machine Translation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3612–3621, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
A Study of Reinforcement Learning for Neural Machine Translation (Wu et al., EMNLP 2018)
PDF:
https://preview.aclanthology.org/improve-issue-templates/D18-1397.pdf
Video:
https://preview.aclanthology.org/improve-issue-templates/D18-1397.mp4
Code:
apeterswu/RL4NMT