Zonghan Yang
2021
Alternated Training with Synthetic and Authentic Data for Neural Machine Translation
Rui Jiao, Zonghan Yang, Maosong Sun, Yang Liu
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021
2019
Reducing Word Omission Errors in Neural Machine Translation: A Contrastive Learning Approach
Zonghan Yang, Yong Cheng, Yang Liu, Maosong Sun
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
While neural machine translation (NMT) has achieved remarkable success, NMT systems are prone to making word omission errors. In this work, we propose a contrastive learning approach to reducing word omission errors in NMT. The basic idea is to enable the NMT model to assign a higher probability to a ground-truth translation and a lower probability to an erroneous translation, which is automatically constructed from the ground-truth translation by omitting words. We design different types of negative examples depending on the number of omitted words, word frequency, and part of speech. Experiments on Chinese-to-English, German-to-English, and Russian-to-English translation tasks show that our approach is effective in reducing word omission errors and achieves better translation performance than three baseline methods.
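The abstract's core idea can be sketched as a margin-based contrastive objective: negative examples are built by deleting words from the reference, and the model is penalized unless it prefers the reference by at least a margin in log-probability. The function names, the hinge formulation, and the single-negative simplification below are illustrative assumptions, not the paper's exact implementation.

```python
def contrastive_loss(log_p_pos, log_p_neg, margin=1.0):
    """Hinge-style contrastive loss (illustrative): penalize the model
    when it does not assign the ground-truth translation a log-probability
    at least `margin` higher than that of the erroneous translation."""
    return max(0.0, margin - (log_p_pos - log_p_neg))

def omit_words(target, indices):
    """Construct a negative example by omitting words at the given
    positions of the ground-truth translation (one of the negative-example
    construction strategies described in the abstract)."""
    drop = set(indices)
    return [tok for i, tok in enumerate(target) if i not in drop]

# Usage: model prefers the reference by a wide margin -> zero loss;
# model prefers the word-omitted negative -> positive loss.
reference = ["we", "propose", "a", "contrastive", "approach"]
negative = omit_words(reference, [2])          # drop the word "a"
print(negative)                                # ['we', 'propose', 'contrastive', 'approach']
print(contrastive_loss(-1.0, -5.0))            # reference preferred: 0.0
print(contrastive_loss(-5.0, -1.0))            # negative preferred: 5.0
```

In practice the log-probabilities would come from the NMT model's scoring of full sentences, and the hinge term would be added to the usual likelihood training objective.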