Reducing Word Omission Errors in Neural Machine Translation: A Contrastive Learning Approach

Zonghan Yang, Yong Cheng, Yang Liu, Maosong Sun

Abstract
While neural machine translation (NMT) has achieved remarkable success, NMT systems are prone to making word omission errors. In this work, we propose a contrastive learning approach to reducing word omission errors in NMT. The basic idea is to enable the NMT model to assign a higher probability to a ground-truth translation and a lower probability to an erroneous translation, which is automatically constructed from the ground-truth translation by omitting words. We design different types of negative examples depending on the number of omitted words, word frequency, and part of speech. Experiments on Chinese-to-English, German-to-English, and Russian-to-English translation tasks show that our approach is effective in reducing word omission errors and achieves better translation performance than three baseline methods.
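The core mechanism described in the abstract can be illustrated with a short sketch. The Python code below is a hypothetical rendering of the idea, not the paper's implementation: it constructs an erroneous translation by deleting words from the reference, and applies a max-margin contrastive loss that pushes the model's log-probability of the ground-truth translation above that of the erroneous one. The function names (omit_words, contrastive_loss), the margin value, and the uniform sampling of omitted words are illustrative assumptions; per the abstract, the paper additionally designs negative examples by word frequency and part of speech.

```python
import random

import torch


def omit_words(tokens, num_omit=1, rng=None):
    # Build an erroneous translation by deleting `num_omit` words from the
    # ground-truth translation. The paper also selects omitted words by
    # frequency and part of speech; this sketch samples positions uniformly.
    rng = rng or random.Random(0)
    if num_omit >= len(tokens):
        return tokens[:1]  # keep at least one token
    drop = set(rng.sample(range(len(tokens)), num_omit))
    return [tok for i, tok in enumerate(tokens) if i not in drop]


def contrastive_loss(logp_truth, logp_erroneous, margin=1.0):
    # Max-margin contrastive objective (one plausible formulation): the loss
    # is zero once the ground-truth translation is scored at least `margin`
    # higher than the erroneous one, and penalizes the gap otherwise.
    return torch.clamp(logp_erroneous - logp_truth + margin, min=0.0).mean()


if __name__ == "__main__":
    reference = "the cat sat on the mat".split()
    negative = omit_words(reference, num_omit=1)
    print(negative)  # the reference with one word omitted

    # Stand-ins for sentence-level log-probabilities from an NMT model.
    logp_truth = torch.tensor([-10.2, -8.7])
    logp_erroneous = torch.tensor([-9.9, -12.1])
    print(contrastive_loss(logp_truth, logp_erroneous))
```

How the objective is applied during training (e.g., whether it fine-tunes an already-trained NMT model, and how the margin is set) is a detail left to the full paper.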
Anthology ID:
P19-1623
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6191–6196
URL:
https://aclanthology.org/P19-1623
DOI:
10.18653/v1/P19-1623
Cite (ACL):
Zonghan Yang, Yong Cheng, Yang Liu, and Maosong Sun. 2019. Reducing Word Omission Errors in Neural Machine Translation: A Contrastive Learning Approach. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 6191–6196, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Reducing Word Omission Errors in Neural Machine Translation: A Contrastive Learning Approach (Yang et al., ACL 2019)
PDF:
https://preview.aclanthology.org/teach-a-man-to-fish/P19-1623.pdf