Contrastive Learning for Context-aware Neural Machine Translation Using Coreference Information

Yongkeun Hwang, Hyeongu Yun, Kyomin Jung


Abstract
Context-aware neural machine translation (NMT) incorporates contextual information from surrounding texts, which can improve the quality of document-level machine translation. Many existing works on context-aware NMT have focused on developing new model architectures for incorporating additional contexts and have shown promising results. However, most existing works rely on cross-entropy loss, resulting in limited use of contextual information. In this paper, we propose CorefCL, a novel data augmentation and contrastive learning scheme based on coreference between the source and contextual sentences. By corrupting automatically detected coreference mentions in the contextual sentence, CorefCL trains the model to be sensitive to coreference inconsistency. We evaluated our method on common context-aware NMT models and two document-level translation tasks. In the experiments, our method consistently improved the BLEU scores of the compared models on English-German and English-Korean tasks. We also show that our method significantly improves coreference resolution on the English-German contrastive test suite.
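
The idea described in the abstract can be illustrated with a short sketch. The Python/PyTorch snippet below is a minimal illustration, not the authors' implementation: the mask-based mention corruption, the margin form of the contrastive loss, and the model.log_prob interface are all assumptions made for exposition; see the paper for the actual objective and corruption scheme.

    import torch

    def corrupt_coref_mentions(context_tokens, mention_spans, mask_token="<mask>"):
        # Replace automatically detected coreference mentions in the context
        # sentence with a mask token (one plausible corruption scheme; assumed,
        # not taken from the paper).
        out = list(context_tokens)
        for start, end in mention_spans:  # spans given as (start, end) token offsets
            for i in range(start, end):
                out[i] = mask_token
        return out

    def corefcl_loss(model, src, ctx, ctx_corrupted, tgt, margin=1.0):
        # Margin-based contrastive objective (assumed form): the log-likelihood
        # of the target given the original context should exceed the
        # log-likelihood given the coreference-corrupted context by `margin`.
        # `model.log_prob` is a hypothetical interface returning
        # log p(tgt | src, ctx) per example.
        logp_pos = model.log_prob(src, ctx, tgt)
        logp_neg = model.log_prob(src, ctx_corrupted, tgt)
        return torch.clamp(margin - (logp_pos - logp_neg), min=0.0).mean()

Under this reading, the corrupted context serves as the negative example, so minimizing the hinge term pushes the model to prefer targets that are consistent with the original coreference chain.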
Anthology ID:
2021.wmt-1.121
Volume:
Proceedings of the Sixth Conference on Machine Translation
Month:
November
Year:
2021
Address:
Online
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
1135–1144
URL:
https://aclanthology.org/2021.wmt-1.121
Cite (ACL):
Yongkeun Hwang, Hyeongu Yun, and Kyomin Jung. 2021. Contrastive Learning for Context-aware Neural Machine Translation Using Coreference Information. In Proceedings of the Sixth Conference on Machine Translation, pages 1135–1144, Online. Association for Computational Linguistics.
Cite (Informal):
Contrastive Learning for Context-aware Neural Machine Translation Using Coreference Information (Hwang et al., WMT 2021)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2021.wmt-1.121.pdf
Video:
https://preview.aclanthology.org/auto-file-uploads/2021.wmt-1.121.mp4