Using Context in Neural Machine Translation Training Objectives

Danielle Saunders, Felix Stahlberg, Bill Byrne


Abstract
We present Neural Machine Translation (NMT) training using document-level metrics with batch-level documents. Previous sequence-objective approaches to NMT training focus exclusively on sentence-level metrics, like sentence BLEU, which do not correspond to the desired evaluation metric, typically document BLEU. Meanwhile, research into document-level NMT training focuses on data or model architecture rather than training procedure. We find that each of these lines of research has a clear space in it for the other, and propose merging them with a scheme that allows a document-level evaluation metric to be used in the NMT training objective. We first sample pseudo-documents from sentence samples, then approximate the expected document BLEU gradient with Monte Carlo sampling for use as a cost function in Minimum Risk Training (MRT). This two-level sampling procedure yields NMT performance gains over both sequence MRT and maximum-likelihood training. We show that training is more robust with document-level metrics than with sequence-level metrics, and demonstrate further improvements on NMT with TER and on Grammatical Error Correction (GEC) with GLEU, both metrics used at the document level for evaluation.
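The two-level sampling idea can be illustrated with a minimal sketch, not the authors' implementation: per-sentence candidate translations (with model probabilities) are combined into sampled pseudo-documents, and a document-level cost is averaged over those samples as a Monte Carlo estimate of the MRT risk. Here `doc_unigram_precision` is a toy stand-in for document BLEU, and the renormalised sample distribution and gradient computation of full MRT are omitted.

```python
import math
import random
from collections import Counter

def doc_unigram_precision(hyps, refs):
    """Toy document-level metric standing in for document BLEU:
    clipped unigram precision pooled over all sentences in the document."""
    match, total = 0, 0
    for hyp, ref in zip(hyps, refs):
        h, r = Counter(hyp.split()), Counter(ref.split())
        match += sum(min(c, r[w]) for w, c in h.items())
        total += sum(h.values())
    return match / max(total, 1)

def mrt_expected_risk(samples, refs, n_docs=8, seed=0):
    """Two-level sampling sketch: `samples` holds, per source sentence,
    a list of (hypothesis, probability) candidates. A pseudo-document is
    formed by sampling one candidate per sentence; the document-level
    cost (1 - metric) is averaged over n_docs pseudo-documents to give
    a Monte Carlo estimate of the expected document-level risk."""
    rng = random.Random(seed)
    risk = 0.0
    for _ in range(n_docs):
        doc = []
        for cands in samples:
            hyp, _ = rng.choices(cands, weights=[p for _, p in cands])[0]
            doc.append(hyp)
        risk += 1.0 - doc_unigram_precision(doc, refs)
    return risk / n_docs
```

In a real system the metric would be document BLEU (or TER/GLEU, as in the paper), and each sampled pseudo-document's cost would be weighted by its renormalised model probability so the risk can be differentiated with respect to the model parameters.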
Anthology ID:
2020.acl-main.693
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7764–7770
URL:
https://aclanthology.org/2020.acl-main.693
DOI:
10.18653/v1/2020.acl-main.693
Cite (ACL):
Danielle Saunders, Felix Stahlberg, and Bill Byrne. 2020. Using Context in Neural Machine Translation Training Objectives. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7764–7770, Online. Association for Computational Linguistics.
Cite (Informal):
Using Context in Neural Machine Translation Training Objectives (Saunders et al., ACL 2020)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2020.acl-main.693.pdf
Video:
http://slideslive.com/38928974
Data
JFLEG