Can Automatic Post-Editing Improve NMT?

Shamil Chollampatt, Raymond Hendy Susanto, Liling Tan, Ewa Szymanska


Abstract
Automatic post-editing (APE) aims to improve machine translations, thereby reducing human post-editing effort. APE has had notable success when used with statistical machine translation (SMT) systems but has not been as successful with neural machine translation (NMT) systems. This has raised questions about the relevance of the APE task in the current scenario. However, the training of APE models has been heavily reliant on large-scale artificial corpora combined with only limited human post-edited data. We hypothesize that APE models have been underperforming in improving NMT translations due to the lack of adequate supervision. To test our hypothesis, we compile a larger corpus of human post-edits of English-to-German NMT. We empirically show that a state-of-the-art neural APE model trained on this corpus can significantly improve a strong in-domain NMT system, challenging the current understanding in the field. We further investigate the effects of varying training data sizes, of using artificial training data, and of domain specificity for the APE task. We release this new corpus under the CC BY-NC-SA 4.0 license at https://github.com/shamilcm/pedra.
Anthology ID:
2020.emnlp-main.217
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2736–2746
URL:
https://aclanthology.org/2020.emnlp-main.217
DOI:
10.18653/v1/2020.emnlp-main.217
Cite (ACL):
Shamil Chollampatt, Raymond Hendy Susanto, Liling Tan, and Ewa Szymanska. 2020. Can Automatic Post-Editing Improve NMT?. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 2736–2746, Online. Association for Computational Linguistics.
Cite (Informal):
Can Automatic Post-Editing Improve NMT? (Chollampatt et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.217.pdf
Video:
https://slideslive.com/38938800
Code:
shamilcm/pedra
Data:
SubEdits
eSCAPE