A Transformer-Based Multi-Source Automatic Post-Editing System

Santanu Pal, Nico Herbig, Antonio Krüger, Josef van Genabith


Abstract
This paper presents our English–German Automatic Post-Editing (APE) system submitted to the APE Task organized at WMT 2018 (Chatterjee et al., 2018). The proposed model is an extension of the transformer architecture: two separate self-attention-based encoders encode the machine translation output (mt) and the source (src), followed by a joint encoder that attends over a combination of these two encoded sequences (enc_src and enc_mt) to generate the post-edited sentence. We compare this multi-source architecture (i.e., {src, mt} → pe) to a monolingual transformer (i.e., mt → pe) model and an ensemble combining the multi-source {src, mt} → pe and single-source mt → pe models. For both the PBSMT and NMT tasks, the ensemble yields the best results, followed by the multi-source model, with the single-source approach last. Our best model, the ensemble, achieves BLEU scores of 66.16 and 74.22 for the PBSMT and NMT tasks, respectively.
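To make the architecture described above concrete, the following is a minimal PyTorch sketch of the {src, mt} → pe model. The class name, layer sizes, and the choice to combine enc_src and enc_mt by concatenation along the time axis are illustrative assumptions, not the authors' exact configuration; positional encodings and attention masking are omitted for brevity.

    import torch
    import torch.nn as nn

    class MultiSourceAPE(nn.Module):
        """Hedged sketch of a multi-source APE transformer:
        two encoders (src, mt), a joint encoder over their combined
        outputs, and a decoder that generates the post-edited text."""

        def __init__(self, src_vocab, mt_vocab, pe_vocab,
                     d_model=512, nhead=8, layers=6):
            super().__init__()
            self.src_emb = nn.Embedding(src_vocab, d_model)
            self.mt_emb = nn.Embedding(mt_vocab, d_model)
            self.pe_emb = nn.Embedding(pe_vocab, d_model)
            # Two separate self-attention-based encoders for src and mt.
            self.src_encoder = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead), layers)
            self.mt_encoder = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead), layers)
            # Joint encoder attends over the combined encoded sequences;
            # concatenation along the sequence axis is an assumption here.
            self.joint_encoder = nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead), layers)
            self.decoder = nn.TransformerDecoder(
                nn.TransformerDecoderLayer(d_model, nhead), layers)
            self.out = nn.Linear(d_model, pe_vocab)

        def forward(self, src, mt, pe_in):
            # src, mt, pe_in: (seq_len, batch) tensors of token ids.
            enc_src = self.src_encoder(self.src_emb(src))
            enc_mt = self.mt_encoder(self.mt_emb(mt))
            joint = self.joint_encoder(torch.cat([enc_src, enc_mt], dim=0))
            dec = self.decoder(self.pe_emb(pe_in), memory=joint)
            return self.out(dec)  # (pe_len, batch, pe_vocab) logits

At inference, the ensemble reported in the paper would combine such a multi-source model with a plain mt → pe transformer, e.g. by averaging their output distributions at each decoding step; the exact combination scheme is not specified in the abstract.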
Anthology ID:
W18-6468
Volume:
Proceedings of the Third Conference on Machine Translation: Shared Task Papers
Month:
October
Year:
2018
Address:
Brussels, Belgium
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
827–835
URL:
https://aclanthology.org/W18-6468
DOI:
10.18653/v1/W18-6468
Cite (ACL):
Santanu Pal, Nico Herbig, Antonio Krüger, and Josef van Genabith. 2018. A Transformer-Based Multi-Source Automatic Post-Editing System. In Proceedings of the Third Conference on Machine Translation: Shared Task Papers, pages 827–835, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
A Transformer-Based Multi-Source Automatic Post-Editing System (Pal et al., WMT 2018)
PDF:
https://aclanthology.org/W18-6468.pdf
Data:
WMT 2016
eSCAPE