Multi-encoder Transformer Network for Automatic Post-Editing

Jaehun Shin, Jong-Hyeok Lee


Abstract
This paper describes POSTECH's submission to the WMT 2018 shared task on Automatic Post-Editing (APE). We propose a new end-to-end neural post-editing model based on the transformer network. We modify the encoder-decoder attention to reflect the relation among the machine translation output, the source sentence, and the post-edited translation in the APE problem. Experiments on the WMT17 English-German APE dataset show an improvement in both TER and BLEU over the best result of the WMT17 APE shared task. Compared to the baseline, our primary submission achieves -4.52 TER and +6.81 BLEU on the PBSMT task, and -0.13 TER and +0.40 BLEU on the NMT task.
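The abstract only sketches the architectural change. As one possible reading, the snippet below is a minimal, hypothetical sketch of a multi-encoder transformer decoder layer in which the decoder attends to two encoder memories, one over the source sentence and one over the MT output, in sequence. The class name, the sequential ordering of the two cross-attentions, and all hyperparameters are illustrative assumptions, not the paper's exact formulation.

# Illustrative sketch only (not the authors' exact model): a transformer
# decoder layer with two cross-attention sub-layers, one per encoder.
import torch
import torch.nn as nn

class MultiEncoderDecoderLayer(nn.Module):
    def __init__(self, d_model=512, nhead=8, dim_ff=2048, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout)
        self.src_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout)
        self.mt_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout)
        self.ff = nn.Sequential(
            nn.Linear(d_model, dim_ff), nn.ReLU(), nn.Linear(dim_ff, d_model)
        )
        self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(4)])
        self.drop = nn.Dropout(dropout)

    def forward(self, tgt, src_mem, mt_mem, tgt_mask=None):
        # 1) masked self-attention over the (shifted) post-edit prefix
        a, _ = self.self_attn(tgt, tgt, tgt, attn_mask=tgt_mask)
        x = self.norms[0](tgt + self.drop(a))
        # 2) cross-attention into the source-sentence encoder memory
        a, _ = self.src_attn(x, src_mem, src_mem)
        x = self.norms[1](x + self.drop(a))
        # 3) cross-attention into the MT-output encoder memory
        a, _ = self.mt_attn(x, mt_mem, mt_mem)
        x = self.norms[2](x + self.drop(a))
        # 4) position-wise feed-forward
        x = self.norms[3](x + self.drop(self.ff(x)))
        return x

# Shape check: with tgt of shape (T, B, 512) and both memories of shape
# (S, B, 512), the layer returns a tensor of shape (T, B, 512).

Other ways to combine the two memories (concatenating them into one attention, or gating the two attention outputs) are equally plausible; the abstract alone does not determine which variant the paper adopts.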
Anthology ID:
W18-6470
Volume:
Proceedings of the Third Conference on Machine Translation: Shared Task Papers
Month:
October
Year:
2018
Address:
Brussels, Belgium
Venues:
EMNLP | WMT | WS
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
840–845
URL:
https://aclanthology.org/W18-6470
DOI:
10.18653/v1/W18-6470
Cite (ACL):
Jaehun Shin and Jong-Hyeok Lee. 2018. Multi-encoder Transformer Network for Automatic Post-Editing. In Proceedings of the Third Conference on Machine Translation: Shared Task Papers, pages 840–845, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Multi-encoder Transformer Network for Automatic Post-Editing (Shin & Lee, 2018)
PDF:
https://preview.aclanthology.org/update-css-js/W18-6470.pdf