DFKI-MLT System Description for the WMT18 Automatic Post-editing Task

Daria Pylypenko, Raphael Rubino


Abstract
This paper presents the Automatic Post-editing (APE) systems submitted by the DFKI-MLT group to the WMT'18 APE shared task. Three monolingual neural sequence-to-sequence APE systems were trained using target-language data only: one based on an attentional recurrent neural network architecture and two based on the attention-only (transformer) architecture. The training data consisted of machine translation (MT) outputs, used as source to the APE model, aligned with their manually post-edited versions or reference translations as targets. We used only the provided training sets and trained APE models applicable to both phrase-based and neural MT outputs. Results show that the attention-only models outperform the recurrent one, with a significant improvement over the baseline when post-editing phrase-based MT output but a degradation when applied to neural MT output.
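The monolingual training setup described in the abstract can be illustrated with a minimal sketch. This is not the authors' actual pipeline; the function name and toy sentences are assumptions. It only shows the core idea: the APE model's "source" side is the raw MT output and its "target" side is the human post-edit (or reference translation), so the original source-language sentence is never seen by the model.

```python
# Minimal sketch of pairing data for a monolingual APE model
# (illustrative only; names and data are hypothetical, not the
# authors' actual preprocessing).

def make_ape_pairs(mt_outputs, post_edits):
    """Align MT hypotheses with their post-edited versions 1:1.

    The resulting (source, target) pairs are what a standard
    sequence-to-sequence model (RNN or transformer) is trained on.
    """
    if len(mt_outputs) != len(post_edits):
        raise ValueError("MT and post-edit files must be line-aligned")
    return list(zip(mt_outputs, post_edits))

# Toy example: both sides are in the target language only.
mt = ["he have seen the cat", "the weather are nice"]
pe = ["he has seen the cat", "the weather is nice"]
pairs = make_ape_pairs(mt, pe)
print(pairs[0])  # ('he have seen the cat', 'he has seen the cat')
```

At inference time the trained model receives only a new MT hypothesis and emits its corrected version, which is why no source-language data is required.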
Anthology ID:
W18-6469
Volume:
Proceedings of the Third Conference on Machine Translation: Shared Task Papers
Month:
October
Year:
2018
Address:
Brussels, Belgium
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
836–839
URL:
https://aclanthology.org/W18-6469
DOI:
10.18653/v1/W18-6469
Cite (ACL):
Daria Pylypenko and Raphael Rubino. 2018. DFKI-MLT System Description for the WMT18 Automatic Post-editing Task. In Proceedings of the Third Conference on Machine Translation: Shared Task Papers, pages 836–839, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
DFKI-MLT System Description for the WMT18 Automatic Post-editing Task (Pylypenko & Rubino, WMT 2018)
PDF:
https://preview.aclanthology.org/ingestion-script-update/W18-6469.pdf
Data
eSCAPE