Quantitative Analysis of Post-Editing Effort Indicators for NMT

Sergi Alvarez, Antoni Oliver, Toni Badia


Abstract
The recent improvements in machine translation (MT) have boosted the use of post-editing (PE) in the translation industry. A new MT paradigm, neural machine translation (NMT), is displacing its corpus-based predecessor, statistical machine translation (SMT), in current translation workflows because it usually increases the fluency and accuracy of the MT output. However, standard automatic metrics do not always reflect the quality of the MT output, and there is still no clear correlation between PE effort and productivity. We present a quantitative analysis of different PE effort indicators for two NMT systems (transformer and seq2seq) on English–Spanish in-domain medical documents. We compare both systems and study the correlation between PE time and other effort scores. Results show less PE effort for the transformer NMT model and a high correlation between PE time and number of keystrokes.
Anthology ID: 2020.eamt-1.44
Volume: Proceedings of the 22nd Annual Conference of the European Association for Machine Translation
Month: November
Year: 2020
Address: Lisboa, Portugal
Editors: André Martins, Helena Moniz, Sara Fumega, Bruno Martins, Fernando Batista, Luisa Coheur, Carla Parra, Isabel Trancoso, Marco Turchi, Arianna Bisazza, Joss Moorkens, Ana Guerberof, Mary Nurminen, Lena Marg, Mikel L. Forcada
Venue: EAMT
Publisher: European Association for Machine Translation
Pages: 411–420
URL: https://aclanthology.org/2020.eamt-1.44
Cite (ACL):
Sergi Alvarez, Antoni Oliver, and Toni Badia. 2020. Quantitative Analysis of Post-Editing Effort Indicators for NMT. In Proceedings of the 22nd Annual Conference of the European Association for Machine Translation, pages 411–420, Lisboa, Portugal. European Association for Machine Translation.
Cite (Informal):
Quantitative Analysis of Post-Editing Effort Indicators for NMT (Alvarez et al., EAMT 2020)
PDF: https://preview.aclanthology.org/add_acl24_videos/2020.eamt-1.44.pdf