Online Versus Offline NMT Quality: An In-depth Analysis on English-German and German-English

Maha Elbayad, Michael Ustaszewski, Emmanuelle Esperança-Rodier, Francis Brunet-Manquat, Jakob Verbeek, Laurent Besacier


Abstract
In this work, we conduct an evaluation study comparing offline and online neural machine translation architectures. Two sequence-to-sequence models are considered: the convolutional Pervasive Attention model (Elbayad et al., 2018) and the attention-based Transformer (Vaswani et al., 2017). For both architectures, we investigate the impact of online decoding constraints on translation quality through a carefully designed human evaluation on the English-German and German-English language pairs, the latter being particularly sensitive to latency constraints. The evaluation results allow us to identify the strengths and shortcomings of each model when shifting to the online setup.
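The abstract refers to online decoding constraints, i.e. the system must begin emitting target tokens before the full source sentence is available. As a rough, hypothetical illustration (not taken from the paper or from the linked elbayadm/OnlineMT-Evaluation code), the Python sketch below simulates a wait-k style read/write schedule, one common way such constraints are imposed; the function names, the placeholder WRITE step, and the toy stopping rule are assumptions for illustration only.

from typing import List


def allowed_source_length(t: int, k: int, src_len: int) -> int:
    # Under a wait-k policy, the decoder first reads k source tokens,
    # then alternates one READ per WRITE; it can never see more than src_len tokens.
    return min(k + t, src_len)


def simulate_wait_k(source: List[str], k: int, max_target_len: int = 20) -> None:
    # Print the READ/WRITE schedule of a toy online decoding run.
    read = 0
    for t in range(max_target_len):
        budget = allowed_source_length(t, k, len(source))
        while read < budget:
            print(f"READ  src[{read}] = {source[read]!r}")
            read += 1
        # A real system would query the NMT model here with source[:read]
        # and the current target prefix; we only print a placeholder step.
        print(f"WRITE tgt[{t}]  (sees {read}/{len(source)} source tokens)")
        if read == len(source) and t + 1 >= len(source):
            break  # toy stopping rule: roughly match the source length once it is fully read


if __name__ == "__main__":
    # German source, k=3: translation starts after three tokens have been read.
    simulate_wait_k("Das ist ein kurzer Beispielsatz .".split(), k=3)

In an online setup such as this, target tokens emitted early are conditioned on a source prefix only, which is what makes verb-final German-English translation particularly sensitive to the latency constraint.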
Anthology ID:
2020.coling-main.443
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5047–5058
URL:
https://aclanthology.org/2020.coling-main.443
DOI:
10.18653/v1/2020.coling-main.443
Cite (ACL):
Maha Elbayad, Michael Ustaszewski, Emmanuelle Esperança-Rodier, Francis Brunet-Manquat, Jakob Verbeek, and Laurent Besacier. 2020. Online Versus Offline NMT Quality: An In-depth Analysis on English-German and German-English. In Proceedings of the 28th International Conference on Computational Linguistics, pages 5047–5058, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Online Versus Offline NMT Quality: An In-depth Analysis on English-German and German-English (Elbayad et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.443.pdf
Code:
elbayadm/OnlineMT-Evaluation