Kweonwoo Jung
2021
Efficient Inference for Multilingual Neural Machine Translation
Alexandre Berard | Dain Lee | Stephane Clinchant | Kweonwoo Jung | Vassilina Nikoulina
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Multilingual NMT has become an attractive solution for MT deployment in production, but matching bilingual quality comes at the cost of larger and slower models. In this work, we consider several ways to make multilingual NMT faster at inference without degrading its quality. We experiment with several “light decoder” architectures in two 20-language multi-parallel settings: small-scale on TED Talks and large-scale on ParaCrawl. Our experiments demonstrate that combining a shallow decoder with vocabulary filtering leads to almost 2 times faster inference with no loss in translation quality. We validate our findings with BLEU and chrF (on 380 language pairs), robustness evaluation, and human evaluation.
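To make the vocabulary-filtering idea concrete, here is a minimal Python sketch, with hypothetical names, of restricting the decoder's output softmax to the subword ids observed for a given target language; it is an illustration of the general technique, not the paper's implementation.

    # Minimal sketch of target-language vocabulary filtering at inference time.
    # Assumption: per target language, we precompute the set of subword ids that
    # occur in that language's training data and mask out all other ids in the
    # decoder's output distribution. All names here are illustrative.
    import numpy as np

    def build_language_vocab(tokenized_corpus):
        """Collect the subword ids observed for one target language."""
        vocab = set()
        for sentence in tokenized_corpus:
            vocab.update(sentence)
        return sorted(vocab)

    def filtered_softmax(logits, allowed_ids):
        """Softmax over the full vocabulary, with disallowed ids masked out."""
        mask = np.full(logits.shape, -np.inf)
        mask[allowed_ids] = 0.0
        masked = logits + mask
        exp = np.exp(masked - masked.max())
        return exp / exp.sum()

    # Toy usage: a 10-token shared vocabulary, of which only 4 ids are used
    # by the (hypothetical) target language "fr".
    fr_corpus = [[1, 3, 3, 7], [2, 7, 1]]
    fr_vocab = build_language_vocab(fr_corpus)      # [1, 2, 3, 7]
    logits = np.random.randn(10)                    # decoder output for one step
    probs = filtered_softmax(logits, fr_vocab)
    assert probs[[0, 4, 5, 6, 8, 9]].sum() == 0.0   # filtered-out ids get zero mass

In practice the inference speedup would come from slicing the output projection matrix down to the allowed rows before the final matmul; the mask above only illustrates the restriction itself.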
Findings of the WMT Shared Task on Machine Translation Using Terminologies
Md Mahfuz Ibn Alam | Ivana Kvapilíková | Antonios Anastasopoulos | Laurent Besacier | Georgiana Dinu | Marcello Federico | Matthias Gallé | Kweonwoo Jung | Philipp Koehn | Vassilina Nikoulina
Proceedings of the Sixth Conference on Machine Translation
Language domains that require very careful use of terminology are abundant and reflect a significant part of the translation industry. In this work we introduce a benchmark for evaluating the quality and consistency of terminology translation, focusing on the medical (and specifically COVID-19) domain for five language pairs: English to French, Chinese, Russian, and Korean, as well as Czech to German. We report the descriptions and results of the participating systems, and comment on the need for further research towards both more adequate handling of terminologies and a proper formulation and evaluation of the task.
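As a rough illustration of what terminology-consistency evaluation can look like, the following Python sketch computes a simplified exact-match term recall over a glossary; the helper names and glossary are hypothetical, and the shared task's official metrics are more elaborate than this.

    # Simplified terminology "exact-match" recall: the fraction of glossary
    # entries triggered by the source sentence whose target-side term appears
    # verbatim in the system output. Illustrative approximation only.
    def term_recall(source, hypothesis, terminology):
        """terminology: list of (source_term, target_term) pairs."""
        triggered = [(s, t) for s, t in terminology if s.lower() in source.lower()]
        if not triggered:
            return None  # no terms to evaluate in this sentence
        hits = sum(1 for _, t in triggered if t.lower() in hypothesis.lower())
        return hits / len(triggered)

    # Toy usage with a hypothetical English-French COVID-19 glossary.
    glossary = [("herd immunity", "immunité collective"), ("vaccine", "vaccin")]
    src = "Herd immunity requires a safe vaccine."
    hyp = "L'immunité collective exige un vaccin sûr."
    print(term_recall(src, hyp, glossary))  # 1.0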