Online Learning Meets Machine Translation Evaluation: Finding the Best Systems with the Least Human Effort

Vânia Mendonça, Ricardo Rei, Luisa Coheur, Alberto Sardinha, Ana Lúcia Santos


Abstract
In Machine Translation, assessing the quality of a large number of automatic translations can be challenging. Automatic metrics are not reliable when it comes to high-performing systems. In addition, resorting to human evaluators can be expensive, especially when evaluating multiple systems. To overcome the latter challenge, we propose a novel application of online learning that, given an ensemble of Machine Translation systems, dynamically converges to the best systems by taking advantage of the human feedback available. Our experiments on WMT’19 datasets show that our online approach quickly converges to the top-3 ranked systems for the language pairs considered, despite the lack of human feedback for many translations.
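The setting described in the abstract is essentially a sequential, bandit-style selection problem: at each step a translation from one system in the ensemble is shown to an evaluator, any feedback received is used to update the learner, and probability mass gradually concentrates on the strongest systems. Below is a minimal Python sketch of one such online learner using an EXP3-style importance-weighted update; the class name, hyperparameters, feedback format, and toy simulation are illustrative assumptions and not the paper's exact algorithms (see the authors' code at vania-mendonca/MTOL for the actual implementation).

# Minimal illustrative sketch (assumption): an EXP3-style online learner that
# selects among MT systems and learns from sparse human feedback.
# This is NOT the paper's exact method; see vania-mendonca/MTOL for that.
import math
import random


class Exp3SystemSelector:
    """Keeps one weight per MT system and samples systems in proportion
    to those weights, mixed with uniform exploration."""

    def __init__(self, n_systems, gamma=0.1):
        self.n = n_systems
        self.gamma = gamma                  # exploration rate
        self.log_weights = [0.0] * n_systems

    def probabilities(self):
        # Softmax over log-weights, mixed with uniform exploration
        # so that no system's probability vanishes completely.
        m = max(self.log_weights)
        exps = [math.exp(w - m) for w in self.log_weights]
        z = sum(exps)
        return [(1 - self.gamma) * e / z + self.gamma / self.n for e in exps]

    def choose(self):
        # Pick which system's translation to send for (possible) human judgment.
        return random.choices(range(self.n), weights=self.probabilities(), k=1)[0]

    def update(self, system, reward):
        # reward in [0, 1] from a human judgment; None when no feedback was
        # collected for this translation (the common case in practice).
        if reward is None:
            return
        p = self.probabilities()[system]
        estimate = reward / p               # importance-weighted reward estimate
        self.log_weights[system] += self.gamma * estimate / self.n


if __name__ == "__main__":
    # Toy simulation with 5 systems of different (hidden) quality;
    # only ~20% of translations receive a human judgment.
    random.seed(0)
    true_quality = [0.3, 0.5, 0.7, 0.8, 0.6]
    selector = Exp3SystemSelector(n_systems=len(true_quality))
    for _ in range(5000):
        chosen = selector.choose()
        reward = None
        if random.random() < 0.2:
            reward = 1.0 if random.random() < true_quality[chosen] else 0.0
        selector.update(chosen, reward)
    probs = selector.probabilities()
    ranking = sorted(range(len(true_quality)), key=probs.__getitem__, reverse=True)
    print("Learned ranking of systems (best first):", ranking)

In the paper's setting, the rewards would come from human quality judgments gathered during evaluation rather than from a simulation, and the learner's converged distribution identifies the top-ranked systems while sparing annotators from judging every system on every sentence.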
Anthology ID:
2021.acl-long.242
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
3105–3117
URL:
https://aclanthology.org/2021.acl-long.242
DOI:
10.18653/v1/2021.acl-long.242
Cite (ACL):
Vânia Mendonça, Ricardo Rei, Luisa Coheur, Alberto Sardinha, and Ana Lúcia Santos. 2021. Online Learning Meets Machine Translation Evaluation: Finding the Best Systems with the Least Human Effort. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 3105–3117, Online. Association for Computational Linguistics.
Cite (Informal):
Online Learning Meets Machine Translation Evaluation: Finding the Best Systems with the Least Human Effort (Mendonça et al., ACL-IJCNLP 2021)
PDF:
https://preview.aclanthology.org/ingest-acl-2023-videos/2021.acl-long.242.pdf
Optional supplementary material:
2021.acl-long.242.OptionalSupplementaryMaterial.zip
Video:
https://preview.aclanthology.org/ingest-acl-2023-videos/2021.acl-long.242.mp4
Code:
vania-mendonca/MTOL