Automating Behavioral Testing in Machine Translation

Javier Ferrando, Matthias Sperber, Hendra Setiawan, Dominic Telaar, Saša Hasan


Abstract
Behavioral testing in NLP allows fine-grained evaluation of systems by examining their linguistic capabilities through the analysis of input-output behavior. Unfortunately, existing work on behavioral testing in Machine Translation (MT) is currently restricted to largely handcrafted tests covering a limited range of capabilities and languages. To address this limitation, we propose to use Large Language Models (LLMs) to generate a diverse set of source sentences tailored to test the behavior of MT models in a range of situations. We can then verify whether the MT model exhibits the expected behavior through matching candidate sets that are also generated using LLMs. Our approach aims to make behavioral testing of MT systems practical while requiring only minimal human effort. In our experiments, we apply our proposed evaluation framework to assess multiple available MT systems, revealing that, while pass-rates generally follow the trends observable from traditional accuracy-based metrics, our method uncovered several important differences and potential bugs that go unnoticed when relying on accuracy alone.
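The core idea of verifying expected behavior "through matching candidate sets" can be sketched as a simple pass/fail check: a test passes if the MT output matches any acceptable candidate, and systems are compared by their aggregate pass-rate. The sketch below is a hypothetical illustration under that assumption; the function names, the substring-matching criterion, and the example data are not taken from the paper.

```python
# Hypothetical sketch of pass-rate evaluation via candidate-set matching.
# The matching criterion (case-insensitive substring) and all names here
# are illustrative assumptions, not the paper's actual implementation.

def passes(translation: str, candidates: list[str]) -> bool:
    """A test passes if any acceptable candidate appears in the MT output."""
    return any(c.lower() in translation.lower() for c in candidates)

def pass_rate(outputs: list[str], candidate_sets: list[list[str]]) -> float:
    """Fraction of test sentences whose output matches a candidate."""
    results = [passes(out, cands) for out, cands in zip(outputs, candidate_sets)]
    return sum(results) / len(results)

# Toy example: testing whether feminine gender is preserved in translation.
outputs = ["Ella es doctora.", "El es doctor."]
candidate_sets = [["doctora"], ["doctora"]]
print(pass_rate(outputs, candidate_sets))  # 0.5
```

In practice, both the source sentences and the candidate sets would be generated by an LLM rather than written by hand, which is what makes the approach scale beyond handcrafted test suites.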
Anthology ID:
2023.wmt-1.97
Volume:
Proceedings of the Eighth Conference on Machine Translation
Month:
December
Year:
2023
Address:
Singapore
Editors:
Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
1014–1030
URL:
https://aclanthology.org/2023.wmt-1.97
DOI:
10.18653/v1/2023.wmt-1.97
Cite (ACL):
Javier Ferrando, Matthias Sperber, Hendra Setiawan, Dominic Telaar, and Saša Hasan. 2023. Automating Behavioral Testing in Machine Translation. In Proceedings of the Eighth Conference on Machine Translation, pages 1014–1030, Singapore. Association for Computational Linguistics.
Cite (Informal):
Automating Behavioral Testing in Machine Translation (Ferrando et al., WMT 2023)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2023.wmt-1.97.pdf