Multifaceted Challenge Set for Evaluating Machine Translation Performance

Xiaoyu Chen, Daimeng Wei, Zhanglin Wu, Ting Zhu, Hengchao Shang, Zongyao Li, Jiaxin Guo, Ning Xie, Lizhi Lei, Hao Yang, Yanfei Jiang


Abstract
Machine translation evaluation is critical to machine translation research, as evaluation results reflect the effectiveness of training strategies. A fair and efficient evaluation method is therefore necessary. Many researchers have raised questions about currently available evaluation metrics from various perspectives and proposed suggestions accordingly. However, to our knowledge, few researchers have analyzed the difficulty level of source sentences and its influence on evaluation results. This paper presents HW-TSC’s submission to the WMT23 MT Test Suites shared task. We propose a systematic approach for constructing challenge sets from four aspects: word difficulty, length difficulty, grammar difficulty, and model learning difficulty. We open-source two Multifaceted Challenge Sets for Zh→En and En→Zh. We also present the results of participants in this year’s General MT shared task on our test sets.
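
The abstract names four difficulty facets but not the selection procedure itself. As a rough illustration only, and not the authors' actual pipeline, the Python sketch below filters source sentences into a challenge set using two of those facets: word difficulty (approximated here by corpus word frequency) and length difficulty. The function name, thresholds, and frequency heuristic are all hypothetical assumptions for illustration.

    # Illustrative sketch, NOT the paper's method: bucket source sentences
    # into a challenge set by word rarity and sentence length.
    from collections import Counter

    def build_challenge_set(sentences, rare_ratio_min=0.3, min_len=40):
        """Keep sentences that look 'hard': long, or rich in rare words."""
        # Whitespace tokenization is a simplification; Zh sources would
        # need a real segmenter.
        freq = Counter(tok for s in sentences for tok in s.split())
        total = sum(freq.values())

        def rare_ratio(sentence):
            toks = sentence.split()
            # Call a token 'rare' if it accounts for < 1e-5 of all tokens
            # (an assumed threshold).
            rare = sum(1 for t in toks if freq[t] / total < 1e-5)
            return rare / max(len(toks), 1)

        return [
            s for s in sentences
            if len(s.split()) >= min_len or rare_ratio(s) >= rare_ratio_min
        ]

    # Usage:
    # challenge = build_challenge_set(open("zh_en.src").read().splitlines())

Grammar difficulty and model learning difficulty would need additional signals (e.g., parse-based features or per-sentence model scores), which this sketch does not attempt.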
Anthology ID:
2023.wmt-1.22
Volume:
Proceedings of the Eighth Conference on Machine Translation
Month:
December
Year:
2023
Address:
Singapore
Editors:
Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
217–223
URL:
https://aclanthology.org/2023.wmt-1.22
DOI:
10.18653/v1/2023.wmt-1.22
Cite (ACL):
Xiaoyu Chen, Daimeng Wei, Zhanglin Wu, Ting Zhu, Hengchao Shang, Zongyao Li, Jiaxin Guo, Ning Xie, Lizhi Lei, Hao Yang, and Yanfei Jiang. 2023. Multifaceted Challenge Set for Evaluating Machine Translation Performance. In Proceedings of the Eighth Conference on Machine Translation, pages 217–223, Singapore. Association for Computational Linguistics.
Cite (Informal):
Multifaceted Challenge Set for Evaluating Machine Translation Performance (Chen et al., WMT 2023)
PDF:
https://preview.aclanthology.org/improve-issue-templates/2023.wmt-1.22.pdf