SurreyAI 2023 Submission for the Quality Estimation Shared Task

Archchana Sindhujan, Diptesh Kanojia, Constantin Orasan, Tharindu Ranasinghe


Abstract
Quality Estimation (QE) systems are important in situations where it is necessary to assess the quality of translations, but no reference is available. This paper describes the approach adopted by the SurreyAI team for addressing the Sentence-Level Direct Assessment shared task in WMT23. The proposed approach builds upon the TransQuest framework, exploring various autoencoder pre-trained language models within the MonoTransQuest architecture in single and ensemble settings. The autoencoder pre-trained language models employed in the proposed systems are XLMV, InfoXLM-large, and XLMR-large. The evaluation uses Spearman and Pearson correlation coefficients to assess the relationship between machine-predicted quality scores and human judgments for five language pairs (English-Gujarati, English-Hindi, English-Marathi, English-Tamil, and English-Telugu). The MonoTQ-InfoXLM-large approach emerges as a robust strategy, surpassing all other individual models proposed in this study and improving significantly over the baseline for the majority of the language pairs.
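
The sketch below illustrates, under stated assumptions, how a sentence-level DA system of this kind can be run with the TransQuest framework's MonoTransQuest architecture and scored with Spearman and Pearson correlation. The backbone name "microsoft/infoxlm-large", the example sentence pairs, and the placeholder gold scores are illustrative assumptions, not the authors' exact configuration; only the MonoTransQuestModel usage follows the public TransQuest documentation.

```python
# Minimal sketch: sentence-level DA prediction with TransQuest's MonoTransQuest,
# evaluated with Spearman/Pearson correlation against human judgments.
# Assumptions: "microsoft/infoxlm-large" as backbone, toy data, placeholder gold scores.
import torch
from scipy.stats import pearsonr, spearmanr
from transquest.algo.sentence_level.monotransquest.run_model import MonoTransQuestModel

# Load MonoTransQuest with an XLM-R-compatible encoder (here assumed: InfoXLM-large).
# XLMV or XLMR-large would be swapped in the same way. In practice the model is
# first fine-tuned on the shared-task DA training data (e.g. via model.train_model).
model = MonoTransQuestModel(
    "xlmroberta",               # model type understood by TransQuest
    "microsoft/infoxlm-large",  # assumed backbone checkpoint
    num_labels=1,               # single regression output: the DA quality score
    use_cuda=torch.cuda.is_available(),
)

# Each item is a [source, translation] pair (illustrative English-Hindi examples).
to_predict = [
    ["The weather is pleasant today.", "आज मौसम सुहावना है।"],
    ["She is reading a book.", "वह किताब पढ़ रही है।"],
]
predictions, raw_outputs = model.predict(to_predict)

# Evaluation: correlation between predicted scores and human DA judgments.
human_scores = [0.82, 0.74]  # placeholder gold scores for illustration only
print("Spearman:", spearmanr(predictions, human_scores).correlation)
print("Pearson: ", pearsonr(predictions, human_scores)[0])
```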
Anthology ID:
2023.wmt-1.74
Volume:
Proceedings of the Eighth Conference on Machine Translation
Month:
December
Year:
2023
Address:
Singapore
Editors:
Philipp Koehn, Barry Haddow, Tom Kocmi, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
849–855
URL:
https://aclanthology.org/2023.wmt-1.74
DOI:
10.18653/v1/2023.wmt-1.74
Cite (ACL):
Archchana Sindhujan, Diptesh Kanojia, Constantin Orasan, and Tharindu Ranasinghe. 2023. SurreyAI 2023 Submission for the Quality Estimation Shared Task. In Proceedings of the Eighth Conference on Machine Translation, pages 849–855, Singapore. Association for Computational Linguistics.
Cite (Informal):
SurreyAI 2023 Submission for the Quality Estimation Shared Task (Sindhujan et al., WMT 2023)
PDF:
https://aclanthology.org/2023.wmt-1.74.pdf