HW-TSC’s Participation at WMT 2021 Quality Estimation Shared Task

Yimeng Chen, Chang Su, Yingtao Zhang, Yuxia Wang, Xiang Geng, Hao Yang, Shimin Tao, Guo Jiaxin, Wang Minghan, Min Zhang, Yujia Liu, Shujian Huang


Abstract
This paper presents our work in the WMT 2021 Quality Estimation (QE) Shared Task. We participated in all three sub-tasks, namely the Sentence-Level Direct Assessment (DA) task, the Word- and Sentence-Level Post-editing Effort task, and the Critical Error Detection task, across all language pairs. Our systems employ the Predictor-Estimator framework, with a pre-trained XLM-RoBERTa model as the predictor and a task-specific classifier or regressor as the estimator. For all tasks, we improve our systems by incorporating the post-edited sentence or an additional high-quality translation, either through multi-task learning or by encoding it directly with the predictor. Moreover, in the zero-shot setting, our data augmentation strategy based on Monte-Carlo Dropout brings significant improvement on the DA sub-task. Notably, our submissions achieve remarkable results across all tasks.
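The Predictor-Estimator setup and the Monte-Carlo Dropout sampling described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: a small Transformer encoder stands in for the pre-trained XLM-RoBERTa predictor so the example runs without downloading weights, and all dimensions and names are assumptions for illustration.

```python
import torch
import torch.nn as nn

class PredictorEstimator(nn.Module):
    """Toy sketch of a Predictor-Estimator QE model.

    The real system uses pre-trained XLM-RoBERTa as the predictor;
    a tiny Transformer encoder stands in here (an assumption for
    illustration), with a linear regressor as the sentence-level
    estimator (e.g. predicting a DA score).
    """
    def __init__(self, vocab_size=1000, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.predictor = nn.TransformerEncoder(layer, num_layers)
        self.dropout = nn.Dropout(0.1)           # kept active for MC-Dropout
        self.estimator = nn.Linear(d_model, 1)   # sentence-level regressor

    def forward(self, token_ids):
        h = self.predictor(self.embed(token_ids))
        pooled = h[:, 0]                         # first token as sentence vector
        return self.estimator(self.dropout(pooled)).squeeze(-1)

def mc_dropout_scores(model, token_ids, n_samples=8):
    """Monte-Carlo Dropout: keep dropout enabled at inference time and
    draw several stochastic predictions per input, which can then be
    used for data augmentation or uncertainty estimation."""
    model.train()  # train mode keeps dropout active
    with torch.no_grad():
        return torch.stack([model(token_ids) for _ in range(n_samples)])

model = PredictorEstimator()
ids = torch.randint(0, 1000, (2, 10))  # a batch of 2 toy "sentences"
samples = mc_dropout_scores(model, ids)
print(samples.shape)  # (n_samples, batch_size)
```

Averaging or spreading the sampled scores gives multiple noisy views of each sentence, which is one plausible way such MC-Dropout-based augmentation could be realized.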
Anthology ID:
2021.wmt-1.92
Volume:
Proceedings of the Sixth Conference on Machine Translation
Month:
November
Year:
2021
Address:
Online
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
890–896
URL:
https://aclanthology.org/2021.wmt-1.92
Cite (ACL):
Yimeng Chen, Chang Su, Yingtao Zhang, Yuxia Wang, Xiang Geng, Hao Yang, Shimin Tao, Guo Jiaxin, Wang Minghan, Min Zhang, Yujia Liu, and Shujian Huang. 2021. HW-TSC’s Participation at WMT 2021 Quality Estimation Shared Task. In Proceedings of the Sixth Conference on Machine Translation, pages 890–896, Online. Association for Computational Linguistics.
Cite (Informal):
HW-TSC’s Participation at WMT 2021 Quality Estimation Shared Task (Chen et al., WMT 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.wmt-1.92.pdf