Hanming Wu
2022
BJTU-Toshiba’s Submission to WMT22 Quality Estimation Shared Task
Hui Huang | Hui Di | Chunyou Li | Hanming Wu | Kazushige Ouchi | Yufeng Chen | Jian Liu | Jinan Xu
Proceedings of the Seventh Conference on Machine Translation (WMT)
This paper presents the BJTU-Toshiba joint submission to the WMT 2022 quality estimation shared task. We participate only in Task 1 (quality prediction), focusing on sentence-level MQM prediction. The techniques we experimented with include the integration of monolingual language models and the pre-finetuning of pre-trained representations. We tried two styles of pre-finetuning, namely Translation Language Modeling and Replaced Token Detection. We demonstrate the competitiveness of our system compared to the widely adopted XLM-RoBERTa baseline. Our system is also the top-ranking system on sentence-level MQM prediction for the English-German language pair.
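The XLM-RoBERTa baseline mentioned in the abstract is typically a cross-lingual encoder fine-tuned to regress a sentence-level MQM score from a source/translation pair. The sketch below illustrates that general setup only; it is not the authors' code, and the checkpoint name (`xlm-roberta-base`), the `<s>`-token pooling, and the example sentences and score are assumptions made for illustration.

```python
# Minimal sketch of a sentence-level QE regressor on top of XLM-RoBERTa.
# Checkpoint, pooling choice, and example data are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import XLMRobertaModel, XLMRobertaTokenizer


class SentenceQERegressor(nn.Module):
    def __init__(self, encoder_name: str = "xlm-roberta-base"):
        super().__init__()
        self.encoder = XLMRobertaModel.from_pretrained(encoder_name)
        # Regression head mapping the pooled <s> token to a single MQM score.
        self.head = nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]  # representation of the <s> token
        return self.head(pooled).squeeze(-1)


tokenizer = XLMRobertaTokenizer.from_pretrained("xlm-roberta-base")
model = SentenceQERegressor()

# Source sentence and MT output are encoded jointly as one sentence pair.
batch = tokenizer(
    ["The cat sat on the mat."],        # source (English)
    ["Die Katze saß auf der Matte."],   # MT output (German)
    padding=True, truncation=True, return_tensors="pt",
)
scores = model(batch["input_ids"], batch["attention_mask"])
loss = nn.MSELoss()(scores, torch.tensor([0.1]))  # gold MQM score (made up)
loss.backward()
```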