HW-TSC’s Participation at WMT 2020 Quality Estimation Shared Task

Minghan Wang, Hao Yang, Hengchao Shang, Daimeng Wei, Jiaxin Guo, Lizhi Lei, Ying Qin, Shimin Tao, Shiliang Sun, Yimeng Chen, Liangyou Li


Abstract
This paper presents our work in the WMT 2020 Word and Sentence-Level Post-Editing Quality Estimation (QE) Shared Task. Our system follows the standard Predictor-Estimator architecture, with a pre-trained Transformer as the Predictor and task-specific classifiers and regressors as Estimators. We integrate Bottleneck Adapter Layers into the Predictor to improve transfer-learning efficiency and prevent over-fitting. At the same time, we jointly train the word- and sentence-level tasks in a unified model with multitask learning. We propose Pseudo-PE assisted QE (PEAQE), which yields significant performance improvements. Our submissions achieve competitive results in the word- and sentence-level sub-tasks for both the En-De and En-Zh language pairs.
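The abstract's key components can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a hypothetical PyTorch rendering of two ideas the abstract names: a Bottleneck Adapter Layer (down-project, nonlinearity, up-project, residual connection) inserted into a frozen pre-trained Predictor, and a multitask Estimator with a word-level OK/BAD classifier and a sentence-level HTER regressor sharing the same Predictor features. All class names, sizes, and the mean-pooling choice are assumptions for illustration.

```python
import torch
import torch.nn as nn


class BottleneckAdapter(nn.Module):
    """Hypothetical bottleneck adapter: down-project, nonlinearity,
    up-project, plus a residual connection. In adapter-based transfer
    learning, the pre-trained Predictor stays frozen and only these
    small layers (and the task heads) are fine-tuned."""

    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck_size)
        self.up = nn.Linear(bottleneck_size, hidden_size)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual keeps the frozen Predictor's features intact
        # when the adapter is near-zero initialized.
        return x + self.up(self.act(self.down(x)))


class MultitaskEstimator(nn.Module):
    """Hypothetical joint Estimator: a per-token OK/BAD classifier
    (word-level QE) and a pooled regressor (sentence-level HTER)
    trained together on shared Predictor features."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.word_classifier = nn.Linear(hidden_size, 2)  # OK / BAD
        self.sent_regressor = nn.Linear(hidden_size, 1)   # HTER score

    def forward(self, features: torch.Tensor):
        # features: (batch, seq_len, hidden_size) from the Predictor
        word_logits = self.word_classifier(features)          # (B, T, 2)
        pooled = features.mean(dim=1)                         # (B, H)
        sent_score = self.sent_regressor(pooled).squeeze(-1)  # (B,)
        return word_logits, sent_score


# Toy usage with random features standing in for Predictor output.
adapter = BottleneckAdapter(hidden_size=16, bottleneck_size=4)
estimator = MultitaskEstimator(hidden_size=16)
features = torch.randn(2, 5, 16)  # batch of 2, sequence length 5
word_logits, sent_score = estimator(adapter(features))
```

In this sketch, both task heads are optimized jointly (e.g. cross-entropy for the word labels plus MSE for HTER), which is one common way to realize the unified multitask model the abstract describes.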
Anthology ID:
2020.wmt-1.123
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
1056–1061
URL:
https://aclanthology.org/2020.wmt-1.123
Cite (ACL):
Minghan Wang, Hao Yang, Hengchao Shang, Daimeng Wei, Jiaxin Guo, Lizhi Lei, Ying Qin, Shimin Tao, Shiliang Sun, Yimeng Chen, and Liangyou Li. 2020. HW-TSC’s Participation at WMT 2020 Quality Estimation Shared Task. In Proceedings of the Fifth Conference on Machine Translation, pages 1056–1061, Online. Association for Computational Linguistics.
Cite (Informal):
HW-TSC’s Participation at WMT 2020 Quality Estimation Shared Task (Wang et al., WMT 2020)
PDF:
https://preview.aclanthology.org/author-url/2020.wmt-1.123.pdf
Video:
https://slideslive.com/38939571