HW-TSC’s Participation at WMT 2020 Automatic Post Editing Shared Task

Hao Yang, Minghan Wang, Daimeng Wei, Hengchao Shang, Jiaxin Guo, Zongyao Li, Lizhi Lei, Ying Qin, Shimin Tao, Shiliang Sun, Yimeng Chen


Abstract
This paper presents HW-TSC's submission to the WMT 2020 Automatic Post Editing Shared Task. We participate in the English-German and English-Chinese language pairs. Our system is built on a Transformer pre-trained on the WMT 2019 and WMT 2020 News Translation corpora and fine-tuned on the APE corpus. Bottleneck Adapter Layers are integrated into the model to prevent over-fitting. We further collect external translations as augmented MT candidates to improve performance. Experiments demonstrate that pre-trained NMT models are effective when fine-tuned on an APE corpus of limited size, and that performance can be further improved with external MT augmentation. Our system achieves competitive results in both directions in the final evaluation.
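The abstract's key architectural idea is inserting Bottleneck Adapter Layers into a pre-trained Transformer so that fine-tuning on a small APE corpus does not over-fit. As a rough illustration only, below is a minimal PyTorch sketch of a bottleneck adapter in the style of Houlsby et al. (2019): a residual down-project / nonlinearity / up-project module. The class name, dimensions, and activation are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Bottleneck adapter in the style of Houlsby et al. (2019).

    A small residual module inserted into each pre-trained Transformer
    layer. During APE fine-tuning, only the adapter parameters need to
    be updated, which limits over-fitting on a small corpus.
    Sizes below are illustrative, not the paper's configuration.
    """

    def __init__(self, d_model: int = 512, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)  # project to a small bottleneck
        self.up = nn.Linear(bottleneck, d_model)    # project back to model width
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual connection: the adapter starts near-identity, so the
        # pre-trained representation is preserved at the start of training.
        return x + self.up(self.act(self.down(x)))
```

In such a setup the pre-trained Transformer weights are typically frozen and only the adapter parameters are trained, so the number of updated parameters stays small relative to the full model.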
Anthology ID:
2020.wmt-1.85
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
797–802
URL:
https://aclanthology.org/2020.wmt-1.85
Cite (ACL):
Hao Yang, Minghan Wang, Daimeng Wei, Hengchao Shang, Jiaxin Guo, Zongyao Li, Lizhi Lei, Ying Qin, Shimin Tao, Shiliang Sun, and Yimeng Chen. 2020. HW-TSC’s Participation at WMT 2020 Automatic Post Editing Shared Task. In Proceedings of the Fifth Conference on Machine Translation, pages 797–802, Online. Association for Computational Linguistics.
Cite (Informal):
HW-TSC’s Participation at WMT 2020 Automatic Post Editing Shared Task (Yang et al., WMT 2020)
PDF:
https://aclanthology.org/2020.wmt-1.85.pdf
Video:
https://slideslive.com/38939570
Data
eSCAPE