WeChat Neural Machine Translation Systems for WMT20
Fandong Meng, Jianhao Yan, Yijin Liu, Yuan Gao, Xianfeng Zeng, Qinsong Zeng, Peng Li, Ming Chen, Jie Zhou, Sifan Liu, Hao Zhou
Abstract
We participate in the WMT 2020 shared news translation task on Chinese→English. Our system is based on the Transformer (Vaswani et al., 2017a) with effective variants and the DTMT (Meng and Zhang, 2019) architecture. In our experiments, we employ data selection, several synthetic data generation approaches (i.e., back-translation, knowledge distillation, and iterative in-domain knowledge transfer), advanced fine-tuning approaches, and a self-BLEU based model ensemble. Our constrained Chinese→English system achieves a case-sensitive BLEU score of 36.9, the highest among all submissions.
- Anthology ID:
- 2020.wmt-1.24
- Volume:
- Proceedings of the Fifth Conference on Machine Translation
- Month:
- November
- Year:
- 2020
- Address:
- Online
- Venue:
- WMT
- SIG:
- SIGMT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 239–247
- URL:
- https://aclanthology.org/2020.wmt-1.24
- Cite (ACL):
- Fandong Meng, Jianhao Yan, Yijin Liu, Yuan Gao, Xianfeng Zeng, Qinsong Zeng, Peng Li, Ming Chen, Jie Zhou, Sifan Liu, and Hao Zhou. 2020. WeChat Neural Machine Translation Systems for WMT20. In Proceedings of the Fifth Conference on Machine Translation, pages 239–247, Online. Association for Computational Linguistics.
- Cite (Informal):
- WeChat Neural Machine Translation Systems for WMT20 (Meng et al., WMT 2020)
- PDF:
- https://preview.aclanthology.org/nodalida-main-page/2020.wmt-1.24.pdf
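The self-BLEU based model ensemble mentioned in the abstract can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes each candidate model's translations of a shared dev set are available as lists of whitespace-tokenized sentences, and it uses a simple add-1-smoothed sentence-level BLEU rather than the official WMT scorer; the function names (`bleu`, `self_bleu`) are hypothetical.

```python
from collections import Counter
import math


def ngrams(tokens, n):
    # All contiguous n-grams of a token list.
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]


def bleu(candidate, reference, max_n=4):
    # Simplified add-1-smoothed sentence-level BLEU (illustrative only).
    cand, ref = candidate.split(), reference.split()
    log_prec = 0.0
    for n in range(1, max_n + 1):
        c_counts = Counter(ngrams(cand, n))
        r_counts = Counter(ngrams(ref, n))
        overlap = sum(min(c, r_counts[g]) for g, c in c_counts.items())
        total = max(sum(c_counts.values()), 1)
        log_prec += math.log((overlap + 1) / (total + 1))
    # Brevity penalty discourages overly short candidates.
    bp = min(1.0, math.exp(1 - len(ref) / max(len(cand), 1)))
    return bp * math.exp(log_prec / max_n)


def self_bleu(outputs):
    """For each system, the average BLEU of its translations scored
    against every other system's translations of the same sentences.
    `outputs` maps a system name to its list of translated sentences."""
    scores = {}
    for name, hyp in outputs.items():
        others = [h for n, h in outputs.items() if n != name]
        scores[name] = sum(
            bleu(h_sent, o_sent)
            for other in others
            for h_sent, o_sent in zip(hyp, other)
        ) / (len(others) * len(hyp))
    return scores
```

One plausible use, consistent with the abstract's one-line description: systems with lower self-BLEU against the rest of the pool produce more diverse hypotheses, which can make them better complements in an ensemble.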