IIE’s Neural Machine Translation Systems for WMT20

Xiangpeng Wei, Ping Guo, Yunpeng Li, Xingsheng Zhang, Luxi Xing, Yue Hu


Abstract
In this paper, we introduce the systems IIE submitted for the WMT20 shared task on German-French news translation. Our systems are based on the Transformer architecture with several effective improvements. Multiscale collaborative deep architecture, data selection, back-translation, knowledge distillation, domain adaptation, model ensembling and re-ranking are employed and proven effective in our experiments. Our German-to-French system achieved 35.0 BLEU and ranked second among all anonymous submissions, and our French-to-German system achieved 36.6 BLEU and ranked fourth among all anonymous submissions.
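Among the techniques listed in the abstract, back-translation augments the parallel corpus by translating monolingual target-side text back into the source language with a reverse model and pairing the synthetic source with the authentic target. The sketch below is not the authors' code: it only illustrates the idea for the German-to-French direction, using an off-the-shelf MarianMT checkpoint (Helsinki-NLP/opus-mt-fr-de) as a stand-in for the reverse Transformer the paper would train, and the back_translate helper is a hypothetical name introduced here for illustration.

    # Minimal back-translation sketch (illustrative, not the paper's pipeline):
    # translate monolingual French text "backwards" into German and pair the
    # synthetic German with the original French to obtain extra DE->FR data.
    from transformers import MarianMTModel, MarianTokenizer

    # Assumption: a publicly available FR->DE model stands in for the reverse model.
    REVERSE_MODEL = "Helsinki-NLP/opus-mt-fr-de"

    tokenizer = MarianTokenizer.from_pretrained(REVERSE_MODEL)
    model = MarianMTModel.from_pretrained(REVERSE_MODEL)

    def back_translate(monolingual_fr):
        """Return (synthetic_de, original_fr) pairs usable as DE->FR training data."""
        batch = tokenizer(monolingual_fr, return_tensors="pt",
                          padding=True, truncation=True)
        generated = model.generate(**batch, num_beams=4, max_length=256)
        synthetic_de = tokenizer.batch_decode(generated, skip_special_tokens=True)
        return list(zip(synthetic_de, monolingual_fr))

    if __name__ == "__main__":
        french_mono = [
            "La traduction automatique neuronale progresse rapidement.",
            "Les données monolingues sont abondantes pour les langues européennes.",
        ]
        for de, fr in back_translate(french_mono):
            print(de, "|||", fr)

In practice, the synthetic pairs produced this way are mixed with the authentic bitext when training the forward (German-to-French) model; the other listed techniques (knowledge distillation, ensembling, re-ranking) operate on top of models trained on this augmented data.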
Anthology ID: 2020.wmt-1.32
Volume: Proceedings of the Fifth Conference on Machine Translation
Month: November
Year: 2020
Address: Online
Venue: WMT
SIG: SIGMT
Publisher: Association for Computational Linguistics
Pages: 300–304
URL: https://aclanthology.org/2020.wmt-1.32
Cite (ACL): Xiangpeng Wei, Ping Guo, Yunpeng Li, Xingsheng Zhang, Luxi Xing, and Yue Hu. 2020. IIE’s Neural Machine Translation Systems for WMT20. In Proceedings of the Fifth Conference on Machine Translation, pages 300–304, Online. Association for Computational Linguistics.
Cite (Informal): IIE’s Neural Machine Translation Systems for WMT20 (Wei et al., WMT 2020)
PDF: https://preview.aclanthology.org/auto-file-uploads/2020.wmt-1.32.pdf
Video: https://slideslive.com/38939579