IIE’s Neural Machine Translation Systems for WMT20
Xiangpeng Wei, Ping Guo, Yunpeng Li, Xingsheng Zhang, Luxi Xing, Yue Hu
Abstract
In this paper, we introduce the systems IIE submitted for the WMT20 shared task on German-French news translation. Our systems are based on the Transformer architecture with several effective improvements. Multiscale collaborative deep architecture, data selection, back translation, knowledge distillation, domain adaptation, model ensemble and re-ranking are employed and proven effective in our experiments. Our German-to-French system achieved 35.0 BLEU and ranked second among all anonymous submissions, and our French-to-German system achieved 36.6 BLEU and ranked fourth among all anonymous submissions.
- Anthology ID: 2020.wmt-1.32
- Volume: Proceedings of the Fifth Conference on Machine Translation
- Month: November
- Year: 2020
- Address: Online
- Venue: WMT
- SIG: SIGMT
- Publisher: Association for Computational Linguistics
- Pages: 300–304
- URL: https://aclanthology.org/2020.wmt-1.32
- Cite (ACL): Xiangpeng Wei, Ping Guo, Yunpeng Li, Xingsheng Zhang, Luxi Xing, and Yue Hu. 2020. IIE’s Neural Machine Translation Systems for WMT20. In Proceedings of the Fifth Conference on Machine Translation, pages 300–304, Online. Association for Computational Linguistics.
- Cite (Informal): IIE’s Neural Machine Translation Systems for WMT20 (Wei et al., WMT 2020)
- PDF: https://preview.aclanthology.org/ingestion-script-update/2020.wmt-1.32.pdf