Abstract
This paper describes the Mininglamp neural machine translation systems for the WMT2021 news translation tasks. We participated in eight translation directions for news text: Chinese to/from English, Hausa to/from English, German to/from English, and French to German. Our fundamental system was based on the Transformer architecture, with wider or smaller configurations for the different news translation tasks. We mainly utilized back-translation, knowledge distillation, and fine-tuning to boost single models, while ensembling was used to combine single models. Our final submission ranked first for the English to/from Hausa task.
- Anthology ID:
- 2021.wmt-1.25
- Volume:
- Proceedings of the Sixth Conference on Machine Translation
- Month:
- November
- Year:
- 2021
- Address:
- Online
- Venue:
- WMT
- SIG:
- SIGMT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 260–264
- URL:
- https://aclanthology.org/2021.wmt-1.25
- Cite (ACL):
- Shiyu Zhao, Xiaopu Li, Minghui Wu, and Jie Hao. 2021. The Mininglamp Machine Translation System for WMT21. In Proceedings of the Sixth Conference on Machine Translation, pages 260–264, Online. Association for Computational Linguistics.
- Cite (Informal):
- The Mininglamp Machine Translation System for WMT21 (Zhao et al., WMT 2021)
- PDF:
- https://preview.aclanthology.org/ingestion-script-update/2021.wmt-1.25.pdf