Abstract
This paper describes the NoahNMT system submitted to the WMT 2021 shared task of Very Low Resource Supervised Machine Translation. The system is a standard Transformer model equipped with our recent technique of dual transfer. It also employs widely used techniques that are known to be helpful for neural machine translation, including iterative back-translation, selected finetuning, and ensemble. The final submission achieves the top BLEU for three translation directions.
- Anthology ID:
- 2021.wmt-1.108
- Volume:
- Proceedings of the Sixth Conference on Machine Translation
- Month:
- November
- Year:
- 2021
- Address:
- Online
- Editors:
- Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzmán, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz
- Venue:
- WMT
- SIG:
- SIGMT
- Publisher:
- Association for Computational Linguistics
- Pages:
- 1009–1013
- URL:
- https://aclanthology.org/2021.wmt-1.108
- Cite (ACL):
- Meng Zhang, Minghao Wu, Pengfei Li, Liangyou Li, and Qun Liu. 2021. NoahNMT at WMT 2021: Dual Transfer for Very Low Resource Supervised Machine Translation. In Proceedings of the Sixth Conference on Machine Translation, pages 1009–1013, Online. Association for Computational Linguistics.
- Cite (Informal):
- NoahNMT at WMT 2021: Dual Transfer for Very Low Resource Supervised Machine Translation (Zhang et al., WMT 2021)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-3/2021.wmt-1.108.pdf
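The abstract names iterative back-translation as one of the system's techniques. As a minimal sketch of that loop, assuming toy placeholder `train` and `translate` functions (the actual system uses Transformer models, and these names are not from the paper): a backward (target-to-source) model back-translates monolingual target text into synthetic source sentences, the synthetic pairs augment the forward model's training data, and the process alternates directions for several rounds.

```python
# Illustrative sketch of iterative back-translation.
# `train` and `translate` are hypothetical toy stand-ins for real
# NMT training and decoding, not the authors' implementation.

def train(pairs):
    # Toy "model": a lookup table over its training pairs.
    return dict(pairs)

def translate(model, sentences):
    # Toy decoding: look up known sentences, echo unknown ones.
    return [model.get(s, s) for s in sentences]

def iterative_back_translation(parallel, mono_src, mono_tgt, rounds=2):
    """parallel: list of (src, tgt) pairs; mono_*: monolingual sentences."""
    bwd_data = [(t, s) for s, t in parallel]           # tgt -> src data
    fwd = train(parallel)
    for _ in range(rounds):
        bwd = train(bwd_data)                          # target->source model
        synthetic_src = translate(bwd, mono_tgt)       # back-translate tgt mono
        fwd_data = list(parallel) + list(zip(synthetic_src, mono_tgt))
        fwd = train(fwd_data)                          # source->target model
        synthetic_tgt = translate(fwd, mono_src)       # forward-translate src mono
        bwd_data = [(t, s) for s, t in parallel] + list(zip(synthetic_tgt, mono_src))
    return fwd

model = iterative_back_translation([("hallo", "hello")], ["welt"], ["world"])
```

In practice each `train` call would fine-tune or retrain an NMT model and the synthetic data would typically be filtered or tagged, but the alternation of directions above is the core of the technique.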