Abstract
This paper describes the CUNI systems submitted to the WMT20 task on unsupervised and very low-resource supervised machine translation between German and Upper Sorbian. We experimented with training on synthetic data and with pre-training on a related language pair. In the fully unsupervised scenario, we achieved 25.5 and 23.7 BLEU translating from and into Upper Sorbian, respectively. Our low-resource systems relied on transfer learning from German-Czech parallel data and achieved 57.4 and 56.1 BLEU, an improvement of 10 BLEU points over the baseline trained only on the small available German-Upper Sorbian parallel corpus.

- Anthology ID: 2020.wmt-1.133
- Volume: Proceedings of the Fifth Conference on Machine Translation
- Month: November
- Year: 2020
- Address: Online
- Editors: Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
- Venue: WMT
- SIG: SIGMT
- Publisher: Association for Computational Linguistics
- Pages: 1123–1128
- URL: https://aclanthology.org/2020.wmt-1.133
- Cite (ACL): Ivana Kvapilíková, Tom Kocmi, and Ondřej Bojar. 2020. CUNI Systems for the Unsupervised and Very Low Resource Translation Task in WMT20. In Proceedings of the Fifth Conference on Machine Translation, pages 1123–1128, Online. Association for Computational Linguistics.
- Cite (Informal): CUNI Systems for the Unsupervised and Very Low Resource Translation Task in WMT20 (Kvapilíková et al., WMT 2020)
- PDF: https://preview.aclanthology.org/emnlp-22-attachments/2020.wmt-1.133.pdf
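All scores in the abstract are BLEU, the standard MT evaluation metric. As a point of reference, the sketch below shows how corpus-level BLEU is computed — modified n-gram precisions up to 4-grams combined by a geometric mean, times a brevity penalty. This is an illustrative pure-Python sketch, not the paper's evaluation setup; WMT evaluations use standard tooling such as sacrebleu, and the tokenized example sentences here are invented.

```python
# Minimal illustrative sketch of corpus-level BLEU (Papineni et al., 2002).
# Not the paper's evaluation code; real WMT scoring uses sacrebleu.
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(hypotheses, references, max_n=4):
    """Corpus BLEU (0-100) for one reference per hypothesis,
    both given as lists of token lists."""
    matches = [0] * max_n   # clipped n-gram matches per order
    totals = [0] * max_n    # hypothesis n-gram counts per order
    hyp_len = ref_len = 0
    for hyp, ref in zip(hypotheses, references):
        hyp_len += len(hyp)
        ref_len += len(ref)
        for n in range(1, max_n + 1):
            hyp_ng = ngrams(hyp, n)
            ref_ng = ngrams(ref, n)
            # "modified" precision: clip each n-gram count by the reference
            matches[n - 1] += sum(min(c, ref_ng[g]) for g, c in hyp_ng.items())
            totals[n - 1] += max(len(hyp) - n + 1, 0)
    if 0 in matches:
        return 0.0
    log_prec = sum(math.log(m / t) for m, t in zip(matches, totals)) / max_n
    # brevity penalty punishes hypotheses shorter than the references
    bp = 1.0 if hyp_len > ref_len else math.exp(1 - ref_len / hyp_len)
    return 100 * bp * math.exp(log_prec)

hyp = "der Hund rannte über die Straße".split()
print(round(bleu([hyp], [hyp]), 1))  # identical hypothesis → 100.0
```

Differences of the magnitude reported in the abstract (10 BLEU points over the baseline) are large by MT standards; scores above 50 BLEU, as in the low-resource systems here, indicate very close matches to the reference translations.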