Adobe AMPS’s Submission for Very Low Resource Supervised Translation Task at WMT20

Keshaw Singh


Abstract
In this paper, we describe our systems submitted to the very low resource supervised translation task at WMT20. We participate in both translation directions for the Upper Sorbian-German language pair. Our primary submission is a subword-level Transformer-based neural machine translation model trained on the original training bitext. In post-submission work, we also conduct several backtranslation experiments using the limited available monolingual data and report the corresponding results. In one such experiment, we observe gains of up to 2.6 BLEU points over the primary system by pretraining on a synthetic, backtranslated corpus and then fine-tuning on the original parallel training data.
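The backtranslation setup the abstract mentions can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the `translate` callable stands in for a trained reverse-direction (German-to-Upper Sorbian) model, which is an assumption here.

```python
def backtranslate(mono_target, translate):
    """Pair each target-side monolingual sentence with a synthetic
    source produced by a reverse-direction model, yielding a
    synthetic bitext of (source, target) pairs."""
    return [(translate(t), t) for t in mono_target]

# Toy stand-in for a real reverse model (hypothetical mapping).
toy_reverse = {"Guten Tag": "Dobry dźeń"}
mono_de = ["Guten Tag"]

synthetic_bitext = backtranslate(
    mono_de, lambda s: toy_reverse.get(s, "<unk>")
)
```

In the paper's recipe, a model is first pretrained on such a synthetic corpus and then fine-tuned on the original parallel training data.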
Anthology ID:
2020.wmt-1.136
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
1144–1149
URL:
https://aclanthology.org/2020.wmt-1.136
Cite (ACL):
Keshaw Singh. 2020. Adobe AMPS’s Submission for Very Low Resource Supervised Translation Task at WMT20. In Proceedings of the Fifth Conference on Machine Translation, pages 1144–1149, Online. Association for Computational Linguistics.
Cite (Informal):
Adobe AMPS’s Submission for Very Low Resource Supervised Translation Task at WMT20 (Singh, WMT 2020)
PDF:
https://preview.aclanthology.org/proper-vol2-ingestion/2020.wmt-1.136.pdf
Video:
https://slideslive.com/38939621