Low-Resource Translation as Language Modeling
Tucker Berckmann and Berkan Hiziroglu
Proceedings of the Fifth Conference on Machine Translation (WMT 2020)

We present our submission to the very low resource supervised machine translation task at WMT20. We use a decoder-only transformer architecture and formulate the translation task as language modeling. To address the low-resource aspect of the problem, we pretrain on a parallel corpus of a similar language pair. We then employ an intermediate back-translation step before fine-tuning. Finally, we present an analysis of the system's performance.
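
The abstract does not spell out the sequence format, but casting translation as language modeling with a decoder-only transformer typically means linearizing each parallel sentence pair into a single token stream: the model is trained to continue the source text with its translation. The Python sketch below illustrates one plausible formatting scheme; the separator and end-of-sequence tokens, function names, and example strings are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of framing translation as language modeling with a
    # decoder-only LM trained on concatenated source/target sequences.
    # The special tokens below are illustrative assumptions, not the
    # tokens used in the paper.

    SEP = " <sep> "   # marks the boundary between source and target
    EOS = " <eos>"    # marks where the translation ends

    def make_training_example(source: str, target: str) -> str:
        """Linearize one parallel pair into a single LM training sequence."""
        return source + SEP + target + EOS

    def make_inference_prompt(source: str) -> str:
        """At test time, prompt with the source plus the separator; the LM
        continues the sequence, and the continuation up to EOS is taken
        as the translation."""
        return source + SEP

    if __name__ == "__main__":
        # Hypothetical parallel pair; the actual task languages and data
        # are not specified in this abstract.
        pair = ("source sentence", "target sentence")
        print(make_training_example(*pair))
        # -> "source sentence <sep> target sentence <eos>"
        print(make_inference_prompt(pair[0]))
        # -> "source sentence <sep> "

Under this framing, the same next-token objective serves pretraining on the similar-language parallel corpus, back-translation, and fine-tuning, since each stage only changes the data fed through the format above.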