The ADAPT Centre’s Neural MT Systems for the WAT 2020 Document-Level Translation Task

Wandri Jooste, Rejwanul Haque, Andy Way


Abstract
In this paper, we describe the ADAPT Centre’s submissions to the WAT 2020 document-level Business Scene Dialogue (BSD) Translation task. For this task we consider translation from Japanese to English only, and we use the MarianNMT toolkit to train Transformer models. In order to improve translation quality, we made use of both in-domain and out-of-domain data for training our Machine Translation (MT) systems, as well as various data augmentation techniques for fine-tuning the model parameters. This paper outlines the experiments we ran to train our systems and reports the translation accuracy achieved in these experiments.
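To make the training setup above concrete, the short Python sketch below shows how a MarianNMT Transformer training run might be launched. It is a minimal illustration only: every file path, vocabulary file and hyperparameter value is an assumed placeholder, not the configuration used for the ADAPT Centre systems.

import subprocess

# Minimal sketch of a MarianNMT Transformer training run, launched from Python.
# All file paths, vocabularies and hyperparameter values below are hypothetical
# placeholders; they are not the settings reported in the paper.
marian_command = [
    "marian",                                          # Marian training binary, assumed on PATH
    "--type", "transformer",                           # train a Transformer model
    "--model", "model/model.npz",                      # where checkpoints are written
    "--train-sets", "data/train.ja", "data/train.en",  # Japanese-English parallel training data
    "--vocabs", "data/vocab.ja.yml", "data/vocab.en.yml",
    "--valid-sets", "data/dev.ja", "data/dev.en",      # development set used for validation
    "--valid-metrics", "cross-entropy", "bleu",
    "--early-stopping", "5",                           # stop after 5 validations without improvement
    "--mini-batch-fit",                                # size mini-batches to fit the available workspace
    "--devices", "0",                                  # GPU id(s) to train on
]
subprocess.run(marian_command, check=True)

Under such a setup, domain adaptation would typically be carried out by continuing training from the converged out-of-domain checkpoint with the training sets switched to the in-domain (and augmented) BSD data; the exact fine-tuning procedure the authors followed is described in the paper itself.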
Anthology ID: 2020.wat-1.17
Volume: Proceedings of the 7th Workshop on Asian Translation
Month: December
Year: 2020
Address: Suzhou, China
Venue: WAT
Publisher: Association for Computational Linguistics
Pages: 142–146
URL: https://aclanthology.org/2020.wat-1.17
Cite (ACL): Wandri Jooste, Rejwanul Haque, and Andy Way. 2020. The ADAPT Centre’s Neural MT Systems for the WAT 2020 Document-Level Translation Task. In Proceedings of the 7th Workshop on Asian Translation, pages 142–146, Suzhou, China. Association for Computational Linguistics.
Cite (Informal): The ADAPT Centre’s Neural MT Systems for the WAT 2020 Document-Level Translation Task (Jooste et al., WAT 2020)
PDF: https://preview.aclanthology.org/ingestion-script-update/2020.wat-1.17.pdf
Data: Business Scene Dialogue, JESC, OpenSubtitles