@inproceedings{dabre-sumita-2019-nicts-supervised,
    title = "{NICT}{'}s Supervised Neural Machine Translation Systems for the {WMT}19 Translation Robustness Task",
    author = "Dabre, Raj  and
      Sumita, Eiichiro",
    editor = "Bojar, Ond{\v{r}}ej  and
      Chatterjee, Rajen  and
      Federmann, Christian  and
      Fishel, Mark  and
      Graham, Yvette  and
      Haddow, Barry  and
      Huck, Matthias  and
      Yepes, Antonio Jimeno  and
      Koehn, Philipp  and
      Martins, Andr{\'e}  and
      Monz, Christof  and
      Negri, Matteo  and
      N{\'e}v{\'e}ol, Aur{\'e}lie  and
      Neves, Mariana  and
      Post, Matt  and
      Turchi, Marco  and
      Verspoor, Karin",
    booktitle = "Proceedings of the Fourth Conference on Machine Translation (Volume 2: Shared Task Papers, Day 1)",
    month = aug,
    year = "2019",
    address = "Florence, Italy",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/W19-5362/",
    doi = "10.18653/v1/W19-5362",
    pages = "533--536",
    abstract = "In this paper we describe our neural machine translation (NMT) systems for Japanese{\ensuremath{\leftrightarrow}}English translation which we submitted to the translation robustness task. We focused on leveraging transfer learning via fine-tuning to improve translation quality. We used a fairly well-established domain adaptation technique called Mixed Fine Tuning (MFT) (Chu et al., 2017) to improve translation quality for Japanese{\ensuremath{\leftrightarrow}}English. We also trained bi-directional NMT models instead of uni-directional ones, as the former are known to be quite robust, especially in low-resource scenarios. However, given the noisy nature of the in-domain training data, the improvements we obtained are rather modest."
}
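
The abstract above mentions two techniques: Mixed Fine Tuning (MFT) (Chu et al., 2017), which fine-tunes an out-of-domain-pretrained model on a mix of out-of-domain and (oversampled) in-domain data, and bi-directional models trained on both translation directions at once. The following is a minimal sketch of the data-preparation step only; the file names, oversampling ratio, and direction-tag scheme are illustrative assumptions, not the authors' actual setup.

```python
# Sketch of MFT-style data mixing plus bi-directional tagging.
# All paths and hyperparameters below are hypothetical examples.
import random


def read_parallel(src_path, tgt_path):
    """Read a parallel corpus into a list of (source, target) pairs."""
    with open(src_path, encoding="utf-8") as fs, open(tgt_path, encoding="utf-8") as ft:
        return [(s.strip(), t.strip()) for s, t in zip(fs, ft)]


def tag_bidirectional(pairs, direction_tag):
    """Prefix each source sentence with a target-language tag (e.g. '<2en>')
    so a single model can translate in both directions."""
    return [(f"{direction_tag} {s}", t) for s, t in pairs]


def mixed_fine_tuning_corpus(out_domain, in_domain, oversample=5, seed=0):
    """Concatenate out-of-domain data with oversampled in-domain data and
    shuffle; the mix is then used to continue training a model that was
    pretrained on the out-of-domain data alone."""
    mixed = list(out_domain) + list(in_domain) * oversample
    random.Random(seed).shuffle(mixed)
    return mixed


if __name__ == "__main__":
    # Hypothetical file names for clean (out-of-domain) and noisy
    # in-domain Japanese-English parallel data.
    clean_ja_en = tag_bidirectional(read_parallel("clean.ja", "clean.en"), "<2en>")
    clean_en_ja = tag_bidirectional(read_parallel("clean.en", "clean.ja"), "<2ja>")
    noisy_ja_en = tag_bidirectional(read_parallel("noisy.ja", "noisy.en"), "<2en>")
    noisy_en_ja = tag_bidirectional(read_parallel("noisy.en", "noisy.ja"), "<2ja>")

    corpus = mixed_fine_tuning_corpus(clean_ja_en + clean_en_ja,
                                      noisy_ja_en + noisy_en_ja)

    # Write the mixed corpus for a standard NMT toolkit to consume.
    with open("mft.src", "w", encoding="utf-8") as fs, \
         open("mft.tgt", "w", encoding="utf-8") as ft:
        for s, t in corpus:
            fs.write(s + "\n")
            ft.write(t + "\n")
```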