@inproceedings{li-etal-2021-miss-wmt21,
    title = "{M}i{SS}@{WMT}21: Contrastive Learning-reinforced Domain Adaptation in Neural Machine Translation",
    author = "Li, Zuchao  and
      Utiyama, Masao  and
      Sumita, Eiichiro  and
      Zhao, Hai",
    editor = "Barrault, Loic  and
      Bojar, Ondrej  and
      Bougares, Fethi  and
      Chatterjee, Rajen  and
      Costa-juss{\`a}, Marta R.  and
      Federmann, Christian  and
      Fishel, Mark  and
      Fraser, Alexander  and
      Freitag, Markus  and
      Graham, Yvette  and
      Grundkiewicz, Roman  and
      Guzm{\'a}n, Paco  and
      Haddow, Barry  and
      Huck, Matthias  and
      Yepes, Antonio Jimeno  and
      Koehn, Philipp  and
      Kocmi, Tom  and
      Martins, Andr{\'e}  and
      Morishita, Makoto  and
      Monz, Christof",
    booktitle = "Proceedings of the Sixth Conference on Machine Translation",
    month = nov,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2021.wmt-1.12/",
    pages = "154--161",
    abstract = "In this paper, we describe our MiSS system that participated in the WMT21 news translation task. We mainly participated in the evaluation of the three translation directions of English-Chinese and Japanese-English translation tasks. In the systems submitted, we primarily considered wider networks, deeper networks, relative positional encoding, and dynamic convolutional networks in terms of model structure, while in terms of training, we investigated contrastive learning-reinforced domain adaptation, self-supervised training, and optimization objective switching training methods. According to the final evaluation results, a deeper, wider, and stronger network can improve translation performance in general, yet our data domain adaption method can improve performance even more. In addition, we found that switching to the use of our proposed objective during the finetune phase using relatively small domain-related data can effectively improve the stability of the model{'}s convergence and achieve better optimal performance."
}