@inproceedings{clinchant-etal-2019-use,
    title = "On the use of {BERT} for Neural Machine Translation",
    author = "Clinchant, Stephane  and
      Jung, Kweon Woo  and
      Nikoulina, Vassilina",
    editor = "Birch, Alexandra  and
      Finch, Andrew  and
      Hayashi, Hiroaki  and
      Konstas, Ioannis  and
      Luong, Thang  and
      Neubig, Graham  and
      Oda, Yusuke  and
      Sudoh, Katsuhito",
    booktitle = "Proceedings of the 3rd Workshop on Neural Generation and Translation",
    month = nov,
    year = "2019",
    address = "Hong Kong",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/D19-5611/",
    doi = "10.18653/v1/D19-5611",
    pages = "108--117",
    abstract = "Exploiting large pretrained models for various NMT tasks has gained a lot of visibility recently. In this work we study how BERT pretrained models could be exploited for supervised Neural Machine Translation. We compare various ways to integrate a pretrained BERT model with an NMT model and study the impact of the monolingual data used for BERT training on the final translation quality. We use the WMT-14 English-German, IWSLT15 English-German and IWSLT14 English-Russian datasets for these experiments. In addition to standard test set evaluation, we perform evaluation on out-of-domain test sets and noise-injected test sets, in order to assess how BERT pretrained representations affect model robustness."
}