@inproceedings{bhosale-etal-2020-language,
    title = "Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling",
    author = "Bhosale, Shruti  and
      Yee, Kyra  and
      Edunov, Sergey  and
      Auli, Michael",
    editor = {Barrault, Lo{\"i}c  and
      Bojar, Ond{\v{r}}ej  and
      Bougares, Fethi  and
      Chatterjee, Rajen  and
      Costa-juss{\`a}, Marta R.  and
      Federmann, Christian  and
      Fishel, Mark  and
      Fraser, Alexander  and
      Graham, Yvette  and
      Guzman, Paco  and
      Haddow, Barry  and
      Huck, Matthias  and
      Yepes, Antonio Jimeno  and
      Koehn, Philipp  and
      Martins, Andr{\'e}  and
      Morishita, Makoto  and
      Monz, Christof  and
      Nagata, Masaaki  and
      Nakazawa, Toshiaki  and
      Negri, Matteo},
    booktitle = "Proceedings of the Fifth Conference on Machine Translation",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2020.wmt-1.69/",
    pages = "584--593",
    abstract = "Pre-training models on vast quantities of unlabeled data has emerged as an effective approach to improving accuracy on many NLP tasks. On the other hand, traditional machine translation has a long history of leveraging unlabeled data through noisy channel modeling. The same idea has recently been shown to achieve strong improvements for neural machine translation. Unfortunately, na{\"i}ve noisy channel modeling with modern sequence to sequence models is up to an order of magnitude slower than alternatives. We address this issue by introducing efficient approximations to make inference with the noisy channel approach as fast as strong ensembles while increasing accuracy. We also show that the noisy channel approach can outperform strong pre-training results by achieving a new state of the art on WMT Romanian-English translation."
}