@inproceedings{libovicky-helcl-2018-end,
    title = "End-to-End Non-Autoregressive Neural Machine Translation with Connectionist Temporal Classification",
    author = "Libovick{\'y}, Jind{\v{r}}ich  and
      Helcl, Jind{\v{r}}ich",
    editor = "Riloff, Ellen  and
      Chiang, David  and
      Hockenmaier, Julia  and
      Tsujii, Jun{'}ichi",
    booktitle = "Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing",
    month = oct # "-" # nov,
    year = "2018",
    address = "Brussels, Belgium",
    publisher = "Association for Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/D18-1336/",
    doi = "10.18653/v1/D18-1336",
    pages = "3016--3021",
    abstract = "Autoregressive decoding is the only part of sequence-to-sequence models that prevents them from massive parallelization at inference time. Non-autoregressive models enable the decoder to generate all output symbols independently in parallel. We present a novel non-autoregressive architecture based on connectionist temporal classification and evaluate it on the task of neural machine translation. Unlike other non-autoregressive methods which operate in several steps, our model can be trained end-to-end. We conduct experiments on the WMT English-Romanian and English-German datasets. Our models achieve a significant speedup over the autoregressive models, keeping the translation quality comparable to other non-autoregressive models."
}