@inproceedings{saffar-mehrjardi-etal-2019-self,
    title = "Self-Attentional Models Application in Task-Oriented Dialogue Generation Systems",
    author = "Saffar Mehrjardi, Mansour  and
      Trabelsi, Amine  and
      Zaiane, Osmar R.",
    editor = "Mitkov, Ruslan  and
      Angelova, Galia",
    booktitle = "Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)",
    month = sep,
    year = "2019",
    address = "Varna, Bulgaria",
    publisher = "INCOMA Ltd.",
    url = "https://aclanthology.org/R19-1119/",
    doi = "10.26615/978-954-452-056-4_119",
    pages = "1031--1040",
    abstract = "Self-attentional models are a new paradigm for sequence modelling tasks which differ from common sequence modelling methods, such as recurrence-based and convolution-based sequence learning, in that their architecture is based solely on the attention mechanism. Self-attentional models have been used to create state-of-the-art models for many NLP tasks such as neural machine translation, but their usage has not yet been explored for training end-to-end task-oriented dialogue generation systems. In this study, we apply these models to the DSTC2 dataset for training task-oriented chatbots. Our findings show that self-attentional models can be exploited to create end-to-end task-oriented chatbots which not only achieve higher evaluation scores than recurrence-based models, but also do so more efficiently."
}