@inproceedings{santra-etal-2021-hierarchical,
    title = "Hierarchical Transformer for Task Oriented Dialog Systems",
    author = "Santra, Bishal  and
      Anusha, Potnuru  and
      Goyal, Pawan",
    editor = "Toutanova, Kristina  and
      Rumshisky, Anna  and
      Zettlemoyer, Luke  and
      Hakkani-Tur, Dilek  and
      Beltagy, Iz  and
      Bethard, Steven  and
      Cotterell, Ryan  and
      Chakraborty, Tanmoy  and
      Zhou, Yichao",
    booktitle = "Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies",
    month = jun,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.naacl-main.449/",
    doi = "10.18653/v1/2021.naacl-main.449",
    pages = "5649--5658",
    abstract = "Generative models for dialog systems have gained much interest because of the recent success of RNN- and Transformer-based models in tasks like question answering and summarization. Although dialog response generation is generally seen as a sequence-to-sequence (Seq2Seq) problem, researchers in the past have found it challenging to train dialog systems using standard Seq2Seq models. Therefore, to help the model learn meaningful utterance- and conversation-level features, Sordoni et al. (2015b) and Serban et al. (2016) proposed the Hierarchical RNN architecture, which was later adopted by several other RNN-based dialog systems. With transformer-based models dominating Seq2Seq problems lately, the natural question is whether the notion of hierarchy also applies to transformer-based dialog systems. In this paper, we propose a generalized framework for Hierarchical Transformer Encoders and show how a standard transformer can be morphed into any hierarchical encoder, including HRED- and HIBERT-like models, by using specially designed attention masks and positional encodings. Through a wide range of experiments, we demonstrate that hierarchical encoding helps transformer-based models for task-oriented dialog systems achieve better natural language understanding of dialog contexts."
}
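The core idea in the abstract, that a flat Transformer can be turned into a hierarchical encoder purely through its attention mask, can be pictured with a small sketch. The helper below, including the name `hierarchical_attention_mask` and the specific rule of linking utterance-final tokens, is an illustrative assumption based only on the abstract: it shows one plausible HRED-style mask, not the paper's actual masks or positional encodings.

```python
import numpy as np

def hierarchical_attention_mask(utt_lens):
    """Block-diagonal mask plus context links (assumed HRED-style scheme).

    Every token may attend within its own utterance (utterance-level
    encoding); the final token of each utterance may also attend to the
    final tokens of all earlier utterances (conversation-level encoding).
    True = attention allowed.
    """
    n = sum(utt_lens)
    mask = np.zeros((n, n), dtype=bool)
    starts = np.cumsum([0] + list(utt_lens[:-1]))
    ends = [s + l - 1 for s, l in zip(starts, utt_lens)]
    for s, l in zip(starts, utt_lens):
        mask[s:s + l, s:s + l] = True   # local (utterance-level) block
    for i, e in enumerate(ends):
        mask[e, ends[:i + 1]] = True    # global (conversation-level) links
    return mask

# A 3-utterance dialog with 3, 2, and 4 tokens:
print(hierarchical_attention_mask([3, 2, 4]).astype(int))
```

Feeding such a mask (with disallowed positions set to -inf before the softmax) into a standard self-attention layer, together with separate within-utterance and utterance-index positional encodings, is one way the "morphing" described in the abstract could be realized.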