@inproceedings{lin-etal-2020-co,
    title = "A Co-Attentive Cross-Lingual Neural Model for Dialogue Breakdown Detection",
    author = "Lin, Qian  and
      Kundu, Souvik  and
      Ng, Hwee Tou",
    editor = "Scott, Donia  and
      Bel, Nuria  and
      Zong, Chengqing",
    booktitle = "Proceedings of the 28th International Conference on Computational Linguistics",
    month = dec,
    year = "2020",
    address = "Barcelona, Spain (Online)",
    publisher = "International Committee on Computational Linguistics",
    url = "https://preview.aclanthology.org/ingest-emnlp/2020.coling-main.371/",
    doi = "10.18653/v1/2020.coling-main.371",
    pages = "4201--4210",
    abstract = "Ensuring smooth communication is essential in a chat-oriented dialogue system, so that a user can obtain meaningful responses through interactions with the system. Most prior work on dialogue research does not focus on preventing dialogue breakdown. One of the major challenges is that a dialogue system may generate an undesired utterance leading to a dialogue breakdown, which degrades the overall interaction quality. Hence, it is crucial for a machine to detect dialogue breakdowns in an ongoing conversation. In this paper, we propose a novel dialogue breakdown detection model that jointly incorporates a pretrained cross-lingual language model and a co-attention network. Our proposed model leverages effective word embeddings trained on one hundred different languages to generate contextualized representations. Co-attention aims to capture the interaction between the latest utterance and the conversation history, and thereby determines whether the latest utterance causes a dialogue breakdown. Experimental results show that our proposed model outperforms all previous approaches on all evaluation metrics in both the Japanese and English tracks in Dialogue Breakdown Detection Challenge 4 (DBDC4 at IWSDS2019)."
}
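The abstract describes a co-attention mechanism that relates the latest utterance to the conversation history on top of pretrained cross-lingual contextualized embeddings. Below is a minimal, hypothetical PyTorch sketch of such a co-attentive breakdown classifier; all class names, dimensions, and the pooling/classification details are illustrative assumptions, not the authors' implementation (the three-way label set follows the standard DBDC breakdown labels).

# Hypothetical sketch of co-attention between the latest utterance and the
# conversation history; illustrative only, not the authors' implementation.
import torch
import torch.nn as nn

class CoAttention(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.affinity = nn.Linear(dim, dim, bias=False)  # bilinear affinity weight W

    def forward(self, history: torch.Tensor, utterance: torch.Tensor):
        # history:   (B, Lh, d) contextualized embeddings of the conversation history
        # utterance: (B, Lu, d) contextualized embeddings of the latest utterance
        # Affinity matrix A[b, i, j] = history_i . W . utterance_j
        A = torch.bmm(self.affinity(history), utterance.transpose(1, 2))  # (B, Lh, Lu)
        # History-to-utterance attention: each history token attends over utterance tokens
        h2u = torch.softmax(A, dim=2) @ utterance                 # (B, Lh, d)
        # Utterance-to-history attention: each utterance token attends over history tokens
        u2h = torch.softmax(A.transpose(1, 2), dim=2) @ history   # (B, Lu, d)
        return h2u, u2h

class BreakdownDetector(nn.Module):
    """Toy classifier: pooled co-attended features -> breakdown label distribution."""
    def __init__(self, dim: int = 768, num_labels: int = 3):
        super().__init__()
        self.coattn = CoAttention(dim)
        self.classifier = nn.Linear(4 * dim, num_labels)

    def forward(self, history: torch.Tensor, utterance: torch.Tensor):
        h2u, u2h = self.coattn(history, utterance)
        # Mean-pool each stream and classify the concatenation (an illustrative choice)
        pooled = torch.cat([history.mean(1), utterance.mean(1),
                            h2u.mean(1), u2h.mean(1)], dim=-1)
        return self.classifier(pooled)

if __name__ == "__main__":
    B, Lh, Lu, d = 2, 20, 8, 768   # batch, history length, utterance length, hidden size
    model = BreakdownDetector(d)
    logits = model(torch.randn(B, Lh, d), torch.randn(B, Lu, d))
    print(logits.shape)  # torch.Size([2, 3])

In this sketch the cross-lingual encoder is assumed to have already produced the two embedding tensors; in practice they would come from a pretrained multilingual model applied to the history and the latest utterance.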