@inproceedings{zhang-etal-2020-knowledge,
    title = "Knowledge Aware Emotion Recognition in Textual Conversations via Multi-Task Incremental Transformer",
    author = "Zhang, Duzhen  and
      Chen, Xiuyi  and
      Xu, Shuang  and
      Xu, Bo",
    editor = "Scott, Donia  and
      Bel, Nuria  and
      Zong, Chengqing",
    booktitle = "Proceedings of the 28th International Conference on Computational Linguistics",
    month = dec,
    year = "2020",
    address = "Barcelona, Spain (Online)",
    publisher = "International Committee on Computational Linguistics",
    url = "https://aclanthology.org/2020.coling-main.392/",
    doi = "10.18653/v1/2020.coling-main.392",
    pages = "4429--4440",
    abstract = "Emotion recognition in textual conversations (ERTC) plays an important role in a wide range of applications, such as opinion mining and recommender systems. ERTC, however, is a challenging task. For one thing, speakers often rely on context and commonsense knowledge to express emotions; for another, most utterances in conversations carry neutral emotion, and as a result, the confusion between the few non-neutral utterances and the far more numerous neutral ones restrains emotion recognition performance. In this paper, we propose a novel Knowledge Aware Incremental Transformer with Multi-task Learning (KAITML) to address these challenges. Firstly, we devise a dual-level graph attention mechanism to leverage commonsense knowledge, which augments the semantic information of the utterance. Then we apply the Incremental Transformer to encode multi-turn contextual utterances. Moreover, we are the first to introduce multi-task learning to alleviate the aforementioned confusion and thus further improve emotion recognition performance. Extensive experimental results show that our KAITML model outperforms the state-of-the-art models across five benchmark datasets."
}