COGMEN: COntextualized GNN based Multimodal Emotion recognitioN

Abhinav Joshi, Ashwani Bhat, Ayush Jain, Atin Singh, Ashutosh Modi


Abstract
Emotions are an inherent part of human interactions, and consequently, it is imperative to develop AI systems that understand and recognize human emotions. During a conversation involving multiple people, a person’s emotions are influenced by the other speakers’ utterances and by their own emotional state over the course of the conversation. In this paper, we propose the COntextualized Graph Neural Network based Multimodal Emotion recognitioN (COGMEN) system, which leverages local information (i.e., inter/intra-speaker dependencies) and global information (context). The proposed model uses a Graph Neural Network (GNN) based architecture to model the complex dependencies (local and global information) in a conversation. Our model gives state-of-the-art (SOTA) results on the IEMOCAP and MOSEI datasets, and detailed ablation experiments show the importance of modeling information at both levels.
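To make the graph formulation in the abstract concrete, here is a minimal, illustrative sketch (not the authors' released code; see the linked exploration-lab/cogmen repository for that) of how a conversation can be cast as a graph whose nodes are utterances and whose edges encode intra- and inter-speaker dependencies within a context window. It assumes PyTorch and PyTorch Geometric; the feature dimensions, window size, and the TransformerConv layer are placeholders, not choices taken from the paper.

```python
# Hypothetical sketch: conversation-as-graph for utterance-level emotion recognition.
import torch
from torch_geometric.data import Data
from torch_geometric.nn import TransformerConv

# Toy conversation: 4 utterances from 2 speakers; each utterance is a fused
# multimodal feature vector (random 16-d placeholders here).
num_utts, feat_dim, num_emotions = 4, 16, 6
speakers = [0, 1, 0, 1]
x = torch.randn(num_utts, feat_dim)

# Local edges: connect utterances within a past/future context window, covering
# both intra-speaker (same speaker) and inter-speaker (different speaker) links.
window = 2
edges = []
for i in range(num_utts):
    for j in range(max(0, i - window), min(num_utts, i + window + 1)):
        if i != j:
            edges.append((i, j))
edge_index = torch.tensor(edges, dtype=torch.long).t().contiguous()

graph = Data(x=x, edge_index=edge_index)

# One graph-attention layer followed by a per-utterance emotion classifier.
conv = TransformerConv(in_channels=feat_dim, out_channels=32, heads=1)
classifier = torch.nn.Linear(32, num_emotions)

h = conv(graph.x, graph.edge_index).relu()
logits = classifier(h)      # one emotion score vector per utterance
print(logits.shape)         # torch.Size([4, 6])
```

Global (whole-conversation) context, which COGMEN obtains from a contextual encoder before the GNN stage, is not shown here; this sketch only illustrates the local graph structure.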
Anthology ID:
2022.naacl-main.306
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
July
Year:
2022
Address:
Seattle, United States
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4148–4164
URL:
https://aclanthology.org/2022.naacl-main.306
DOI:
10.18653/v1/2022.naacl-main.306
Cite (ACL):
Abhinav Joshi, Ashwani Bhat, Ayush Jain, Atin Singh, and Ashutosh Modi. 2022. COGMEN: COntextualized GNN based Multimodal Emotion recognitioN. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4148–4164, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
COGMEN: COntextualized GNN based Multimodal Emotion recognitioN (Joshi et al., NAACL 2022)
PDF:
https://preview.aclanthology.org/emnlp-22-attachments/2022.naacl-main.306.pdf
Software:
 2022.naacl-main.306.software.zip
Video:
 https://preview.aclanthology.org/emnlp-22-attachments/2022.naacl-main.306.mp4
Code
 exploration-lab/cogmen
Data
 CMU-MOSEI, IEMOCAP