Sentiment Knowledge Enhanced Self-supervised Learning for Multimodal Sentiment Analysis

Fan Qian, Jiqing Han, Yongjun He, Tieran Zheng, Guibin Zheng


Abstract
Multimodal Sentiment Analysis (MSA) has made great progress, benefiting from sophisticated fusion schemes. However, labeled data in this field are scarce, so supervised models suffer from severe overfitting and poor generalization. In this paper, we propose Sentiment Knowledge Enhanced Self-supervised Learning (SKESL) to capture common sentiment patterns in unlabeled videos, which facilitates further learning on limited labeled data. Specifically, with the help of sentiment knowledge and non-verbal behavior, SKESL masks sentiment words and predicts fine-grained word sentiment intensity, so as to embed word-level sentiment information into the pre-trained multimodal representation. In addition, a non-verbal injection method is proposed to integrate non-verbal information into word semantics. Experiments on two standard MSA benchmarks clearly show that SKESL significantly outperforms the baselines and achieves new State-Of-The-Art (SOTA) results.
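The sentiment-word-masking objective described in the abstract can be sketched roughly as follows. This is an illustrative assumption, not the paper's actual implementation: the tiny lexicon, the intensity values, and the function name are all hypothetical stand-ins for the sentiment knowledge source the authors use.

```python
MASK = "[MASK]"

# Toy sentiment lexicon: word -> intensity in [-1, 1] (illustrative values;
# the paper draws on an external sentiment knowledge base instead).
LEXICON = {"great": 0.8, "terrible": -0.9, "happy": 0.7}

def mask_sentiment_words(tokens):
    """Replace lexicon words with [MASK] and collect (position, intensity)
    pairs to serve as regression targets during pre-training."""
    masked, targets = [], []
    for i, tok in enumerate(tokens):
        if tok.lower() in LEXICON:
            masked.append(MASK)
            targets.append((i, LEXICON[tok.lower()]))
        else:
            masked.append(tok)
    return masked, targets

tokens = ["the", "movie", "was", "great"]
masked, targets = mask_sentiment_words(tokens)
# masked  -> ["the", "movie", "was", "[MASK]"]
# targets -> [(3, 0.8)]
```

A model pre-trained this way would then be asked to recover the masked word's sentiment intensity from the surrounding text and the accompanying non-verbal cues, which is how sentiment information becomes embedded at the word level.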
Anthology ID:
2023.findings-acl.821
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12966–12978
URL:
https://aclanthology.org/2023.findings-acl.821
DOI:
10.18653/v1/2023.findings-acl.821
Cite (ACL):
Fan Qian, Jiqing Han, Yongjun He, Tieran Zheng, and Guibin Zheng. 2023. Sentiment Knowledge Enhanced Self-supervised Learning for Multimodal Sentiment Analysis. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12966–12978, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Sentiment Knowledge Enhanced Self-supervised Learning for Multimodal Sentiment Analysis (Qian et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-acl.821.pdf