RethinkingTMSC: An Empirical Study for Target-Oriented Multimodal Sentiment Classification
Junjie Ye, Jie Zhou, Junfeng Tian, Rui Wang, Qi Zhang, Tao Gui, Xuanjing Huang
Abstract
Recently, Target-oriented Multimodal Sentiment Classification (TMSC) has gained significant attention among scholars. However, current multimodal models have reached a performance bottleneck. To investigate the causes of this problem, we perform extensive empirical evaluation and in-depth analysis of the datasets to answer the following questions: **Q1**: Are the modalities equally important for TMSC? **Q2**: Which multimodal fusion modules are more effective? **Q3**: Do existing datasets adequately support the research? Our experiments and analyses reveal that current TMSC systems primarily rely on the textual modality, as most targets' sentiments can be determined *solely* from the text. Consequently, we point out several directions for future work on the TMSC task in terms of model design and dataset construction. The code and data can be found at https://github.com/Junjie-Ye/RethinkingTMSC.

- Anthology ID: 2023.findings-emnlp.21
- Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: Findings
- Publisher: Association for Computational Linguistics
- Pages: 270–277
- URL: https://aclanthology.org/2023.findings-emnlp.21
- DOI: 10.18653/v1/2023.findings-emnlp.21
- Cite (ACL): Junjie Ye, Jie Zhou, Junfeng Tian, Rui Wang, Qi Zhang, Tao Gui, and Xuanjing Huang. 2023. RethinkingTMSC: An Empirical Study for Target-Oriented Multimodal Sentiment Classification. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 270–277, Singapore. Association for Computational Linguistics.
- Cite (Informal): RethinkingTMSC: An Empirical Study for Target-Oriented Multimodal Sentiment Classification (Ye et al., Findings 2023)
- PDF: https://aclanthology.org/2023.findings-emnlp.21.pdf