KESA: A Knowledge Enhanced Approach To Sentiment Analysis

Qinghua Zhao, Shuai Ma, Shuo Ren


Abstract
Although some recent works inject sentiment knowledge into pre-trained language models, they typically do so through mask-and-reconstruction tasks in the post-training phase. This paper instead integrates sentiment knowledge in the fine-tuning stage. To achieve this goal, we propose two sentiment-aware auxiliary tasks, sentiment word selection and conditional sentiment prediction, and integrate them into the objective of the downstream task. The first task learns to select the correct sentiment words from a set of given options. The second task predicts the overall sentiment polarity, taking the sentiment polarity of a word as prior knowledge. In addition, two label combination methods are investigated to unify the multiple types of labels in each auxiliary task. Experimental results demonstrate that our approach consistently outperforms baselines (achieving a new state-of-the-art) and is complementary to existing sentiment-enhanced post-trained models.
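The abstract describes adding two auxiliary objectives to the downstream sentiment-classification objective during fine-tuning. The following is a minimal sketch, based only on that description and not on the paper's actual formulation: the head names, label tensors, and the aux_weight parameter are hypothetical, and the equal weighting of the two auxiliary losses is an assumption.

    import torch
    import torch.nn.functional as F

    def kesa_style_loss(cls_logits, cls_labels,
                        selection_logits, selection_labels,
                        conditional_logits, conditional_labels,
                        aux_weight=0.1):
        """Hypothetical multi-task fine-tuning loss (illustrative only).

        cls_*          -- downstream sentiment classification head
        selection_*    -- "sentiment word selection" auxiliary head
                          (choose the correct sentiment word among options)
        conditional_*  -- "conditional sentiment prediction" auxiliary head
                          (predict overall polarity given a word's polarity)
        aux_weight     -- assumed weighting of the auxiliary objectives
        """
        main_loss = F.cross_entropy(cls_logits, cls_labels)
        selection_loss = F.cross_entropy(selection_logits, selection_labels)
        conditional_loss = F.cross_entropy(conditional_logits, conditional_labels)
        # The auxiliary tasks are added to the downstream objective,
        # so all three heads are trained jointly in the fine-tuning stage.
        return main_loss + aux_weight * (selection_loss + conditional_loss)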
Anthology ID:
2022.aacl-main.58
Volume:
Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
November
Year:
2022
Address:
Online only
Venues:
AACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
766–776
URL:
https://aclanthology.org/2022.aacl-main.58
Cite (ACL):
Qinghua Zhao, Shuai Ma, and Shuo Ren. 2022. KESA: A Knowledge Enhanced Approach To Sentiment Analysis. In Proceedings of the 2nd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 12th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 766–776, Online only. Association for Computational Linguistics.
Cite (Informal):
KESA: A Knowledge Enhanced Approach To Sentiment Analysis (Zhao et al., AACL-IJCNLP 2022)
PDF:
https://preview.aclanthology.org/auto-file-uploads/2022.aacl-main.58.pdf