Exploring Graph Pre-training for Aspect-based Sentiment Analysis

Xiaoyi Bao, Zhongqing Wang, Guodong Zhou


Abstract
Existing studies tend to extract sentiment elements in a generative manner to avoid complex modeling. Despite their effectiveness, they ignore the potentially crucial relationships between sentiment elements, making large pre-trained generative models sub-optimal for modeling sentiment knowledge. We therefore introduce two pre-training paradigms that improve the generative model through graph pre-training, aiming to strengthen its ability to capture the relationships among elements. Specifically, we first employ an Element-level Graph Pre-training paradigm, designed to improve the structure awareness of the generative model. We then design a Task Decomposition Pre-training paradigm to make the generative model generalizable and robust against various irregular sentiment quadruples. Extensive experiments show the superiority of our proposed method and validate the correctness of our motivation.
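As a rough illustration of the generative formulation the abstract refers to (a minimal sketch, not the authors' actual implementation), a sentiment quadruple of aspect term, aspect category, opinion term, and sentiment polarity can be linearized into a target string for a sequence-to-sequence model and parsed back after generation. The marker tokens and separator below are assumptions made for this sketch.

```python
# Minimal sketch of linearizing ABSA sentiment quadruples into target
# sequences for a generative (seq2seq) model. The quadruple schema,
# marker tokens, and "[SSEP]" separator are illustrative assumptions,
# not the paper's format.
from typing import List, Tuple

Quad = Tuple[str, str, str, str]  # (aspect, category, opinion, polarity)

def linearize(quads: List[Quad]) -> str:
    """Serialize quadruples into a single target string."""
    parts = [
        f"(A) {aspect} (C) {category} (O) {opinion} (S) {polarity}"
        for aspect, category, opinion, polarity in quads
    ]
    return " [SSEP] ".join(parts)

def delinearize(target: str) -> List[Quad]:
    """Parse a generated string back into quadruples. Fragments that do
    not match the schema are skipped -- one source of the 'irregular
    sentiment quadruples' robustness problem the abstract mentions."""
    quads = []
    for part in target.split(" [SSEP] "):
        try:
            a, rest = part.split(" (C) ", 1)
            c, rest = rest.split(" (O) ", 1)
            o, s = rest.split(" (S) ", 1)
            quads.append((a.removeprefix("(A) ").strip(),
                          c.strip(), o.strip(), s.strip()))
        except ValueError:
            continue  # drop malformed generated fragments
    return quads

if __name__ == "__main__":
    example = [("battery life", "laptop#battery", "long", "positive")]
    target = linearize(example)
    print(target)
    assert delinearize(target) == example
```

A round-trip check like the one above is useful in practice, since a generative decoder can emit strings that violate the schema; the paper's Task Decomposition Pre-training targets exactly this kind of irregularity.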
Anthology ID:
2023.findings-emnlp.234
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3623–3634
URL:
https://aclanthology.org/2023.findings-emnlp.234
DOI:
10.18653/v1/2023.findings-emnlp.234
Cite (ACL):
Xiaoyi Bao, Zhongqing Wang, and Guodong Zhou. 2023. Exploring Graph Pre-training for Aspect-based Sentiment Analysis. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 3623–3634, Singapore. Association for Computational Linguistics.
Cite (Informal):
Exploring Graph Pre-training for Aspect-based Sentiment Analysis (Bao et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-5/2023.findings-emnlp.234.pdf