Enhancing Aspect-level Sentiment Analysis with Word Dependencies

Yuanhe Tian, Guimin Chen, Yan Song


Abstract
Aspect-level sentiment analysis (ASA) has received much attention in recent years. Most existing approaches try to leverage syntactic information, such as the dependency parsing results of the input text, to improve sentiment analysis for different aspects. Although these approaches achieve satisfying results, they mainly focus on the dependency arcs among words and omit the dependency type information; moreover, they treat all dependencies equally, so noisy parsing results may hurt model performance. In this paper, we propose an approach to enhance aspect-level sentiment analysis with word dependencies, where the type information is modeled by key-value memory networks and different dependency results are selectively leveraged. Experimental results on five benchmark datasets demonstrate the effectiveness of our approach: it outperforms baseline models on all datasets and achieves state-of-the-art performance on three of them.
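To illustrate the idea described in the abstract, below is a minimal sketch (in PyTorch, not the authors' implementation; see cuhksz-nlp/asa-wd for the released code) of a key-value memory over word dependencies: keys are embeddings of dependency types, values are the representations of the words linked by those dependencies, and an aspect-aware context vector attends over the keys so that dependency types are modeled and individual (possibly noisy) dependency results are weighted rather than used equally. The module name, tensor shapes, and inputs are assumptions for illustration.

```python
import torch
import torch.nn as nn

class DependencyKVMemory(nn.Module):
    """Hypothetical key-value memory over dependency results (illustrative only)."""
    def __init__(self, num_dep_types: int, hidden_size: int):
        super().__init__()
        self.type_emb = nn.Embedding(num_dep_types, hidden_size)  # keys: dependency types

    def forward(self, context, dep_types, dep_word_repr, mask):
        # context:       (batch, hidden)        aspect-aware context vector
        # dep_types:     (batch, mem)           dependency-type ids, one per memory slot
        # dep_word_repr: (batch, mem, hidden)   representations of the connected words (values)
        # mask:          (batch, mem)           1 for real dependencies, 0 for padding
        keys = self.type_emb(dep_types)                               # (batch, mem, hidden)
        scores = torch.bmm(keys, context.unsqueeze(-1)).squeeze(-1)   # (batch, mem)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)                       # selective weighting of dependencies
        memory_out = torch.bmm(weights.unsqueeze(1), dep_word_repr).squeeze(1)
        return context + memory_out                                   # dependency-enhanced representation

# toy usage
mem = DependencyKVMemory(num_dep_types=40, hidden_size=8)
ctx = torch.randn(2, 8)
types = torch.randint(0, 40, (2, 5))
values = torch.randn(2, 5, 8)
mask = torch.ones(2, 5)
print(mem(ctx, types, values, mask).shape)  # torch.Size([2, 8])
```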
Anthology ID:
2021.eacl-main.326
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3726–3739
URL:
https://aclanthology.org/2021.eacl-main.326
DOI:
10.18653/v1/2021.eacl-main.326
Bibkey:
Cite (ACL):
Yuanhe Tian, Guimin Chen, and Yan Song. 2021. Enhancing Aspect-level Sentiment Analysis with Word Dependencies. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 3726–3739, Online. Association for Computational Linguistics.
Cite (Informal):
Enhancing Aspect-level Sentiment Analysis with Word Dependencies (Tian et al., EACL 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.eacl-main.326.pdf
Code
 cuhksz-nlp/asa-wd