Discrete Opinion Tree Induction for Aspect-based Sentiment Analysis

Chenhua Chen, Zhiyang Teng, Zhongqing Wang, Yue Zhang


Abstract
Dependency trees have been intensively used with graph neural networks for aspect-based sentiment classification. Though effective, such methods rely on external dependency parsers, which may be unavailable for low-resource languages or perform poorly in low-resource domains. In addition, dependency trees are not optimized for aspect-based sentiment classification. In this paper, we propose an aspect-specific and language-agnostic discrete latent opinion tree model as an alternative structure to explicit dependency trees. To ease the learning of complicated structured latent variables, we build a connection between aspect-to-context attention scores and syntactic distances, inducing trees from the attention scores. Results on six English benchmarks and one Chinese dataset show that our model can achieve competitive performance and interpretability.
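The abstract links aspect-to-context attention scores to syntactic distances and induces trees from those scores. The sketch below illustrates one common way such an induction can work: recursively selecting the highest-scoring token in a span as the head of a binary subtree. This is only a hedged illustration of the general idea; the function `induce_tree`, the `Node` class, and the toy scores are hypothetical and do not reproduce the authors' exact algorithm (see the released code at ccsoleil/dotgcn for the actual implementation).

```python
# Hypothetical sketch: induce a binary tree over context tokens from
# per-token attention scores by recursively picking the highest-scoring
# token in a span as the subtree root; tokens to its left and right form
# the left and right subtrees. This mirrors the attention-to-syntactic-
# distance idea only at a high level, not the paper's exact procedure.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Node:
    index: int                       # token position in the sentence
    left: Optional["Node"] = None
    right: Optional["Node"] = None


def induce_tree(scores: List[float], lo: int = 0,
                hi: Optional[int] = None) -> Optional[Node]:
    """Build a binary tree over the token span [lo, hi) from scores."""
    if hi is None:
        hi = len(scores)
    if lo >= hi:
        return None
    # The token with the highest aspect-to-context attention heads the span.
    root = max(range(lo, hi), key=lambda i: scores[i])
    return Node(index=root,
                left=induce_tree(scores, lo, root),
                right=induce_tree(scores, root + 1, hi))


if __name__ == "__main__":
    tokens = ["the", "battery", "life", "is", "amazing"]
    attn = [0.05, 0.30, 0.20, 0.10, 0.35]   # made-up attention scores
    tree = induce_tree(attn)
    print(tree.index, tokens[tree.index])    # highest-scoring token is the root
```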
Anthology ID:
2022.acl-long.145
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2051–2064
URL:
https://aclanthology.org/2022.acl-long.145
DOI:
10.18653/v1/2022.acl-long.145
Cite (ACL):
Chenhua Chen, Zhiyang Teng, Zhongqing Wang, and Yue Zhang. 2022. Discrete Opinion Tree Induction for Aspect-based Sentiment Analysis. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2051–2064, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Discrete Opinion Tree Induction for Aspect-based Sentiment Analysis (Chen et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.145.pdf
Software:
2022.acl-long.145.software.zip
Code:
ccsoleil/dotgcn
Data:
MAMS