Syntax-Aware Graph Attention Network for Aspect-Level Sentiment Classification
Lianzhe Huang, Xin Sun, Sujian Li, Linhao Zhang, Houfeng Wang
Abstract
Aspect-level sentiment classification aims to identify the sentiment polarity expressed toward each aspect term in a sentence. Existing approaches mostly focus on modeling the relationship between the given aspect words and their context with attention, and ignore more elaborate knowledge implicit in the context. In this paper, we make the model syntax-aware by applying a graph attention network over the dependency tree structure and incorporate external pre-training knowledge from the BERT language model, which helps to better model the interaction between the context and the aspect words. In addition, the BERT subwords are integrated into the dependency tree graphs, so that graph attention can produce more accurate word representations. Experiments demonstrate the effectiveness of our model.
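The abstract describes graph attention applied over a dependency tree, with BERT subwords folded into the graph so that attention flows only along syntactic edges. The sketch below is a minimal illustration of that idea under stated assumptions, not the authors' implementation: a single standard GAT-style layer in PyTorch where each (sub)word attends only to its dependency-tree neighbours via an adjacency mask. The class name, dimensions, and toy adjacency are hypothetical.

```python
# Minimal sketch (assumed, not the paper's code): graph attention over a
# dependency tree, where each token attends only to its syntactic neighbours.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        # Attention scoring over concatenated node pairs, as in standard GAT.
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (batch, num_tokens, in_dim)   token states, e.g. from BERT
        # adj: (batch, num_tokens, num_tokens) 1 where a dependency edge
        #      (or self-loop) exists, 0 elsewhere
        z = self.proj(h)                                    # (B, N, D)
        n = z.size(1)
        zi = z.unsqueeze(2).expand(-1, -1, n, -1)           # (B, N, N, D)
        zj = z.unsqueeze(1).expand(-1, n, -1, -1)           # (B, N, N, D)
        scores = self.attn(torch.cat([zi, zj], dim=-1)).squeeze(-1)  # (B, N, N)
        scores = F.leaky_relu(scores)
        # Mask out pairs with no syntactic link before normalising.
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)
        return torch.relu(alpha @ z)                        # aggregate neighbours


# Toy usage: 4 subword tokens with self-loops and one hypothetical edge.
h = torch.randn(1, 4, 768)           # e.g. BERT hidden states
adj = torch.eye(4).unsqueeze(0)      # self-loops
adj[0, 0, 1] = adj[0, 1, 0] = 1.0    # dependency edge between tokens 0 and 1
out = GraphAttentionLayer(768, 128)(h, adj)
print(out.shape)                     # torch.Size([1, 4, 128])
```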
- Anthology ID:
- 2020.coling-main.69
- Volume:
- Proceedings of the 28th International Conference on Computational Linguistics
- Month:
- December
- Year:
- 2020
- Address:
- Barcelona, Spain (Online)
- Editors:
- Donia Scott, Nuria Bel, Chengqing Zong
- Venue:
- COLING
- Publisher:
- International Committee on Computational Linguistics
- Pages:
- 799–810
- URL:
- https://aclanthology.org/2020.coling-main.69
- DOI:
- 10.18653/v1/2020.coling-main.69
- Cite (ACL):
- Lianzhe Huang, Xin Sun, Sujian Li, Linhao Zhang, and Houfeng Wang. 2020. Syntax-Aware Graph Attention Network for Aspect-Level Sentiment Classification. In Proceedings of the 28th International Conference on Computational Linguistics, pages 799–810, Barcelona, Spain (Online). International Committee on Computational Linguistics.
- Cite (Informal):
- Syntax-Aware Graph Attention Network for Aspect-Level Sentiment Classification (Huang et al., COLING 2020)
- PDF:
- https://aclanthology.org/2020.coling-main.69.pdf