To be Closer: Learning to Link up Aspects with Opinions

Yuxiang Zhou, Lejian Liao, Yang Gao, Zhanming Jie, Wei Lu


Abstract
Dependency parse trees are helpful for discovering opinion words in aspect-based sentiment analysis (ABSA) (CITATION). However, the trees obtained from off-the-shelf dependency parsers are static and can be sub-optimal for ABSA, because syntactic trees are not designed to capture the interactions between opinion words and aspect words. In this work, we aim to shorten the distance between aspects and their corresponding opinion words by learning an aspect-centric tree structure. Aspect and opinion words are expected to be closer along such a tree than along a standard dependency parse tree. The learning process allows the tree structure to adaptively correlate aspect and opinion words, enabling us to better identify polarity in the ABSA task. We conduct experiments on five aspect-based sentiment datasets, and the proposed model significantly outperforms recent strong baselines. Furthermore, our thorough analysis demonstrates that the average distance between aspect and opinion words is shortened by at least 19% on the standard SemEval Restaurant14 (CITATION) dataset.
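The abstract's central quantity is the path length between an aspect word and its opinion word along a tree. As a minimal illustrative sketch (not the paper's code), the following computes that distance on a toy dependency parse by treating the tree's head–dependent edges as an undirected graph and running breadth-first search; the `tree_distance` helper, token indices, and example parse are assumptions for illustration.

```python
from collections import deque

def tree_distance(edges, src, dst):
    """Number of edges on the path between two tokens in a tree.

    edges: list of (head, dependent) token-index pairs, traversed
    as undirected edges since path length ignores edge direction.
    """
    adj = {}
    for head, dep in edges:
        adj.setdefault(head, []).append(dep)
        adj.setdefault(dep, []).append(head)
    queue, seen = deque([(src, 0)]), {src}
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for nxt in adj.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return -1  # unreachable; cannot happen in a connected tree

# Toy parse of "The food was great" (indices 0-3):
# was(2) is the root; food(1) and great(3) attach to it, the(0) to food.
edges = [(2, 1), (2, 3), (1, 0)]
print(tree_distance(edges, 1, 3))  # aspect "food" -> opinion "great": 2
```

An aspect-centric tree in the paper's sense would be one whose learned structure makes this distance smaller on average than in the parser-produced tree; averaging `tree_distance` over gold aspect–opinion pairs gives the kind of statistic behind the reported 19% reduction.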
Anthology ID:
2021.emnlp-main.317
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3899–3909
URL:
https://aclanthology.org/2021.emnlp-main.317
DOI:
10.18653/v1/2021.emnlp-main.317
Cite (ACL):
Yuxiang Zhou, Lejian Liao, Yang Gao, Zhanming Jie, and Wei Lu. 2021. To be Closer: Learning to Link up Aspects with Opinions. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 3899–3909, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
To be Closer: Learning to Link up Aspects with Opinions (Zhou et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.317.pdf
Software:
 2021.emnlp-main.317.Software.zip
Video:
 https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.317.mp4
Code
 zyxnlp/aclt