Learning Implicit Sentiment in Aspect-based Sentiment Analysis with Supervised Contrastive Pre-Training

Zhengyan Li, Yicheng Zou, Chong Zhang, Qi Zhang, Zhongyu Wei


Abstract
Aspect-based sentiment analysis aims to identify the sentiment polarity of a specific aspect in product reviews. We notice that about 30% of reviews do not contain obvious opinion words, yet still convey clear human-perceivable sentiment orientation, which is known as implicit sentiment. However, recent neural network-based approaches have paid little attention to the implicit sentiment entailed in reviews. To overcome this issue, we adopt Supervised Contrastive Pre-training on large-scale sentiment-annotated corpora retrieved from in-domain language resources. By aligning the representations of implicit sentiment expressions with those sharing the same sentiment label, the pre-training process leads to better capture of both implicit and explicit sentiment orientation towards aspects in reviews. Experimental results show that our method achieves state-of-the-art performance on the SemEval2014 benchmarks, and comprehensive analysis validates its effectiveness in learning implicit sentiment.
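The abstract describes aligning representations that share a sentiment label via a supervised contrastive objective. Below is a minimal NumPy sketch of the standard supervised contrastive loss (Khosla et al., 2020) that such pre-training typically builds on; it is an illustration of the general technique, not the paper's exact implementation, and the function name and temperature value are illustrative.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Batch-wise supervised contrastive loss (illustrative sketch).

    embeddings: (N, d) array of sentence/aspect representations.
    labels:     (N,) integer sentiment labels.
    Anchors are pulled toward same-label samples and pushed from others.
    """
    # L2-normalize so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    n = len(labels)

    # exclude self-similarity from numerator and denominator
    not_self = ~np.eye(n, dtype=bool)

    # row-wise log-softmax over all other samples (stabilized)
    sim_max = sim.max(axis=1, keepdims=True)
    exp_sim = np.exp(sim - sim_max) * not_self
    log_prob = (sim - sim_max) - np.log(exp_sim.sum(axis=1, keepdims=True))

    # positives: same label, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & not_self
    pos_counts = pos_mask.sum(axis=1)

    # average negative log-prob over positives, for anchors that have any
    valid = pos_counts > 0
    per_anchor = -(pos_mask * log_prob).sum(axis=1)[valid] / pos_counts[valid]
    return per_anchor.mean()
```

With this objective, a batch in which same-label examples already have similar embeddings yields a lower loss than one in which they do not, which is exactly the alignment pressure the pre-training exploits.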
Anthology ID:
2021.emnlp-main.22
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
246–256
URL:
https://aclanthology.org/2021.emnlp-main.22
DOI:
10.18653/v1/2021.emnlp-main.22
Cite (ACL):
Zhengyan Li, Yicheng Zou, Chong Zhang, Qi Zhang, and Zhongyu Wei. 2021. Learning Implicit Sentiment in Aspect-based Sentiment Analysis with Supervised Contrastive Pre-Training. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 246–256, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Learning Implicit Sentiment in Aspect-based Sentiment Analysis with Supervised Contrastive Pre-Training (Li et al., EMNLP 2021)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.22.pdf
Video:
https://preview.aclanthology.org/ingestion-script-update/2021.emnlp-main.22.mp4
Code
tribleave/scapt-absa
Data
MAMS