Stars Are All You Need: A Distantly Supervised Pyramid Network for Unified Sentiment Analysis
Wenchang Li, Yixing Chen, Shuang Zheng, Lei Wang, John Lalor
Abstract
Data for the Rating Prediction (RP) sentiment analysis task, such as star-rated reviews, are readily available. However, data for aspect-category sentiment analysis (ACSA), while often desired for their fine-grained nature, are expensive to collect. In this work we present a method for learning ACSA using only RP labels. We propose Unified Sentiment Analysis (Uni-SA) to understand aspect and review sentiment efficiently in a unified manner, and a Distantly Supervised Pyramid Network (DSPN) that performs Aspect-Category Detection (ACD), ACSA, and Overall Sentiment Analysis (OSA) using only RP labels for training. We evaluate DSPN on multi-aspect review datasets in English and Chinese and find that, with only star-rating labels for supervision, DSPN performs comparably to a variety of benchmark models. We also demonstrate the interpretability of DSPN’s outputs on reviews to show the pyramid structure inherent in document-level end-to-end sentiment analysis.
- Anthology ID:
- 2024.wnut-1.10
- Volume:
- Proceedings of the Ninth Workshop on Noisy and User-generated Text (W-NUT 2024)
- Month:
- March
- Year:
- 2024
- Address:
- San Ġiljan, Malta
- Editors:
- Rob van der Goot, JinYeong Bak, Max Müller-Eberstein, Wei Xu, Alan Ritter, Tim Baldwin
- Venues:
- WNUT | WS
- Publisher:
- Association for Computational Linguistics
- Pages:
- 104–118
- URL:
- https://aclanthology.org/2024.wnut-1.10
- Cite (ACL):
- Wenchang Li, Yixing Chen, Shuang Zheng, Lei Wang, and John Lalor. 2024. Stars Are All You Need: A Distantly Supervised Pyramid Network for Unified Sentiment Analysis. In Proceedings of the Ninth Workshop on Noisy and User-generated Text (W-NUT 2024), pages 104–118, San Ġiljan, Malta. Association for Computational Linguistics.
- Cite (Informal):
- Stars Are All You Need: A Distantly Supervised Pyramid Network for Unified Sentiment Analysis (Li et al., WNUT-WS 2024)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-2/2024.wnut-1.10.pdf
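The distant-supervision idea in the abstract — aggregating latent aspect-level sentiment into a document-level star-rating prediction, so that only star labels are needed for training — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual DSPN architecture: the function names, the attention-style pooling, and the toy numbers are all assumptions.

```python
# Hypothetical sketch of the pyramid idea: latent per-aspect sentiment
# scores are pooled into a document-level star-rating distribution, so
# the document's star label can distantly supervise the aspect level.
# Everything here (names, pooling scheme, numbers) is illustrative.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def predict_rating(aspect_logits, aspect_weights):
    """Aggregate per-aspect sentiment logits into a star distribution.

    aspect_logits: (num_aspects, num_stars) latent aspect sentiment scores
    aspect_weights: (num_aspects,) importance of each aspect in the review
    """
    attn = softmax(aspect_weights)      # which aspects matter in this review
    doc_logits = attn @ aspect_logits   # weighted pooling up the "pyramid"
    return softmax(doc_logits)          # distribution over star ratings

# Toy example: two aspects (e.g., "food", "service") on a 5-star scale.
aspect_logits = np.array([[0.1, 0.2, 0.3, 1.5, 2.0],   # positive aspect
                          [2.0, 1.0, 0.2, 0.1, 0.0]])  # negative aspect
aspect_weights = np.array([1.0, 0.5])
stars = predict_rating(aspect_logits, aspect_weights)
print(stars.argmax() + 1)  # predicted star rating
```

In training, only the document-level star label would supply the loss; the per-aspect logits remain latent and, as the abstract notes, become interpretable aspect-level predictions as a by-product.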