Aspect-aware Unsupervised Extractive Opinion Summarization

Haoyuan Li, Somnath Basu Roy Chowdhury, Snigdha Chaturvedi


Abstract
Extractive opinion summarization extracts sentences from users' reviews to represent the prevalent opinions about a product or service. However, the extracted sentences can be redundant and may miss some important aspects, especially for centroid-based extractive summarization models (Radev et al., 2004). To alleviate these issues, we introduce TokenCluster, a method for unsupervised extractive opinion summarization that automatically identifies the aspects described in review sentences and then extracts sentences based on their aspects. It identifies the underlying aspects of review sentences using the roots of noun phrases and the adjectives appearing in them. Empirical evaluation shows that TokenCluster improves aspect coverage in summaries and achieves strong performance on multiple opinion summarization datasets, for both general and aspect-specific summarization. We also perform extensive ablation and human evaluation studies to validate the design choices of our method. The implementation of our work is available at https://github.com/leehaoyuan/TokenCluster.
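
The aspect-identification step described in the abstract can be sketched as follows. This is a simplified, illustrative reconstruction rather than the authors' released implementation (see the GitHub repository above for that); it assumes spaCy with the en_core_web_sm model installed, and the helper name aspect_tokens is hypothetical.

# Sketch: collect the roots of noun phrases and the adjectives in a
# review sentence as candidate aspect tokens (illustrative only).
import spacy

nlp = spacy.load("en_core_web_sm")

def aspect_tokens(sentence: str) -> set[str]:
    """Return lemmas of noun-phrase roots and adjectives in the sentence."""
    doc = nlp(sentence)
    # Roots of noun phrases, e.g. "the rooms" -> "room"
    aspects = {chunk.root.lemma_.lower() for chunk in doc.noun_chunks}
    # Adjectives, e.g. "friendly", "clean"
    aspects |= {tok.lemma_.lower() for tok in doc if tok.pos_ == "ADJ"}
    return aspects

print(aspect_tokens("The staff was friendly and the rooms were clean."))
# e.g. {'staff', 'friendly', 'room', 'clean'} (set order may vary)

Sentences sharing such tokens can then be grouped by aspect, so the extractor can select sentences that cover distinct aspects rather than redundant ones.
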
Anthology ID: 2023.findings-acl.802
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 12662–12678
URL: https://aclanthology.org/2023.findings-acl.802
DOI: 10.18653/v1/2023.findings-acl.802
Cite (ACL): Haoyuan Li, Somnath Basu Roy Chowdhury, and Snigdha Chaturvedi. 2023. Aspect-aware Unsupervised Extractive Opinion Summarization. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12662–12678, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Aspect-aware Unsupervised Extractive Opinion Summarization (Li et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.802.pdf