Comparative Opinion Summarization via Collaborative Decoding

Hayate Iso, Xiaolan Wang, Stefanos Angelidis, Yoshihiko Suhara


Abstract
Opinion summarization focuses on generating summaries that reflect popular subjective information expressed in multiple online reviews. While generated summaries offer general and concise information about a particular hotel or product, the information may be insufficient to help the user compare multiple different choices. Thus, the user may still struggle with the question “Which one should I pick?” In this paper, we propose the comparative opinion summarization task, which aims at generating two contrastive summaries and one common summary from two different candidate sets of reviews. We develop a comparative summarization framework, CoCoSum, which consists of two base summarization models that jointly generate contrastive and common summaries. Experimental results on a newly created benchmark, CoCoTrip, show that CoCoSum can produce higher-quality contrastive and common summaries than state-of-the-art opinion summarization models. The dataset and code are available at https://github.com/megagonlabs/cocosum.
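The abstract describes collaborative decoding only at a high level (two base models jointly producing contrastive and common summaries), so the sketch below is a hypothetical illustration of how two models' next-token distributions could be combined for contrastive versus common generation. The mixing rules, the gamma parameter, and the toy distributions are assumptions for illustration, not the paper's actual decoding procedure.

```python
# Illustrative sketch only: the exact decoding rule is not given in the abstract,
# so the mixing functions below are assumptions, not CoCoSum's method.
import numpy as np

def softmax(logits):
    z = np.exp(logits - logits.max())
    return z / z.sum()

def contrastive_step(p_target, p_counterpart, gamma=1.0, eps=1e-9):
    """Favor tokens likely under the target entity's model but unlikely
    under the counterpart entity's model (hypothetical log-ratio rule)."""
    scores = np.log(p_target + eps) - gamma * np.log(p_counterpart + eps)
    return softmax(scores)

def common_step(p_a, p_b, eps=1e-9):
    """Favor tokens likely under both models (hypothetical product-of-experts)."""
    scores = 0.5 * (np.log(p_a + eps) + np.log(p_b + eps))
    return softmax(scores)

# Toy next-token distributions over a 5-word vocabulary from two base models,
# one per entity being compared.
p_entity_a = softmax(np.array([2.0, 0.5, 0.1, 1.5, 0.3]))
p_entity_b = softmax(np.array([0.2, 0.4, 0.1, 1.6, 2.1]))

print("contrastive (A vs. B):", contrastive_step(p_entity_a, p_entity_b).round(3))
print("common (A and B):     ", common_step(p_entity_a, p_entity_b).round(3))
```

In this toy setup, the contrastive step shifts probability toward tokens where the two distributions disagree, while the common step concentrates on tokens both models rate highly, mirroring the contrastive/common summary split described in the abstract.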
Anthology ID:
2022.findings-acl.261
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3307–3324
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2022.findings-acl.261/
DOI:
10.18653/v1/2022.findings-acl.261
Cite (ACL):
Hayate Iso, Xiaolan Wang, Stefanos Angelidis, and Yoshihiko Suhara. 2022. Comparative Opinion Summarization via Collaborative Decoding. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3307–3324, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Comparative Opinion Summarization via Collaborative Decoding (Iso et al., Findings 2022)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2022.findings-acl.261.pdf
Code:
megagonlabs/cocosum