Indicative Summarization of Long Discussions
Shahbaz Syed, Dominik Schwabe, Khalid Al-Khatib, Martin Potthast
Abstract
Online forums encourage the exchange and discussion of different stances on many topics. Not only do they provide an opportunity to present one’s own arguments, but they may also gather a broad cross-section of others’ arguments. However, the resulting long discussions are difficult to get an overview of. This paper presents a novel unsupervised approach that uses large language models (LLMs) to generate indicative summaries for long discussions, which effectively serve as tables of contents. Our approach first clusters argument sentences, generates cluster labels as abstractive summaries, and classifies the generated cluster labels into argumentation frames, resulting in a two-level summary. Based on an extensively optimized prompt engineering approach, we evaluate 19 LLMs for generative cluster labeling and frame classification. To evaluate the usefulness of our indicative summaries, we conduct a purpose-driven user study via a new visual interface called **Discussion Explorer**: it shows that our proposed indicative summaries serve as a convenient navigation tool for exploring long discussions.
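The abstract outlines a three-step pipeline: cluster argument sentences, label each cluster abstractively with an LLM, and classify each label into an argumentation frame to obtain a two-level summary. The sketch below illustrates that flow under assumed stand-ins (sentence-transformers embeddings, k-means clustering, and a generic `ask_llm` helper); it is not the authors' implementation, which relies on extensively optimized prompts and 19 evaluated LLMs.

```python
# Illustrative sketch of the two-level indicative summarization pipeline:
# (1) embed and cluster argument sentences, (2) generate an abstractive label
# per cluster with an LLM, (3) classify each label into an argumentation frame.
# Library choices and the `ask_llm` placeholder are assumptions, not the paper's setup.

from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans


def ask_llm(prompt: str) -> str:
    """Placeholder for a call to any instruction-following LLM."""
    raise NotImplementedError("plug in an LLM client here")


def indicative_summary(sentences, n_clusters=8,
                       frames=("economics", "morality", "legality", "politics")):
    # 1) Embed and cluster the discussion's argument sentences.
    embedder = SentenceTransformer("all-MiniLM-L6-v2")
    embeddings = embedder.encode(sentences)
    cluster_ids = KMeans(n_clusters=n_clusters, n_init="auto").fit_predict(embeddings)

    summary = {}
    for c in range(n_clusters):
        members = [s for s, cid in zip(sentences, cluster_ids) if cid == c]
        # 2) Generate an abstractive cluster label (one short phrase).
        label = ask_llm(
            "Summarize the shared argument of these sentences in one short phrase:\n"
            + "\n".join(members[:20])
        ).strip()
        # 3) Classify the label into an argumentation frame (second summary level).
        frame = ask_llm(
            f"Assign the label '{label}' to exactly one of these frames: "
            + ", ".join(frames)
        ).strip()
        summary.setdefault(frame, []).append(label)
    # frame -> list of cluster labels, i.e., a two-level table of contents
    return summary
```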
- Anthology ID: 2023.emnlp-main.166
- Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
- Month: December
- Year: 2023
- Address: Singapore
- Editors: Houda Bouamor, Juan Pino, Kalika Bali
- Venue: EMNLP
- Publisher: Association for Computational Linguistics
- Pages: 2752–2788
- URL: https://aclanthology.org/2023.emnlp-main.166
- DOI: 10.18653/v1/2023.emnlp-main.166
- Cite (ACL): Shahbaz Syed, Dominik Schwabe, Khalid Al-Khatib, and Martin Potthast. 2023. Indicative Summarization of Long Discussions. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 2752–2788, Singapore. Association for Computational Linguistics.
- Cite (Informal): Indicative Summarization of Long Discussions (Syed et al., EMNLP 2023)
- PDF: https://preview.aclanthology.org/nschneid-patch-5/2023.emnlp-main.166.pdf