Improving Aspect-Based Summarization via Contrastive Learning with Anchored Negative Examples

Elizabeth Palmieri, Yangfeng Ji


Abstract
Text summarization helps users manage information overload, but traditional methods can be cumbersome when seeking specific details within a document. Aspect-based text summarization addresses this by using a query to guide which information should be summarized. However, distinguishing relevant from irrelevant information for a given aspect remains challenging in LLM-based summarization models. In this work, we propose utilizing contrastive learning to encourage LLMs to focus on aspect-related signals during training. We further design two variants of the learning algorithm, aspect-anchored and summary-anchored, corresponding to the strategies used in constructing negative examples. Evaluation with two representative LLM families (Llama 2 and Pythia) and two benchmark datasets (AnyAspect and CovidET) demonstrates the proposed methods’ strong performance compared to their supervised fine-tuning and zero-shot counterparts, highlighting contrastive learning as a promising direction for aspect-based text summarization.
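To make the contrastive setup in the abstract concrete, below is a minimal PyTorch/Transformers sketch of one way such anchored contrastive fine-tuning could look. Everything here is an illustrative assumption rather than the paper's method: the prompt template, the `sequence_nll` helper, the hinge-style margin objective, and the mapping of the two negative constructions onto the paper's "aspect-anchored" and "summary-anchored" variants are all guesses; the exact loss and negative-sampling details are in the PDF linked below.

```python
# A sketch of contrastive fine-tuning with anchored negatives,
# assuming a Hugging Face causal LM. Not the paper's exact objective.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def sequence_nll(model, tokenizer, prompt, target):
    """Mean negative log-likelihood of `target` continuing `prompt`."""
    device = next(model.parameters()).device
    prompt_len = tokenizer(prompt, return_tensors="pt").input_ids.shape[1]
    ids = tokenizer(prompt + target, return_tensors="pt").input_ids.to(device)
    labels = ids.clone()
    labels[:, :prompt_len] = -100  # score only the summary tokens
    # (token-boundary drift at the prompt/target seam is ignored here)
    return model(input_ids=ids, labels=labels).loss

def contrastive_loss(model, tokenizer, doc, aspect, summary,
                     neg_aspect=None, neg_summary=None, margin=1.0):
    """Hinge-style contrast between a positive example and a negative
    built by perturbing either the aspect or the summary. How these two
    constructions correspond to the paper's aspect-anchored and
    summary-anchored variants is an assumption of this sketch."""
    prompt = f"Document: {doc}\nAspect: {aspect}\nSummary: "
    pos = sequence_nll(model, tokenizer, prompt, summary)
    if neg_aspect is not None:
        # negative: same document and summary, different aspect
        neg_prompt = f"Document: {doc}\nAspect: {neg_aspect}\nSummary: "
        neg = sequence_nll(model, tokenizer, neg_prompt, summary)
    else:
        # negative: same document and aspect, mismatched summary
        neg = sequence_nll(model, tokenizer, prompt, neg_summary)
    # fit the positive while keeping the negative at least `margin`
    # nats-per-token less likely than the positive
    return pos + torch.clamp(margin - (neg - pos), min=0.0)

# Hypothetical usage:
# tok = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
# lm  = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")
# loss = contrastive_loss(lm, tok, doc, aspect, gold_summary,
#                         neg_aspect=other_aspect)
# loss.backward()
```

The intuition the sketch tries to capture: the margin term penalizes the model whenever a wrong pairing (wrong aspect, or wrong summary) is nearly as likely as the gold one, which is one way to push the model toward aspect-related signals during training.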
Anthology ID:
2025.newsum-main.5
Volume:
Proceedings of The 5th New Frontiers in Summarization Workshop
Month:
November
Year:
2025
Address:
Hybrid
Editors:
Yue Dong, Wen Xiao, Haopeng Zhang, Rui Zhang, Ori Ernst, Lu Wang, Fei Liu
Venues:
NewSum | WS
Publisher:
Association for Computational Linguistics
Pages:
59–73
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.newsum-main.5/
Cite (ACL):
Elizabeth Palmieri and Yangfeng Ji. 2025. Improving Aspect-Based Summarization via Contrastive Learning with Anchored Negative Examples. In Proceedings of The 5th New Frontiers in Summarization Workshop, pages 59–73, Hybrid. Association for Computational Linguistics.
Cite (Informal):
Improving Aspect-Based Summarization via Contrastive Learning with Anchored Negative Examples (Palmieri & Ji, NewSum 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.newsum-main.5.pdf