IRSum: One Model to Rule Summarization and Retrieval

Sotaro Takeshita, Simone Paolo Ponzetto, Kai Eckert


Abstract
Applications that store large numbers of documents often provide summarization and retrieval functionalities to help users digest large amounts of information efficiently. Currently, such systems need to run two task-specific models, one for summarization and one for retrieval, redundantly over the same set of documents. An efficient way to remedy this redundancy would be to reuse the hidden representations produced during summary generation for retrieval. However, our experiments show that existing models, including recent large language models, do not produce retrieval-friendly embeddings during summarization due to the lack of a contrastive objective in their training. To this end, we introduce a simple, cost-effective training strategy that integrates a contrastive objective into standard summarization training without requiring additional annotations. We empirically show that our model performs on par with, and in some cases even outperforms, the combination of two task-specific models, while improving throughput by up to 17% and reducing FLOPs by up to 20%.
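The abstract's core idea, a summarization loss combined with an annotation-free contrastive objective over encoder embeddings, can be sketched as follows. This is a minimal illustration assuming a PyTorch/Hugging Face setup with a BART-style encoder-decoder: the standard cross-entropy loss on gold summaries is added to an in-batch InfoNCE-style loss that pulls each document's pooled encoder embedding toward the embedding of its own summary, using nothing beyond the summarization pairs already in the training data. The model choice, mean pooling, temperature, and loss weight alpha are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: joint summarization + contrastive training,
# assuming a BART-style encoder-decoder; hyperparameters are placeholders.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, BartForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")


def mean_pool(hidden, attention_mask):
    # Average token states, ignoring padding positions.
    mask = attention_mask.unsqueeze(-1).float()
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)


def joint_loss(docs, summaries, temperature=0.05, alpha=1.0):
    doc_in = tokenizer(docs, return_tensors="pt", padding=True, truncation=True)
    sum_in = tokenizer(summaries, return_tensors="pt", padding=True, truncation=True)

    # (i) Standard summarization cross-entropy on the gold summaries.
    labels = sum_in.input_ids.clone()
    labels[labels == tokenizer.pad_token_id] = -100  # masked out of the CE loss
    out = model(**doc_in, labels=labels)

    # (ii) In-batch contrastive loss over pooled encoder embeddings:
    # document i should be most similar to its own summary i.
    doc_emb = F.normalize(
        mean_pool(out.encoder_last_hidden_state, doc_in.attention_mask), dim=-1)
    sum_states = model.get_encoder()(**sum_in).last_hidden_state
    sum_emb = F.normalize(mean_pool(sum_states, sum_in.attention_mask), dim=-1)
    sims = doc_emb @ sum_emb.t() / temperature
    targets = torch.arange(sims.size(0))
    contrastive = F.cross_entropy(sims, targets)

    return out.loss + alpha * contrastive
```

Under this setup, a single encoder pass at inference time serves both tasks: the decoder generates the summary while the pooled encoder states double as the document's retrieval embedding, which is what removes the redundant second model.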
Anthology ID:
2025.gem-1.23
Volume:
Proceedings of the Fourth Workshop on Generation, Evaluation and Metrics (GEM²)
Month:
July
Year:
2025
Address:
Vienna, Austria and virtual meeting
Editors:
Kaustubh Dhole, Miruna Clinciu
Venues:
GEM | WS
Publisher:
Association for Computational Linguistics
Pages:
262–275
URL:
https://preview.aclanthology.org/corrections-2025-08/2025.gem-1.23/
Cite (ACL):
Sotaro Takeshita, Simone Paolo Ponzetto, and Kai Eckert. 2025. IRSum: One Model to Rule Summarization and Retrieval. In Proceedings of the Fourth Workshop on Generation, Evaluation and Metrics (GEM²), pages 262–275, Vienna, Austria and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
IRSum: One Model to Rule Summarization and Retrieval (Takeshita et al., GEM 2025)
PDF:
https://preview.aclanthology.org/corrections-2025-08/2025.gem-1.23.pdf