Hierarchical Attention Adapter for Abstractive Dialogue Summarization

Raymond Li, Chuyuan Li, Gabriel Murray, Giuseppe Carenini


Abstract
Dialogue summarization is still a very challenging task, even for large language models (LLMs). On the one hand, some previous approaches have pre-trained language models specifically for dialogue understanding and summarization, but they have been limited to relatively small models. On the other hand, other works have tried to directly exploit dialogue semantics and discourse structures in their modeling effort, but by construction, they require access to those structures, which is in itself a largely unsolved problem. In this paper, we synergistically combine these two ideas in an approach that can be seamlessly integrated into the decoder-only architecture adopted by most state-of-the-art LLMs. In particular, our novel solution leverages the parameter-efficient fine-tuning (PEFT) paradigm to model the hierarchical structure of dialogues, where input sequences are naturally segmented into dialogue turns, and then fine-tunes the model for abstractive summarization. From experiments on two datasets, we find that the Hierarchical Attention Adapter outperforms all baseline adapter methods on SummScreen, and that our approach can also be combined with LoRA to achieve the best performance on SamSum.
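To illustrate the general idea of a turn-aware adapter described in the abstract, the sketch below shows one possible way such a module could be structured: token hidden states are pooled into per-turn representations, the turns attend to one another, and the result is broadcast back to tokens through a residual bottleneck. This is a minimal, hypothetical sketch and not the authors' exact implementation; the class name, pooling choice (mean pooling), bottleneck size, and the omission of causal masking across turns are all assumptions made for illustration.

```python
import torch
import torch.nn as nn


class HierarchicalAttentionAdapter(nn.Module):
    """Illustrative sketch (not the paper's exact method) of a bottleneck
    adapter that models dialogue structure at the turn level."""

    def __init__(self, hidden_size: int, bottleneck: int = 64, num_heads: int = 4):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)      # down-projection
        self.turn_attn = nn.MultiheadAttention(bottleneck, num_heads, batch_first=True)
        self.up = nn.Linear(bottleneck, hidden_size)        # up-projection
        self.act = nn.GELU()

    def forward(self, hidden: torch.Tensor, turn_ids: torch.Tensor) -> torch.Tensor:
        # hidden:   (batch, seq_len, hidden_size) decoder hidden states
        # turn_ids: (batch, seq_len) long tensor, dialogue-turn index of each token
        x = self.act(self.down(hidden))                                  # (B, T, b)

        # Mean-pool tokens into per-turn representations (assumed pooling choice).
        num_turns = int(turn_ids.max().item()) + 1
        one_hot = nn.functional.one_hot(turn_ids, num_turns).float()     # (B, T, N)
        counts = one_hot.sum(dim=1).clamp(min=1.0)                       # (B, N)
        turn_repr = one_hot.transpose(1, 2) @ x / counts.unsqueeze(-1)   # (B, N, b)

        # Turn-level self-attention models interactions between dialogue turns.
        turn_ctx, _ = self.turn_attn(turn_repr, turn_repr, turn_repr)    # (B, N, b)

        # Broadcast each turn's context back to its tokens; residual connection.
        token_ctx = one_hot @ turn_ctx                                   # (B, T, b)
        return hidden + self.up(token_ctx)
```

In a PEFT setup of this kind, only the adapter parameters (and optionally LoRA matrices on the attention projections) would be trained, while the backbone LLM stays frozen.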
Anthology ID:
2025.newsum-main.2
Volume:
Proceedings of The 5th New Frontiers in Summarization Workshop
Month:
November
Year:
2025
Address:
Hybrid
Editors:
Yue Dong, Wen Xiao, Haopeng Zhang, Rui Zhang, Ori Ernst, Lu Wang, Fei Liu
Venues:
NewSum | WS
Publisher:
Association for Computational Linguistics
Pages:
17–30
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.newsum-main.2/
Cite (ACL):
Raymond Li, Chuyuan Li, Gabriel Murray, and Giuseppe Carenini. 2025. Hierarchical Attention Adapter for Abstractive Dialogue Summarization. In Proceedings of The 5th New Frontiers in Summarization Workshop, pages 17–30, Hybrid. Association for Computational Linguistics.
Cite (Informal):
Hierarchical Attention Adapter for Abstractive Dialogue Summarization (Li et al., NewSum 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.newsum-main.2.pdf