DACP: Domain-Adaptive Continual Pre-Training of Large Language Models for Phone Conversation Summarization
Xue-Yong Fu, Elena Khasanova, Md Tahmid Rahman Laskar, Harsh Saini, Shashi Bhushan Tn
Abstract
Large language models (LLMs) have achieved impressive results in text summarization, yet they often fall short when applied to specialized domains that differ from their original pre-training distribution. While fine-tuning can improve summarization quality, it typically relies on costly and scarce high-quality labeled data. In this work, we explore continual pre-training as a scalable, self-supervised approach to adapt LLMs for downstream summarization tasks, particularly in the context of noisy real-world conversation transcripts. We conduct extensive experiments using large-scale, unlabeled business conversation data to investigate whether continual pre-training enhances model capabilities in conversational summarization. Our results demonstrate that continual pre-training yields substantial gains on both in-domain and out-of-domain summarization benchmarks, while maintaining strong generalization and robustness. We also analyze the effects of data selection strategies, providing practical guidelines for applying continual pre-training in summarization-focused industrial applications.
- Anthology ID: 2025.newsum-main.7
- Volume: Proceedings of The 5th New Frontiers in Summarization Workshop
- Month: November
- Year: 2025
- Address: Hybrid
- Editors: Yue Dong, Wen Xiao, Haopeng Zhang, Rui Zhang, Ori Ernst, Lu Wang, Fei Liu
- Venues: NewSum | WS
- Publisher: Association for Computational Linguistics
- Pages: 94–101
- URL: https://preview.aclanthology.org/ingest-emnlp/2025.newsum-main.7/
- Cite (ACL): Xue-Yong Fu, Elena Khasanova, Md Tahmid Rahman Laskar, Harsh Saini, and Shashi Bhushan Tn. 2025. DACP: Domain-Adaptive Continual Pre-Training of Large Language Models for Phone Conversation Summarization. In Proceedings of The 5th New Frontiers in Summarization Workshop, pages 94–101, Hybrid. Association for Computational Linguistics.
- Cite (Informal): DACP: Domain-Adaptive Continual Pre-Training of Large Language Models for Phone Conversation Summarization (Fu et al., NewSum 2025)
- PDF: https://preview.aclanthology.org/ingest-emnlp/2025.newsum-main.7.pdf
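For readers who want a concrete picture of the setup the abstract describes, the sketch below shows continual pre-training in its simplest form: further self-supervised (next-token-prediction) training of an already pretrained causal LM on unlabeled, in-domain conversation transcripts, with no summarization labels involved. This is a minimal illustration only; the model name, data format, and hyperparameters are assumptions for the sketch and are not taken from the paper.

```python
# Minimal sketch of domain-adaptive continual pre-training: continue training a
# pretrained causal LM with the standard next-token-prediction objective on
# unlabeled domain transcripts. Model name, file path, and hyperparameters are
# illustrative placeholders, not the authors' configuration.
import torch
from torch.utils.data import DataLoader, Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "gpt2"   # small placeholder; the paper adapts much larger LLMs
MAX_LEN = 1024


class TranscriptDataset(Dataset):
    """Wraps a plain-text file with one conversation transcript per line."""

    def __init__(self, path, tokenizer):
        with open(path, encoding="utf-8") as f:
            self.texts = [line.strip() for line in f if line.strip()]
        self.tokenizer = tokenizer

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        enc = self.tokenizer(
            self.texts[idx],
            truncation=True,
            max_length=MAX_LEN,
            return_tensors="pt",
        )
        input_ids = enc["input_ids"].squeeze(0)
        # Causal-LM objective: labels are the input ids themselves; the model
        # shifts them by one position internally when computing the loss.
        return {"input_ids": input_ids, "labels": input_ids.clone()}


def continual_pretrain(transcript_path="transcripts.txt", epochs=1, lr=1e-5):
    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)
    model.train()

    dataset = TranscriptDataset(transcript_path, tokenizer)
    loader = DataLoader(dataset, batch_size=1, shuffle=True)
    optimizer = torch.optim.AdamW(model.parameters(), lr=lr)

    for _ in range(epochs):
        for batch in loader:
            out = model(input_ids=batch["input_ids"], labels=batch["labels"])
            out.loss.backward()  # cross-entropy over next tokens
            optimizer.step()
            optimizer.zero_grad()


if __name__ == "__main__":
    continual_pretrain()
```

After this stage, the adapted checkpoint would be used (or further instruction-tuned) for the downstream summarization task; the point of the sketch is that the adaptation step itself needs only raw transcripts, not labeled summaries.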