Incremental Summarization for Customer Support via Progressive Note-Taking and Agent Feedback

Yisha Wu, Cen Zhao, Yuanpei Cao, Xiaoqing Xu, Yashar Mehdad, Mindy Ji, Claire Na Cheng


Abstract
We introduce an incremental summarization system for customer support agents that intelligently determines when to generate concise bullet notes during conversations, reducing agents' cognitive load and redundant review. Our approach combines a fine-tuned Mixtral-8×7B model for continuous note generation with a DeBERTa-based classifier that filters trivial content. Agent edits refine online note generation and regularly inform offline model retraining, closing the agent-feedback loop. Deployed in production, our system achieved a 3% reduction in case handling time compared to bulk summarization (with reductions of up to 9% in highly complex cases), alongside high agent satisfaction ratings from surveys. These results demonstrate that incremental summarization with continuous feedback effectively enhances summary quality and agent productivity at scale.
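
The abstract describes a three-part loop: per-turn note generation, a triviality filter, and agent edits fed back for retraining. The sketch below is an illustrative reconstruction of that loop, not the authors' implementation; `generate_note`, `score_triviality`, and the 0.5 threshold are hypothetical stand-ins for the fine-tuned Mixtral-8×7B generator, the DeBERTa-based classifier, and whatever cutoff the deployed system uses.

```python
# Minimal sketch (assumptions noted above) of an incremental summarization loop:
# draft a bullet note after each new message, suppress trivial candidates,
# and log agent edits for offline retraining.
from dataclasses import dataclass, field
from typing import Callable, List, Optional


@dataclass
class IncrementalSummarizer:
    # (transcript so far, prior notes) -> candidate bullet note
    generate_note: Callable[[str, List[str]], str]
    # (transcript so far, candidate note) -> probability the note is trivial
    score_triviality: Callable[[str, str], float]
    triviality_threshold: float = 0.5          # assumed cutoff, not from the paper
    notes: List[str] = field(default_factory=list)
    edit_log: List[dict] = field(default_factory=list)   # (original, edited) pairs for retraining

    def on_new_message(self, transcript_so_far: str) -> Optional[str]:
        """Draft a note for the latest turn and surface it only if non-trivial."""
        candidate = self.generate_note(transcript_so_far, self.notes)
        if self.score_triviality(transcript_so_far, candidate) >= self.triviality_threshold:
            return None                        # classifier says trivial: do not show the agent
        self.notes.append(candidate)
        return candidate

    def on_agent_edit(self, note_index: int, edited_text: str) -> None:
        """Apply an agent's correction and keep the edit pair as feedback."""
        self.edit_log.append({"original": self.notes[note_index], "edited": edited_text})
        self.notes[note_index] = edited_text   # edited note becomes part of the running summary
```

In use, `on_new_message` would be called as each customer or agent turn arrives, and the accumulated `edit_log` would be exported periodically as supervision for retraining the generator, mirroring the feedback loop the abstract outlines.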
Anthology ID: 2025.emnlp-industry.140
Volume: Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
Month: November
Year: 2025
Address: Suzhou (China)
Editors: Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 2000–2015
URL: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.140/
Cite (ACL): Yisha Wu, Cen Zhao, Yuanpei Cao, Xiaoqing Xu, Yashar Mehdad, Mindy Ji, and Claire Na Cheng. 2025. Incremental Summarization for Customer Support via Progressive Note-Taking and Agent Feedback. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 2000–2015, Suzhou (China). Association for Computational Linguistics.
Cite (Informal): Incremental Summarization for Customer Support via Progressive Note-Taking and Agent Feedback (Wu et al., EMNLP 2025)
PDF: https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.140.pdf