Bridging Information Gaps with Comprehensive Answers: Improving the Diversity and Informativeness of Follow-Up Questions

Zhe Liu, Taekyu Kang, Haoyu Wang, Seyed Hossein Alavi, Vered Shwartz


Abstract
Generating diverse follow-up questions that uncover missing information remains challenging for conversational agents, particularly when they run on small, locally hosted models. To tackle this problem, we develop an information-gap-driven pipeline that contrasts the initial answer with an LLM-generated comprehensive answer, identifies the information gaps, and formulates gap-bridging follow-up questions. Applying the pipeline, we augment the existing FollowupQG dataset tenfold. Experiments show that models fine-tuned on the augmented dataset achieve significantly higher informativeness and diversity than variants trained on the original dataset. These findings indicate that our pipeline, which mirrors the human cognitive process of information seeking, provides an efficient distillation channel from state-of-the-art LLMs to smaller models, enabling resource-constrained conversational systems to generate more diverse and informative follow-up questions.
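The pipeline in the abstract can be read as three stages: generate a comprehensive answer, contrast it with the initial answer to identify information gaps, and formulate gap-bridging follow-up questions. A minimal sketch of that flow, assuming a generic text-generation callable `generate` (a hypothetical stand-in for any LLM backend; the prompts below are illustrative, not the paper's actual prompts):

```python
# Sketch of the information-gap-driven pipeline described in the abstract.
# `generate` is any function mapping a prompt string to a generated string
# (hypothetical interface; substitute your own LLM client).

def comprehensive_answer(generate, question):
    """Stage 1: elicit a comprehensive answer to the original question."""
    return generate(f"Answer the following question as comprehensively "
                    f"as possible:\n{question}")

def information_gaps(generate, initial_answer, full_answer):
    """Stage 2: contrast answers to surface missing information."""
    return generate(
        "List the facts present in the second answer but missing from "
        "the first.\n"
        f"First answer: {initial_answer}\n"
        f"Second answer: {full_answer}"
    )

def followup_questions(generate, question, gaps):
    """Stage 3: turn each gap into a gap-bridging follow-up question."""
    return generate(
        f"For the question '{question}', write follow-up questions that "
        f"would elicit each of these missing pieces of information:\n{gaps}"
    )

def pipeline(generate, question, initial_answer):
    full = comprehensive_answer(generate, question)
    gaps = information_gaps(generate, initial_answer, full)
    return followup_questions(generate, question, gaps)
```

Swapping in a real model for `generate` and looping this over question-answer pairs is, at a high level, how the tenfold dataset augmentation could be produced.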
Anthology ID:
2025.starsem-1.2
Volume:
Proceedings of the 14th Joint Conference on Lexical and Computational Semantics (*SEM 2025)
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Lea Frermann, Mark Stevenson
Venue:
*SEM
Publisher:
Association for Computational Linguistics
Pages:
13–30
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.starsem-1.2/
Cite (ACL):
Zhe Liu, Taekyu Kang, Haoyu Wang, Seyed Hossein Alavi, and Vered Shwartz. 2025. Bridging Information Gaps with Comprehensive Answers: Improving the Diversity and Informativeness of Follow-Up Questions. In Proceedings of the 14th Joint Conference on Lexical and Computational Semantics (*SEM 2025), pages 13–30, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Bridging Information Gaps with Comprehensive Answers: Improving the Diversity and Informativeness of Follow-Up Questions (Liu et al., *SEM 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.starsem-1.2.pdf