FQ-Eval: Building Evaluation Dataset for User-centered Follow-up Question Generation
Sanghyun Seo, Bumsoo Kang, Dahm Lee, Jaeheon Kim, Joongbo Shin, Eui Soon Kim, Kijeong Jeon
Abstract
To effectively support users’ goal achievement in chat-LLM services, providing user-centered follow-up questions is essential. Existing studies primarily focus on enhancing information-seeking or topical relevance, often missing how follow-up questions could satisfy users’ intrinsic needs and conversational goals. To bridge this gap, we introduce FQ-Eval, a user-centered evaluation dataset designed for assessing follow-up question generation in chat-LLM services. FQ-Eval incorporates realistic chat-LLM usage scenarios and five distinct human-aligned criteria, each reflecting user expectations of effective follow-up questions. Experimental results show that FQ-Eval, constructed through our approach, clearly captures these human-aligned criteria, enabling robust, human-aligned evaluation of follow-up question generation across various models and services.
- Anthology ID:
- 2025.emnlp-industry.188
- Volume:
- Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track
- Month:
- November
- Year:
- 2025
- Address:
- Suzhou, China
- Editors:
- Saloni Potdar, Lina Rojas-Barahona, Sebastien Montella
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 2811–2827
- URL:
- https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.188/
- Cite (ACL):
- Sanghyun Seo, Bumsoo Kang, Dahm Lee, Jaeheon Kim, Joongbo Shin, Eui Soon Kim, and Kijeong Jeon. 2025. FQ-Eval: Building Evaluation Dataset for User-centered Follow-up Question Generation. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing: Industry Track, pages 2811–2827, Suzhou, China. Association for Computational Linguistics.
- Cite (Informal):
- FQ-Eval: Building Evaluation Dataset for User-centered Follow-up Question Generation (Seo et al., EMNLP 2025)
- PDF:
- https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-industry.188.pdf