INTERACT: Enabling Interactive, Question-Driven Learning in Large Language Models

Aum Kendapadi, Kerem Zaman, Rakesh R Menon, Shashank Srivastava


Abstract
Large language models (LLMs) excel at answering questions but remain passive learners—absorbing static data without the ability to question and refine knowledge. This paper explores how LLMs can transition to interactive, question-driven learning through student-teacher dialogues. We introduce INTERACT (INTERactive learning for Adaptive Concept Transfer), a framework in which a “student” LLM engages a “teacher” LLM through iterative inquiries to acquire knowledge across 1,347 contexts, including song lyrics, news articles, movie plots, academic papers, and images. Our experiments show that across a wide range of scenarios and LLM architectures, interactive learning consistently enhances performance, achieving up to a 25% improvement, with ‘cold-start’ student models matching static learning baselines in as few as five dialogue turns. Interactive setups can also mitigate the disadvantages of weaker teachers, showcasing the robustness of question-driven learning.
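
The core mechanism the abstract describes, a student LLM iteratively querying a teacher LLM to acquire knowledge about a context, can be summarized as a simple dialogue loop. The sketch below is a minimal illustration under assumed interfaces: the `student` and `teacher` callables, the prompt wording, and the fixed turn budget are all hypothetical stand-ins, not the authors' implementation (see the linked PDF for the actual pipeline).

```python
# Minimal sketch of a student-teacher dialogue loop in the spirit of INTERACT.
# Interfaces and prompts are illustrative assumptions, not the paper's code.
from typing import Callable, List, Tuple

def interact_dialogue(
    student: Callable[[str], str],   # maps dialogue history to the student's next question
    teacher: Callable[[str], str],   # maps a question to the teacher's answer
    topic: str,
    max_turns: int = 5,              # the paper reports gains in as few as five turns
) -> List[Tuple[str, str]]:
    """Run an iterative question-answer dialogue and return the transcript."""
    transcript: List[Tuple[str, str]] = []
    history = f"Topic: {topic}\n"
    for _ in range(max_turns):
        question = student(history + "Ask one question to learn more about this topic.")
        answer = teacher(question)
        transcript.append((question, answer))
        history += f"Q: {question}\nA: {answer}\n"
    return transcript

if __name__ == "__main__":
    # Stub models so the sketch runs without any LLM backend.
    student_stub = lambda prompt: "What is the main idea of the article?"
    teacher_stub = lambda question: "It describes interactive, question-driven learning."
    for q, a in interact_dialogue(student_stub, teacher_stub, "a news article", max_turns=2):
        print("Q:", q, "| A:", a)
```

After the dialogue, the student's accumulated history would serve as its learned knowledge of the context, which the paper evaluates against static (non-interactive) learning baselines.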
Anthology ID:
2025.acl-long.441
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
8992–9024
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.441/
Cite (ACL):
Aum Kendapadi, Kerem Zaman, Rakesh R Menon, and Shashank Srivastava. 2025. INTERACT: Enabling Interactive, Question-Driven Learning in Large Language Models. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 8992–9024, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
INTERACT: Enabling Interactive, Question-Driven Learning in Large Language Models (Kendapadi et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-long.441.pdf