The Role of Exploration Modules in Small Language Models for Knowledge Graph Question Answering

Yi-Jie Cheng, Oscar Chew, Yun-Nung Chen


Abstract
Integrating knowledge graphs (KGs) into the reasoning processes of large language models (LLMs) has emerged as a promising approach to mitigate hallucination. However, existing work in this area often relies on proprietary or extremely large models, limiting accessibility and scalability. In this study, we investigate the capabilities of existing integration methods for small language models (SLMs) in KG-based question answering and observe that their performance is often constrained by their limited ability to traverse and reason over knowledge graphs. To address this limitation, we propose leveraging simple and efficient exploration modules to handle knowledge graph traversal in place of the language model itself. Experimental results demonstrate that these lightweight modules effectively improve the performance of small language models on knowledge graph question answering tasks. Source code: https://github.com/yijie-cheng/SLM-ToG/.
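As a rough illustration of what such an exploration module could look like, the minimal Python sketch below prunes candidate KG relations with a cheap relevance score before handing them to the language model. The function names and the token-overlap scoring are assumptions for illustration only, not the authors' actual modules; see the released source code for the real implementation.

```python
# Hypothetical sketch of a lightweight "exploration module": instead of asking a
# small language model to decide which knowledge-graph relations to traverse, a
# cheap scoring function prunes the candidates first. All names and the
# overlap-based scoring are illustrative assumptions, not the paper's method.

def score_relation(question: str, relation: str) -> float:
    """Score a candidate KG relation by token overlap with the question."""
    q_tokens = set(question.lower().split())
    # Relations such as "film.director.film" are split on dots/underscores.
    r_tokens = set(relation.lower().replace(".", " ").replace("_", " ").split())
    if not r_tokens:
        return 0.0
    return len(q_tokens & r_tokens) / len(r_tokens)

def explore(question: str, candidate_relations: list[str], top_k: int = 3) -> list[str]:
    """Return the top-k relations most relevant to the question.

    Only these survive for the small language model to reason over,
    reducing the traversal burden placed on the model itself.
    """
    ranked = sorted(candidate_relations,
                    key=lambda r: score_relation(question, r),
                    reverse=True)
    return ranked[:top_k]

if __name__ == "__main__":
    q = "Who directed the film that won Best Picture in 1994?"
    rels = ["film.director.film", "film.film.genre",
            "award.award_winner.awards_won", "people.person.place_of_birth"]
    print(explore(q, rels, top_k=2))
```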
Anthology ID:
2025.acl-srw.67
Volume:
Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop)
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Jin Zhao, Mingyang Wang, Zhu Liu
Venues:
ACL | WS
Publisher:
Association for Computational Linguistics
Pages:
919–928
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-srw.67/
Cite (ACL):
Yi-Jie Cheng, Oscar Chew, and Yun-Nung Chen. 2025. The Role of Exploration Modules in Small Language Models for Knowledge Graph Question Answering. In Proceedings of the 63rd Annual Meeting of the Association for Computational Linguistics (Volume 4: Student Research Workshop), pages 919–928, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
The Role of Exploration Modules in Small Language Models for Knowledge Graph Question Answering (Cheng et al., ACL 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.acl-srw.67.pdf