Question-Aware Knowledge Graph Prompting for Enhancing Large Language Models

Haochen Liu, Song Wang, Chen Chen, Jundong Li


Abstract
Large Language Models (LLMs) often struggle with tasks requiring external knowledge, such as knowledge-intensive Multiple Choice Question Answering (MCQA). Integrating Knowledge Graphs (KGs) can enhance reasoning; however, existing methods typically demand costly fine-tuning or retrieve noisy KG information. Recent approaches leverage Graph Neural Networks (GNNs) to generate KG-based input embedding prefixes as soft prompts for LLMs but fail to account for question relevance, resulting in noisy prompts. Moreover, in MCQA tasks, the absence of relevant KG knowledge for certain answer options remains a significant challenge. To address these issues, we propose Question-Aware Knowledge Graph Prompting (QAP), which incorporates question embeddings into GNN aggregation to dynamically assess KG relevance. QAP employs global attention to capture inter-option relationships, enriching soft prompts with inferred knowledge. Experimental results demonstrate that QAP outperforms state-of-the-art methods across multiple datasets, highlighting its effectiveness.
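To make the architecture described in the abstract concrete, below is a minimal sketch, assuming a PyTorch setting. All module names, tensor shapes, the single sigmoid-gated message-passing scheme, and the mean-pooling step are illustrative assumptions, not the authors' released implementation; they only show how question embeddings could condition GNN aggregation and how inter-option attention could produce soft-prompt vectors for an LLM.

# Illustrative sketch of question-aware KG prompting (hypothetical names and
# shapes; not the paper's released code).
import torch
import torch.nn as nn

class QuestionAwareGNNLayer(nn.Module):
    """One message-passing layer whose edge weights are conditioned on the
    question embedding, so KG edges irrelevant to the question are down-weighted."""
    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(dim, dim)
        self.att = nn.Linear(3 * dim, 1)  # scores (target, source, question) triples

    def forward(self, node_emb, edge_index, q_emb):
        # node_emb: (num_nodes, dim); edge_index: (2, num_edges); q_emb: (dim,)
        src, dst = edge_index
        q = q_emb.unsqueeze(0).expand(src.size(0), -1)
        scores = self.att(torch.cat([node_emb[dst], node_emb[src], q], dim=-1)).squeeze(-1)
        weights = torch.sigmoid(scores)              # question-conditioned relevance per edge
        messages = weights.unsqueeze(-1) * self.msg(node_emb[src])
        out = torch.zeros_like(node_emb)
        out.index_add_(0, dst, messages)             # aggregate weighted messages per target node
        return torch.relu(out + node_emb)            # residual update

class QAPSoftPrompt(nn.Module):
    """Builds soft prompts: one pooled KG embedding per answer option, a global
    attention step that lets options exchange information, then a projection
    into the LLM embedding space so the vectors can be prepended as a prefix."""
    def __init__(self, dim, llm_dim, num_layers=2, num_heads=4):
        super().__init__()
        self.layers = nn.ModuleList(QuestionAwareGNNLayer(dim) for _ in range(num_layers))
        self.cross_option = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.project = nn.Linear(dim, llm_dim)

    def forward(self, option_graphs, q_emb):
        # option_graphs: list of (node_emb, edge_index), one retrieved subgraph per option
        pooled = []
        for node_emb, edge_index in option_graphs:
            h = node_emb
            for layer in self.layers:
                h = layer(h, edge_index, q_emb)
            pooled.append(h.mean(dim=0))             # pool each subgraph to a single vector
        opts = torch.stack(pooled).unsqueeze(0)       # (1, num_options, dim)
        fused, _ = self.cross_option(opts, opts, opts)  # inter-option global attention
        return self.project(fused.squeeze(0))         # (num_options, llm_dim) soft-prompt vectors

In this sketch, the returned vectors would be concatenated in front of the LLM's input token embeddings for the question and options, with the LLM itself kept frozen; how the paper actually injects the prefix and trains the GNN is not specified here and is left as an assumption.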
Anthology ID:
2025.findings-acl.72
Volume:
Findings of the Association for Computational Linguistics: ACL 2025
Month:
July
Year:
2025
Address:
Vienna, Austria
Editors:
Wanxiang Che, Joyce Nabende, Ekaterina Shutova, Mohammad Taher Pilehvar
Venues:
Findings | WS
Publisher:
Association for Computational Linguistics
Pages:
1388–1400
URL:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.72/
Cite (ACL):
Haochen Liu, Song Wang, Chen Chen, and Jundong Li. 2025. Question-Aware Knowledge Graph Prompting for Enhancing Large Language Models. In Findings of the Association for Computational Linguistics: ACL 2025, pages 1388–1400, Vienna, Austria. Association for Computational Linguistics.
Cite (Informal):
Question-Aware Knowledge Graph Prompting for Enhancing Large Language Models (Liu et al., Findings 2025)
PDF:
https://preview.aclanthology.org/ingestion-acl-25/2025.findings-acl.72.pdf