Question-guided Knowledge Graph Re-scoring and Injection for Knowledge Graph Question Answering

Yu Zhang, Kehai Chen, Xuefeng Bai, Zhao Kang, Quanjiang Guo, Min Zhang


Abstract
Knowledge graph question answering (KGQA) involves answering natural language questions by leveraging structured information stored in a knowledge graph. Typically, KGQA first retrieves a targeted subgraph from a large-scale knowledge graph, which serves as the basis for reasoning models to address queries. However, the retrieved subgraph inevitably contains distracting information, impeding the model’s ability to perform accurate reasoning. To address this issue, we propose a Question-guided Knowledge Graph Re-scoring method (Q-KGR) that eliminates noisy pathways irrelevant to the input question, thereby focusing specifically on pertinent factual knowledge. Moreover, we introduce Knowformer, a parameter-efficient method for injecting the re-scored knowledge graph into large language models to enhance their ability to perform factual reasoning. Extensive experiments on multiple KGQA benchmarks demonstrate the superiority of our method over existing systems.
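The abstract does not spell out how the re-scoring is computed; the snippet below is a minimal sketch, assuming edge relevance is scored by similarity between a question encoding and per-edge encodings of the retrieved subgraph. The function name, the cosine-similarity scoring, and the pruning threshold are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def rescore_edges(question_emb: torch.Tensor,
                  edge_embs: torch.Tensor,
                  tau: float = 0.1) -> torch.Tensor:
    """Hypothetical question-guided edge re-scoring.

    question_emb: (d,) encoding of the input question.
    edge_embs:    (num_edges, d) encodings of subgraph edges (triples).
    Returns one relevance weight in (0, 1) per edge; low-weight edges
    can be treated as noise and down-weighted or pruned.
    """
    # Score each edge by its similarity to the question, then squash
    # the scores into soft relevance weights with a temperature tau.
    scores = F.cosine_similarity(edge_embs, question_emb.unsqueeze(0), dim=-1)
    return torch.sigmoid(scores / tau)

# Usage sketch: keep only edges whose relevance exceeds a threshold
# before handing the subgraph to the reasoning model.
question = torch.randn(128)
edges = torch.randn(40, 128)
weights = rescore_edges(question, edges)
pruned_edges = edges[weights > 0.5]
```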
Anthology ID:
2024.findings-emnlp.524
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8972–8985
URL:
https://aclanthology.org/2024.findings-emnlp.524
DOI:
10.18653/v1/2024.findings-emnlp.524
Cite (ACL):
Yu Zhang, Kehai Chen, Xuefeng Bai, Zhao Kang, Quanjiang Guo, and Min Zhang. 2024. Question-guided Knowledge Graph Re-scoring and Injection for Knowledge Graph Question Answering. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 8972–8985, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Question-guided Knowledge Graph Re-scoring and Injection for Knowledge Graph Question Answering (Zhang et al., Findings 2024)
PDF:
https://preview.aclanthology.org/landing_page/2024.findings-emnlp.524.pdf