Mixture of Length and Pruning Experts for Knowledge Graphs Reasoning

Enjun Du, Siyi Liu, Yongqi Zhang


Abstract
Knowledge Graph (KG) reasoning, which aims to infer new facts from structured knowledge repositories, plays a vital role in Natural Language Processing (NLP) systems. Its effectiveness critically depends on constructing informative and contextually relevant reasoning paths. However, existing graph neural networks (GNNs) often adopt rigid, query-agnostic path-exploration strategies, limiting their ability to adapt to diverse linguistic contexts and semantic nuances. To address these limitations, we propose MoKGR, a mixture-of-experts framework that personalizes path exploration through two complementary components: (1) a mixture of length experts that adaptively selects and weights candidate path lengths according to query complexity, providing query-specific reasoning depth; and (2) a mixture of pruning experts that evaluates candidate paths from a complementary perspective, retaining the most informative paths for each query. Through comprehensive experiments on diverse benchmarks, MoKGR demonstrates superior performance in both transductive and inductive settings, validating the effectiveness of personalized path exploration in KG reasoning.
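
The sketch below is purely illustrative and is not taken from the paper: it shows, under assumed names and shapes (LengthGate, d_model, prune_paths, keep_ratio), what a generic mixture-of-experts gate over candidate path lengths and a top-k path-pruning step could look like, in the spirit of the two components the abstract describes.

```python
# Illustrative sketch only; module and variable names are hypothetical,
# not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LengthGate(nn.Module):
    """Scores candidate path lengths for a query and mixes per-length features."""

    def __init__(self, d_model: int, candidate_lengths=(2, 3, 4, 5)):
        super().__init__()
        self.candidate_lengths = candidate_lengths
        # One gating score per candidate length, conditioned on the query embedding.
        self.gate = nn.Linear(d_model, len(candidate_lengths))

    def forward(self, query_emb: torch.Tensor, per_length_feats: torch.Tensor):
        # query_emb:        [batch, d_model]
        # per_length_feats: [batch, num_lengths, d_model], one feature per path length
        weights = F.softmax(self.gate(query_emb), dim=-1)          # [batch, num_lengths]
        mixed = (weights.unsqueeze(-1) * per_length_feats).sum(1)  # [batch, d_model]
        return mixed, weights


def prune_paths(path_scores: torch.Tensor, keep_ratio: float = 0.25):
    """Keep only the top-scoring candidate paths per query (hypothetical pruning step)."""
    # path_scores: [batch, num_paths]
    k = max(1, int(path_scores.size(1) * keep_ratio))
    return path_scores.topk(k, dim=1)


if __name__ == "__main__":
    batch, d_model, num_lengths, num_paths = 2, 64, 4, 100
    gate = LengthGate(d_model)
    mixed, w = gate(torch.randn(batch, d_model),
                    torch.randn(batch, num_lengths, d_model))
    scores, idx = prune_paths(torch.randn(batch, num_paths))
    print(mixed.shape, w.shape, scores.shape, idx.shape)
```

In such a setup the length gate would give each query its own weighting over reasoning depths, while the pruning step would cap the number of paths carried forward; how MoKGR actually scores lengths and paths is detailed in the paper itself.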
Anthology ID:
2025.emnlp-main.23
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
432–453
URL:
https://preview.aclanthology.org/ingest-luhme/2025.emnlp-main.23/
DOI:
10.18653/v1/2025.emnlp-main.23
Cite (ACL):
Enjun Du, Siyi Liu, and Yongqi Zhang. 2025. Mixture of Length and Pruning Experts for Knowledge Graphs Reasoning. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 432–453, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Mixture of Length and Pruning Experts for Knowledge Graphs Reasoning (Du et al., EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-luhme/2025.emnlp-main.23.pdf
Checklist:
2025.emnlp-main.23.checklist.pdf