SRM-LLM: Semantic Relationship Mining with LLMs for Temporal Knowledge Graph Extrapolation
Fu Zhang | Panfeng Zhang | Jingwei Cheng
Findings of the Association for Computational Linguistics: EMNLP 2025
Temporal knowledge graph (TKG) extrapolation aims to predict future facts by modeling the dynamic evolution of historical facts within TKGs. Existing methods often neglect the complex semantic relationships between relations when modeling their dynamic evolution, leading to incomplete relation representations and reduced reasoning accuracy. Inspired by advances in large language models (LLMs), we propose Semantic Relationship Mining based on LLMs (SRM-LLM), a novel approach that extracts semantic relationships for TKG extrapolation. By leveraging LLMs to analyze relation types, we first identify several common relation types (e.g., causal, synonymous) in TKGs. We then design an LLM-based prompting strategy to capture latent semantic connections between relations, enabling the construction of relational association subgraphs for relation representation learning. In addition, SRM-LLM further enhances reasoning by incorporating structured logical constraints to guide inference. Experiments on five TKG datasets show significant performance gains and new state-of-the-art (SOTA) results, confirming the effectiveness of our method for TKG extrapolation.
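To make the prompting step concrete, the sketch below illustrates one plausible way to query an LLM about pairs of TKG relations and assemble the predicted links into a relational association subgraph, as described in the abstract. The prompt wording, the label set, the relation names, and the `query_llm` stub are illustrative assumptions, not the authors' released implementation.

```python
# Hypothetical sketch of LLM-based semantic relationship mining between TKG relations.
from collections import defaultdict

RELATION_TYPES = ["causal", "synonymous", "temporal", "unrelated"]  # assumed label set


def build_prompt(rel_a: str, rel_b: str) -> str:
    """Compose a classification prompt for a pair of TKG relations."""
    return (
        "Given two relations from a temporal knowledge graph, decide how they are "
        f"semantically connected. Answer with one of {RELATION_TYPES}.\n"
        f"Relation A: {rel_a}\nRelation B: {rel_b}\nAnswer:"
    )


def query_llm(prompt: str) -> str:
    """Placeholder for an actual LLM call; returns a canned label here."""
    return "causal"


def mine_association_subgraph(relations: list[str]) -> dict[str, list[tuple[str, str]]]:
    """Link every relation pair whose predicted relationship type is not 'unrelated'."""
    subgraph: dict[str, list[tuple[str, str]]] = defaultdict(list)
    for i, rel_a in enumerate(relations):
        for rel_b in relations[i + 1:]:
            label = query_llm(build_prompt(rel_a, rel_b)).strip().lower()
            if label in RELATION_TYPES and label != "unrelated":
                # Add an undirected edge labeled with the predicted relationship type.
                subgraph[rel_a].append((rel_b, label))
                subgraph[rel_b].append((rel_a, label))
    return dict(subgraph)


if __name__ == "__main__":
    # Example relation names (illustrative only).
    rels = ["Make_statement", "Accuse", "Threaten"]
    print(mine_association_subgraph(rels))
```

In such a setup, the resulting subgraph could then serve as extra structure for relation representation learning, with the logical constraints mentioned in the abstract applied at inference time.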