Panfeng Zhang




2025

SRM-LLM: Semantic Relationship Mining with LLMs for Temporal Knowledge Graph Extrapolation
Fu Zhang | Panfeng Zhang | Jingwei Cheng
Findings of the Association for Computational Linguistics: EMNLP 2025

Temporal knowledge graph (TKG) extrapolation aims to predict future facts by modeling the dynamic evolution of historical facts within TKGs. Existing methods often neglect the complex semantic relationships between relations when modeling their dynamic evolution, leading to incomplete relation representations and reduced reasoning accuracy. Inspired by advances in large language models (LLMs), we propose Semantic Relationship Mining based on LLMs (SRM-LLM), a novel approach that extracts semantic relationships for TKG extrapolation. By leveraging LLMs to analyze relation types, we first identify several common relation types (e.g., causal, synonymous) in TKGs. We then design an LLM-based prompting strategy to capture latent semantic connections between relations, enabling the construction of relational association subgraphs for relation representation learning. In addition, SRM-LLM further enhances reasoning by incorporating structured logical constraints to guide inference. Experiments on five TKG datasets show significant performance gains and new state-of-the-art (SOTA) results, confirming the effectiveness of our method on TKG extrapolation tasks.