Temporal Extrapolation and Knowledge Transfer for Lifelong Temporal Knowledge Graph Reasoning

Zhongwu Chen, Chengjin Xu, Fenglong Su, Zhen Huang, Yong Dou


Abstract
Real-world Temporal Knowledge Graphs (TKGs) keep growing over time, with new entities and facts emerging continually, which calls for a model that can both extrapolate to future timestamps and transfer knowledge to new components. Our work is the first to dive into this more realistic setting, lifelong TKG reasoning, where existing methods can address only part of the challenges. Specifically, we formulate lifelong TKG reasoning as a temporal-path-based reinforcement learning (RL) framework. We then add temporal displacement to the RL action space to extrapolate to the future, and further propose temporal-rule-based reward shaping to guide training. To transfer and update knowledge, we design a new edge-aware message passing module in which the embeddings of new entities and edges are produced inductively. We conduct extensive experiments on three newly constructed benchmarks for lifelong TKG reasoning. Experimental results show that our model outperforms all well-adapted baselines.
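The abstract's central idea of adding temporal displacement to the RL action space can be illustrated in miniature: at each step a path-based agent chooses among outgoing edges, and each candidate edge is augmented with its displacement from the query timestamp so the policy can reason about how far in time a move takes it. The sketch below is purely illustrative; the `Edge` type, `action_space` helper, and displacement feature are assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass

# A TKG fact: (subject, relation, object, timestamp).
@dataclass(frozen=True)
class Edge:
    src: str
    rel: str
    dst: str
    t: int

def action_space(graph, entity, query_time):
    """Candidate actions for a path-based RL agent at `entity`.

    Each action pairs an outgoing edge with its temporal displacement
    (edge time minus query time), giving the policy an explicit signal
    for extrapolating toward future query timestamps.
    """
    actions = []
    for e in graph:
        if e.src == entity:
            dt = e.t - query_time  # temporal displacement feature
            actions.append((e, dt))
    return actions

# Toy graph: three timestamped facts.
g = [
    Edge("A", "visits", "B", 3),
    Edge("A", "meets", "C", 9),
    Edge("B", "calls", "C", 5),
]
acts = action_space(g, "A", query_time=10)
# Displacements are -7 and -1: the later fact is temporally closer
# to the query, which a learned policy could exploit.
```

In a full system these displacement features would be embedded and fed to the policy network alongside relation and entity embeddings; here they are left as raw integers to keep the sketch self-contained.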
Anthology ID:
2023.findings-emnlp.448
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6736–6746
URL:
https://aclanthology.org/2023.findings-emnlp.448
DOI:
10.18653/v1/2023.findings-emnlp.448
Cite (ACL):
Zhongwu Chen, Chengjin Xu, Fenglong Su, Zhen Huang, and Yong Dou. 2023. Temporal Extrapolation and Knowledge Transfer for Lifelong Temporal Knowledge Graph Reasoning. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 6736–6746, Singapore. Association for Computational Linguistics.
Cite (Informal):
Temporal Extrapolation and Knowledge Transfer for Lifelong Temporal Knowledge Graph Reasoning (Chen et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2023.findings-emnlp.448.pdf