TR-Rules: Rule-based Model for Link Forecasting on Temporal Knowledge Graph Considering Temporal Redundancy

Ningyuan Li, Haihong E, Shi Li, Mingzhi Sun, Tianyu Yao, Meina Song, Yong Wang, Haoran Luo


Abstract
Temporal knowledge graphs (TKGs) have proven to be an effective way of modeling dynamic real-world facts. Many efforts have been devoted to predicting future events on TKGs, i.e., extrapolation. Recently, rule-based knowledge graph completion methods, which are considered more interpretable than embedding-based methods, have been transferred to temporal knowledge graph extrapolation. However, rule-based models suffer from temporal redundancy when applied in dynamic settings, which leads to inaccurate rule confidence calculation. In this paper, we define the problem of temporal redundancy and propose TR-Rules, which resolves the temporal redundancy issue through a simple but effective strategy. In addition, to capture more of the information lurking in TKGs, TR-Rules mines and properly leverages acyclic rules alongside cyclic rules, which has not been explored by existing models. Experimental results on three benchmarks show that TR-Rules achieves state-of-the-art performance. An ablation study shows the impact of temporal redundancy and demonstrates that the performance of acyclic rules is much more promising due to their higher sensitivity to the number of sampled walks during the learning stage.
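The abstract states that temporal redundancy distorts rule confidence calculation but does not spell out the remedy. The Python sketch below is a hypothetical illustration, not the TR-Rules implementation: it assumes the standard confidence estimate (supported body groundings divided by all body groundings) and shows how repeated timestamped occurrences of the same fact can inflate that estimate, and how collapsing groundings that differ only in timestamp changes the score. The names `Quad`, `naive_confidence`, and `dedup_confidence` are invented for this example.

```python
# Illustrative sketch (not the authors' code): confidence of a temporal rule
# body(X, Y, T1) => head(X, Y, T2) is commonly estimated as
#   (# body groundings that also satisfy the head) / (# body groundings).
# If the same (subject, relation, object) fact recurs at many timestamps,
# naive counting inflates both counts ("temporal redundancy").
# One simple remedy, assumed here purely for illustration, is to collapse
# groundings that differ only in their timestamps before counting.

from collections import namedtuple

Quad = namedtuple("Quad", ["subj", "rel", "obj", "ts"])  # (s, r, o, t) fact


def naive_confidence(body_groundings, head_pairs):
    """Count every timestamped body grounding separately."""
    support = sum(
        1 for g in body_groundings
        if (g.subj, g.obj) in head_pairs  # head holds for this entity pair
    )
    return support / len(body_groundings) if body_groundings else 0.0


def dedup_confidence(body_groundings, head_pairs):
    """Collapse groundings that differ only in timestamp (hypothetical fix)."""
    unique_pairs = {(g.subj, g.obj) for g in body_groundings}
    support = sum(1 for pair in unique_pairs if pair in head_pairs)
    return support / len(unique_pairs) if unique_pairs else 0.0


if __name__ == "__main__":
    # The same body grounding (A, consults, B) repeated at 5 timestamps,
    # plus one distinct grounding (C, consults, D).
    body = [Quad("A", "consults", "B", t) for t in range(5)]
    body += [Quad("C", "consults", "D", 0)]
    heads = {("A", "B")}  # the rule head only holds for (A, B)

    print(naive_confidence(body, heads))  # 5/6 ~= 0.83, inflated by repetition
    print(dedup_confidence(body, heads))  # 1/2 = 0.50, redundancy removed
```

How TR-Rules actually handles redundant groundings is specified in the paper itself; the sketch only makes the inflation effect described in the abstract concrete.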
Anthology ID:
2023.findings-emnlp.529
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7885–7894
URL:
https://aclanthology.org/2023.findings-emnlp.529
DOI:
10.18653/v1/2023.findings-emnlp.529
Cite (ACL):
Ningyuan Li, Haihong E, Shi Li, Mingzhi Sun, Tianyu Yao, Meina Song, Yong Wang, and Haoran Luo. 2023. TR-Rules: Rule-based Model for Link Forecasting on Temporal Knowledge Graph Considering Temporal Redundancy. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 7885–7894, Singapore. Association for Computational Linguistics.
Cite (Informal):
TR-Rules: Rule-based Model for Link Forecasting on Temporal Knowledge Graph Considering Temporal Redundancy (Li et al., Findings 2023)
PDF:
https://preview.aclanthology.org/nschneid-patch-2/2023.findings-emnlp.529.pdf