Abstract
There has been increasing interest in inferring future links on temporal knowledge graphs (KGs). While links on temporal KGs vary continuously over time, existing approaches model temporal KGs in discrete state spaces. To bridge this gap, we propose a novel continuum model that extends the idea of neural ordinary differential equations (ODEs) to multi-relational graph convolutional networks. The proposed model preserves the continuous nature of dynamic multi-relational graph data and encodes both temporal and structural information into continuous-time dynamic embeddings. In addition, a novel graph transition layer is applied to capture transitions on the dynamic graph, i.e., edge formation and dissolution. We perform extensive experiments on five benchmark datasets for temporal KG reasoning, showing our model’s superior performance on the future link forecasting task.
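For intuition, here is a minimal, self-contained sketch of the continuous-time idea the abstract describes: node embeddings evolve according to an ODE whose vector field is a graph convolution, evaluated at an arbitrary query time. The single-relation simplification, random weights, fixed-step Euler solver, and all names (gcn_field, euler_integrate) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def gcn_field(h, adj_norm, weight):
    """Vector field dh/dt = tanh(A_norm @ h @ W): a graph-convolutional update."""
    return np.tanh(adj_norm @ h @ weight)

def euler_integrate(h0, adj_norm, weight, t0, t1, steps=20):
    """Fixed-step Euler solver: evolves node embeddings from time t0 to t1."""
    h, dt = h0, (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * gcn_field(h, adj_norm, weight)
    return h

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    num_nodes, dim = 5, 8
    # Toy single-relation adjacency, row-normalized (the paper handles multiple relations).
    adj = (rng.random((num_nodes, num_nodes)) < 0.4).astype(float)
    adj_norm = adj / np.maximum(adj.sum(axis=1, keepdims=True), 1.0)
    h0 = rng.standard_normal((num_nodes, dim))
    weight = rng.standard_normal((dim, dim)) * 0.1
    # Embeddings at a future timestamp t=1.0, which a decoder could score for links.
    h_t = euler_integrate(h0, adj_norm, weight, t0=0.0, t1=1.0)
    print(h_t.shape)  # (5, 8)
```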
- Anthology ID:
- 2021.emnlp-main.658
- Volume:
- Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
- Month:
- November
- Year:
- 2021
- Address:
- Online and Punta Cana, Dominican Republic
- Editors:
- Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
- Venue:
- EMNLP
- Publisher:
- Association for Computational Linguistics
- Pages:
- 8352–8364
- URL:
- https://aclanthology.org/2021.emnlp-main.658
- DOI:
- 10.18653/v1/2021.emnlp-main.658
- Cite (ACL):
- Zhen Han, Zifeng Ding, Yunpu Ma, Yujia Gu, and Volker Tresp. 2021. Learning Neural Ordinary Equations for Forecasting Future Links on Temporal Knowledge Graphs. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 8352–8364, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
- Cite (Informal):
- Learning Neural Ordinary Equations for Forecasting Future Links on Temporal Knowledge Graphs (Han et al., EMNLP 2021)
- PDF:
- https://preview.aclanthology.org/nschneid-patch-1/2021.emnlp-main.658.pdf
- Data
- ICEWS