Continuous Temporal Graph Networks for Event-Based Graph Data

Jin Guo, Zhen Han, Su Zhou, Jiliang Li, Volker Tresp, Yuyi Wang


Abstract
There has been increasing interest in modeling the continuous-time dynamics of temporal graph data. Previous methods encode time-evolving relational information into a low-dimensional representation by specifying discrete layers of neural networks, while real-world dynamic graphs often vary continuously over time. Hence, we propose Continuous Temporal Graph Networks (CTGNs) to capture the continuous dynamics of temporal graph data. We use both link starting timestamps and link durations as evolving information to model the continuous dynamics of nodes. The key idea is to use neural ordinary differential equations (ODEs) to characterize the continuous dynamics of node representations over dynamic graphs. We parameterize the ordinary differential equations using a novel graph neural network. Existing dynamic graph networks can be viewed as specific discretizations of CTGNs. Experimental results on both transductive and inductive tasks demonstrate the effectiveness of our proposed approach over competitive baselines.
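The key idea above can be illustrated with a minimal sketch: node embeddings h(t) evolve according to an ODE dh/dt = f(h) whose right-hand side aggregates neighbor states, and a fixed-step Euler solver recovers a discrete stack of graph layers as a special case. All names, shapes, and the specific ODE function below are illustrative assumptions, not the paper's actual CTGN architecture.

```python
import numpy as np

# Illustrative sketch (NOT the paper's architecture): node embeddings
# h(t) evolve continuously via dh/dt = f(h), where f mixes each node's
# state with its neighbors' states through a normalized adjacency.

rng = np.random.default_rng(0)
n_nodes, dim = 4, 8

# Toy 4-node graph with self-loops, row-normalized for aggregation.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
A = A / A.sum(axis=1, keepdims=True)

W = rng.normal(scale=0.1, size=(dim, dim))  # illustrative ODE weights

def ode_func(h):
    """dh/dt: aggregate neighbor states, then apply a tanh transform."""
    return np.tanh(A @ h @ W)

def integrate(h0, t0, t1, steps=50):
    """Fixed-step Euler solver. Each Euler step resembles one discrete
    graph layer, mirroring the abstract's point that existing dynamic
    graph networks are specific discretizations of the continuous model."""
    h = h0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * ode_func(h)
    return h

h0 = rng.normal(size=(n_nodes, dim))   # embeddings at an event's start time
h1 = integrate(h0, t0=0.0, t1=1.0)    # embeddings after a link's duration
print(h1.shape)
```

In this toy setting, the interval [t0, t1] stands in for an event's timestamp and duration; a real implementation would use an adaptive ODE solver and a learned, time-aware message function.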
Anthology ID:
2022.dlg4nlp-1.3
Volume:
Proceedings of the 2nd Workshop on Deep Learning on Graphs for Natural Language Processing (DLG4NLP 2022)
Month:
July
Year:
2022
Address:
Seattle, Washington
Editors:
Lingfei Wu, Bang Liu, Rada Mihalcea, Jian Pei, Yue Zhang, Yunyao Li
Venue:
DLG4NLP
Publisher:
Association for Computational Linguistics
Pages:
22–29
URL:
https://aclanthology.org/2022.dlg4nlp-1.3
DOI:
10.18653/v1/2022.dlg4nlp-1.3
Cite (ACL):
Jin Guo, Zhen Han, Su Zhou, Jiliang Li, Volker Tresp, and Yuyi Wang. 2022. Continuous Temporal Graph Networks for Event-Based Graph Data. In Proceedings of the 2nd Workshop on Deep Learning on Graphs for Natural Language Processing (DLG4NLP 2022), pages 22–29, Seattle, Washington. Association for Computational Linguistics.
Cite (Informal):
Continuous Temporal Graph Networks for Event-Based Graph Data (Guo et al., DLG4NLP 2022)
PDF:
https://preview.aclanthology.org/nschneid-patch-4/2022.dlg4nlp-1.3.pdf
Data
Reddit