Search to Pass Messages for Temporal Knowledge Graph Completion

Zhen Wang, Haotong Du, Quanming Yao, Xuelong Li


Abstract
Completing missing facts is a fundamental task for temporal knowledge graphs (TKGs). Recently, graph neural network (GNN) based methods, which can simultaneously explore topological and temporal information, have become the state of the art (SOTA) for completing TKGs. However, these studies rely on hand-designed architectures and fail to explore the diverse topological and temporal properties of TKGs. To address this issue, we propose to use neural architecture search (NAS) to design data-specific message passing architectures for TKG completion. In particular, we develop a generalized framework to explore topological and temporal information in TKGs. Based on this framework, we design an expressive search space to fully capture various properties of different TKGs. Meanwhile, we adopt a search algorithm that trains a supernet structure by sampling a single path, enabling efficient search at lower cost. We further conduct extensive experiments on three benchmark datasets. The results show that the architectures searched by our method achieve SOTA performance. Besides, the searched models can also implicitly reveal diverse properties of different TKGs. Our code is released at https://github.com/striderdu/SPA.
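
The abstract mentions training a supernet by sampling a single path for efficient search. As a rough illustration only, not the authors' SPA implementation, the PyTorch sketch below shows a message-passing layer that holds several candidate operators and activates exactly one randomly sampled operator per forward pass; the class names (CandidateMessageOp, SinglePathLayer), operator choices, and dimensions are hypothetical.

```python
# Minimal sketch of single-path supernet sampling for a message-passing layer.
# NOT the authors' SPA code; operators, names, and sizes are illustrative.
import random
import torch
import torch.nn as nn

class CandidateMessageOp(nn.Module):
    """One candidate op: aggregate neighbor embeddings (dense adjacency
    for simplicity), then apply a linear transform and a nonlinearity."""
    def __init__(self, dim, activation):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
        self.activation = activation

    def forward(self, h, adj):
        msg = adj @ h                      # sum messages from neighbors
        return self.activation(self.linear(msg))

class SinglePathLayer(nn.Module):
    """Supernet layer: several candidate ops share the layer slot, and each
    forward pass samples exactly one of them (the 'single path')."""
    def __init__(self, dim):
        super().__init__()
        self.ops = nn.ModuleList([
            CandidateMessageOp(dim, torch.relu),
            CandidateMessageOp(dim, torch.tanh),
            CandidateMessageOp(dim, torch.sigmoid),
        ])

    def forward(self, h, adj, op_index=None):
        if op_index is None:               # sample a random path this step
            op_index = random.randrange(len(self.ops))
        return self.ops[op_index](h, adj)

# Toy usage: only the sampled operator's parameters get gradients this step.
num_entities, dim = 100, 32
layer = SinglePathLayer(dim)
h = torch.randn(num_entities, dim, requires_grad=True)
adj = (torch.rand(num_entities, num_entities) < 0.05).float()
out = layer(h, adj)
out.pow(2).mean().backward()
```

In a full NAS setup, repeated single-path sampling trains all candidates' weights within one supernet, after which the best-performing path can be selected and retrained; the details of the actual search space and evaluation procedure are described in the paper itself.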
Anthology ID:
2022.findings-emnlp.458
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6160–6172
URL:
https://aclanthology.org/2022.findings-emnlp.458
Cite (ACL):
Zhen Wang, Haotong Du, Quanming Yao, and Xuelong Li. 2022. Search to Pass Messages for Temporal Knowledge Graph Completion. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 6160–6172, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Search to Pass Messages for Temporal Knowledge Graph Completion (Wang et al., Findings 2022)
PDF:
https://preview.aclanthology.org/ingestion-script-update/2022.findings-emnlp.458.pdf