Distance-Based Propagation for Efficient Knowledge Graph Reasoning

Harry Shomer, Yao Ma, Juanhui Li, Bo Wu, Charu Aggarwal, Jiliang Tang


Abstract
Knowledge graph completion (KGC) aims to predict unseen edges in knowledge graphs (KGs), resulting in the discovery of new facts. A new class of methods has been proposed to tackle this problem by aggregating path information. These methods have shown tremendous ability in the task of KGC. However, they are plagued by efficiency issues. Though there are a few recent attempts to address this through learnable path pruning, they often sacrifice performance to gain efficiency. In this work, we identify two intrinsic limitations of these methods that affect efficiency and representation quality. To address these limitations, we introduce a new method, TAGNet, which is able to propagate information efficiently. This is achieved by only aggregating paths in a fixed window for each source-target pair. We demonstrate that the complexity of TAGNet is independent of the number of layers. Extensive experiments demonstrate that TAGNet can cut down on the number of propagated messages by as much as 90% while achieving competitive performance on multiple KG datasets.
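To make the fixed-window idea concrete, the sketch below contrasts full path-based propagation (where, at layer t, every node within distance t of the source passes messages, as in NBFNet-style models) with propagation restricted to a fixed distance window of width delta. This is an illustrative assumption reconstructed from the abstract only, not the authors' TAGNet implementation; the toy graph, the delta value, and the helper names are hypothetical.

```python
# Minimal sketch (not the paper's code): message counts under full vs.
# fixed-window propagation from a single source node.
from collections import deque, defaultdict


def shortest_distances(adj, source):
    """BFS shortest-path distances from the source node."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist


def count_messages(adj, source, num_layers, delta=None):
    """Count edges (messages) touched across all layers.

    Full propagation (delta=None): at layer t, every node with distance <= t
    passes messages. Windowed propagation: only nodes whose distance lies in
    [t - delta, t] pass messages, so each edge is used at most delta + 1
    times regardless of the number of layers.
    """
    dist = shortest_distances(adj, source)
    total = 0
    for t in range(1, num_layers + 1):
        for u, nbrs in adj.items():
            d = dist.get(u, float("inf"))
            active = d <= t if delta is None else (t - delta) <= d <= t
            if active:
                total += len(nbrs)
    return total


if __name__ == "__main__":
    # Tiny undirected toy graph: a path 0-1-2-3-4 plus a chord 1-3.
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (1, 3)]
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    full = count_messages(adj, source=0, num_layers=6)
    windowed = count_messages(adj, source=0, num_layers=6, delta=2)
    print(f"full propagation messages:       {full}")
    print(f"fixed-window (delta=2) messages: {windowed}")
```

On this toy graph the windowed variant touches roughly half as many edges (29 vs. 53), and the gap widens with more layers, since under the window each edge is visited at most delta + 1 times no matter how deep the propagation runs.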
Anthology ID:
2023.emnlp-main.908
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14692–14707
URL:
https://aclanthology.org/2023.emnlp-main.908
DOI:
10.18653/v1/2023.emnlp-main.908
Cite (ACL):
Harry Shomer, Yao Ma, Juanhui Li, Bo Wu, Charu Aggarwal, and Jiliang Tang. 2023. Distance-Based Propagation for Efficient Knowledge Graph Reasoning. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 14692–14707, Singapore. Association for Computational Linguistics.
Cite (Informal):
Distance-Based Propagation for Efficient Knowledge Graph Reasoning (Shomer et al., EMNLP 2023)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2023.emnlp-main.908.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/2023.emnlp-main.908.mp4