Contrastive Learning with Generated Representations for Inductive Knowledge Graph Embedding
Qian Li, Shafiq Joty, Daling Wang, Shi Feng, Yifei Zhang, Chengwei Qin
Abstract
With the evolution of Knowledge Graphs (KGs), new entities emerge that have not been seen before. Representation learning for KGs in such an inductive setting aims to capture and transfer structural patterns from existing entities to new ones. However, the performance of existing methods on inductive KGs is limited by sparsity and implicit transfer. In this paper, we propose VMCL, a Contrastive Learning (CL) framework with a graph-guided Variational autoencoder on Meta-KGs in the inductive setting. We first propose representation generation to capture the encoded and generated representations of entities, where the generated variations densify the representations with complementary features. Then, we design two CL objectives that work across entities and meta-KGs to simulate the transfer mode. Extensive experiments demonstrate that VMCL significantly outperforms previous state-of-the-art baselines.
- Anthology ID:
- 2023.findings-acl.900
- Volume:
- Findings of the Association for Computational Linguistics: ACL 2023
- Month:
- July
- Year:
- 2023
- Address:
- Toronto, Canada
- Editors:
- Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 14273–14287
- URL:
- https://aclanthology.org/2023.findings-acl.900
- DOI:
- 10.18653/v1/2023.findings-acl.900
- Cite (ACL):
- Qian Li, Shafiq Joty, Daling Wang, Shi Feng, Yifei Zhang, and Chengwei Qin. 2023. Contrastive Learning with Generated Representations for Inductive Knowledge Graph Embedding. In Findings of the Association for Computational Linguistics: ACL 2023, pages 14273–14287, Toronto, Canada. Association for Computational Linguistics.
- Cite (Informal):
- Contrastive Learning with Generated Representations for Inductive Knowledge Graph Embedding (Li et al., Findings 2023)
- PDF:
- https://preview.aclanthology.org/ingest-acl-2023-videos/2023.findings-acl.900.pdf
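To make the abstract's description a bit more concrete, below is a minimal, hypothetical sketch of the entity-level idea it mentions: a VAE produces a "generated" view of each entity embedding, and an InfoNCE-style contrastive objective pulls it toward the encoded view while pushing away other entities in the batch. All names (EntityVAE, entity_contrastive_loss) and the specific loss form are illustrative assumptions, not the authors' released implementation; the meta-KG-level objective described in the paper would be analogous at the graph level.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EntityVAE(nn.Module):
    """Hypothetical graph-guided VAE head: maps an encoded entity
    representation to a 'generated' representation via a latent sample."""
    def __init__(self, dim: int, latent_dim: int):
        super().__init__()
        self.to_mu = nn.Linear(dim, latent_dim)
        self.to_logvar = nn.Linear(dim, latent_dim)
        self.decoder = nn.Linear(latent_dim, dim)

    def forward(self, encoded: torch.Tensor) -> torch.Tensor:
        mu, logvar = self.to_mu(encoded), self.to_logvar(encoded)
        # Reparameterization trick: sample a latent code, then decode it.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.decoder(z)

def entity_contrastive_loss(encoded: torch.Tensor,
                            generated: torch.Tensor,
                            temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: each entity's generated view is the positive for
    its encoded view; the other entities in the batch act as negatives."""
    a = F.normalize(encoded, dim=-1)
    b = F.normalize(generated, dim=-1)
    logits = a @ b.t() / temperature                  # (batch, batch) similarities
    targets = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, targets)

# Toy usage: 8 entities with 64-dimensional encoded representations.
encoded = torch.randn(8, 64)
generated = EntityVAE(dim=64, latent_dim=32)(encoded)
loss = entity_contrastive_loss(encoded, generated)
```

The generated view acts as a learned augmentation, which is one way to read the paper's claim that generated variations densify sparse entity representations; for the actual architecture and objectives, refer to the PDF linked above.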