Joint Pre-Encoding Representation and Structure Embedding for Efficient and Low-Resource Knowledge Graph Completion

Chenyu Qiu, Pengjiang Qian, Chuang Wang, Jian Yao, Li Liu, Wei Fang, Eddie-Yin-Kwee Ng


Abstract
Knowledge graph completion (KGC) aims to infer missing or incomplete parts of a knowledge graph. Existing models generally fall into structure-based and description-based models; description-based models often require longer training and inference times as well as increased memory usage. In this paper, we propose the Pre-Encoded Masked Language Model (PEMLM) to solve the KGC problem efficiently. By encoding textual descriptions into semantic representations before training, PEMLM significantly reduces the required resources. Furthermore, we introduce a straightforward but effective fusion framework that integrates structural embeddings with the pre-encoded semantic descriptions, which enhances the model’s prediction performance on 1-N relations. Experimental results demonstrate that our proposed strategy attains state-of-the-art performance on the WN18RR (MRR +5.4% and Hits@1 +6.4%) and UMLS datasets. Compared to existing models, our approach increases inference speed by 30x and reduces training memory by approximately 60%.
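
The two-stage idea described in the abstract can be illustrated with a minimal sketch, assuming a BERT-style text encoder and a simple concatenation-plus-projection fusion. All names, dimensions, and architectural choices below are illustrative assumptions, not the authors' exact implementation: descriptions are encoded once before training and cached, so no text-encoder forward pass runs inside the training loop, and a trainable structural embedding is fused with the frozen semantic vector.

```python
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

@torch.no_grad()
def pre_encode_descriptions(descriptions, model_name="bert-base-uncased"):
    # Encode every entity description once, before KGC training starts,
    # so the expensive text encoder never runs per training step.
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    encoder = AutoModel.from_pretrained(model_name).eval()
    batch = tokenizer(descriptions, padding=True, truncation=True,
                      return_tensors="pt")
    # Use the [CLS] vector as a fixed-size semantic representation.
    return encoder(**batch).last_hidden_state[:, 0]

class FusedEntityEncoder(nn.Module):
    # Fuses a frozen, pre-encoded semantic vector with a trainable
    # structural embedding (concatenation + linear projection here;
    # the fusion operator is an assumption for illustration).
    def __init__(self, semantic_emb: torch.Tensor, struct_dim: int = 200):
        super().__init__()
        n_entities, sem_dim = semantic_emb.shape
        self.register_buffer("semantic", semantic_emb)           # frozen
        self.structural = nn.Embedding(n_entities, struct_dim)   # trained
        self.fuse = nn.Linear(sem_dim + struct_dim, sem_dim)

    def forward(self, entity_ids: torch.Tensor) -> torch.Tensor:
        sem = self.semantic[entity_ids]
        struct = self.structural(entity_ids)
        return self.fuse(torch.cat([sem, struct], dim=-1))

# Usage: pre-encode once, then train only the small fusion model.
semantic = pre_encode_descriptions(["a large wild cat", "a small rodent"])
model = FusedEntityEncoder(semantic)
reprs = model(torch.tensor([0, 1]))  # fused entity representations
```

Because the text encoder is dropped from the training loop, each step touches only the embedding table and one linear layer, which is consistent with the reported reductions in training memory and inference time.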
Anthology ID:
2024.emnlp-main.851
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15257–15269
URL:
https://preview.aclanthology.org/dashboard/2024.emnlp-main.851/
DOI:
10.18653/v1/2024.emnlp-main.851
Cite (ACL):
Chenyu Qiu, Pengjiang Qian, Chuang Wang, Jian Yao, Li Liu, Wei Fang, and Eddie-Yin-Kwee Ng. 2024. Joint Pre-Encoding Representation and Structure Embedding for Efficient and Low-Resource Knowledge Graph Completion. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 15257–15269, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Joint Pre-Encoding Representation and Structure Embedding for Efficient and Low-Resource Knowledge Graph Completion (Qiu et al., EMNLP 2024)
PDF:
https://preview.aclanthology.org/dashboard/2024.emnlp-main.851.pdf
Software:
2024.emnlp-main.851.software.zip