Enhancing Hyperbolic Knowledge Graph Embeddings via Lorentz Transformations

Xiran Fan, Minghua Xu, Huiyuan Chen, Yuzhong Chen, Mahashweta Das, Hao Yang


Abstract
Knowledge Graph Embedding (KGE) is a powerful technique for predicting missing links in Knowledge Graphs (KGs) by learning representations of entities and relations. Hyperbolic space has emerged as a promising embedding space for KGs due to its ability to represent hierarchical data. Nevertheless, most existing hyperbolic KGE methods rely on the tangent-space approximation and are not fully hyperbolic, resulting in distortions and inaccuracies. To overcome this limitation, we propose LorentzKG, a fully hyperbolic KGE method that represents entities as points in the Lorentz model and relations as that model's intrinsic transformations, the Lorentz transformations, acting between entities. We demonstrate that the Lorentz transformation, which can be decomposed into a Lorentz rotation/reflection and a Lorentz boost, captures various types of relations, including hierarchical structures. Experimental results show that LorentzKG achieves state-of-the-art performance.
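Since the abstract hinges on the decomposition of a Lorentz transformation into a rotation/reflection and a boost, the following minimal numerical sketch (not code from the paper; the function names and the velocity/angle parameters are illustrative) shows the two factors in matrix form and verifies that their composition preserves the Lorentzian inner product, so it maps points of the hyperboloid (Lorentz model) back onto the hyperboloid:

    import numpy as np

    def minkowski_inner(x, y):
        # Lorentzian inner product <x, y>_L = -x0*y0 + sum_i xi*yi
        return -x[0] * y[0] + np.dot(x[1:], y[1:])

    def lorentz_boost(v):
        # Standard Lorentz boost for a velocity vector v in R^n with |v| < 1
        v = np.asarray(v, dtype=float)
        n = len(v)
        s = v @ v                         # squared speed, must be in (0, 1)
        gamma = 1.0 / np.sqrt(1.0 - s)
        B = np.eye(n + 1)
        B[0, 0] = gamma
        B[0, 1:] = gamma * v
        B[1:, 0] = gamma * v
        B[1:, 1:] = np.eye(n) + (gamma - 1.0) * np.outer(v, v) / s
        return B

    def lorentz_rotation(R):
        # Block-diagonal Lorentz rotation: fixes the time axis, rotates the spatial part
        n = R.shape[0]
        M = np.eye(n + 1)
        M[1:, 1:] = R
        return M

    # A point on the hyperboloid L^2 = {x : <x, x>_L = -1, x0 > 0}
    t, u = 0.7, np.array([0.6, 0.8])      # radius t, unit spatial direction u
    x = np.concatenate(([np.cosh(t)], np.sinh(t) * u))

    theta = 0.3
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    Lam = lorentz_boost([0.2, 0.1]) @ lorentz_rotation(R)  # boost composed with rotation

    y = Lam @ x
    print(minkowski_inner(x, x))          # approx. -1
    print(minkowski_inner(y, y))          # approx. -1: y stays on the hyperboloid

Both prints return approximately -1, confirming that the transformed point remains on the manifold. In the paper's framing, a learned transformation of this factored form plays the role of a relation, with the rotation/reflection and boost components capturing different relation patterns.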
Anthology ID:
2024.findings-acl.272
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4575–4589
URL:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-acl.272/
DOI:
10.18653/v1/2024.findings-acl.272
Cite (ACL):
Xiran Fan, Minghua Xu, Huiyuan Chen, Yuzhong Chen, Mahashweta Das, and Hao Yang. 2024. Enhancing Hyperbolic Knowledge Graph Embeddings via Lorentz Transformations. In Findings of the Association for Computational Linguistics: ACL 2024, pages 4575–4589, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Enhancing Hyperbolic Knowledge Graph Embeddings via Lorentz Transformations (Fan et al., Findings 2024)
PDF:
https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-acl.272.pdf