SpeedE: Euclidean Geometric Knowledge Graph Embedding Strikes Back

Aleksandar Pavlović, Emanuel Sallinger


Abstract
Geometric knowledge graph embedding models (gKGEs) have shown great potential for knowledge graph completion (KGC), i.e., automatically predicting missing triples. However, contemporary gKGEs require high embedding dimensionalities or complex embedding spaces to achieve good KGC performance, drastically limiting their space and time efficiency. Facing these challenges, we propose SpeedE, a lightweight Euclidean gKGE that (1) provides strong inference capabilities, (2) is competitive with state-of-the-art gKGEs, even significantly outperforming them on YAGO3-10 and WN18RR, and (3) dramatically improves their efficiency, in particular, needing only a fifth of the training time and a fourth of the parameters of the state-of-the-art ExpressivE model on WN18RR to reach the same KGC performance.
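To make the KGC setting concrete, the sketch below shows a generic distance-based Euclidean embedding scorer of the TransE style; it is only an illustration of what "scoring candidate triples in Euclidean space" means, not SpeedE's actual parametrization or scoring function, and all names, entities, and dimensions are invented for the example.

```python
import numpy as np

# Hypothetical toy vocabulary and low-dimensional Euclidean embeddings.
# SpeedE's real model is different; this is a generic TransE-style scorer
# used only to illustrate distance-based knowledge graph completion.
rng = np.random.default_rng(0)
entities = ["berlin", "germany", "paris", "france"]
relations = ["capital_of"]
dim = 4

ent_emb = {e: rng.normal(size=dim) for e in entities}
rel_emb = {r: rng.normal(size=dim) for r in relations}

def score(head: str, rel: str, tail: str) -> float:
    """Plausibility of a triple: negative Euclidean distance ||h + r - t||."""
    h, r, t = ent_emb[head], rel_emb[rel], ent_emb[tail]
    return -float(np.linalg.norm(h + r - t))

# KGC as tail prediction: rank all candidate tails for (berlin, capital_of, ?).
candidates = sorted(entities,
                    key=lambda t: score("berlin", "capital_of", t),
                    reverse=True)
print(candidates)
```

In a trained model the embeddings would be optimized so that true triples score higher than corrupted ones; here the random vectors merely show the ranking mechanics.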
Anthology ID: 2024.findings-naacl.6
Volume: Findings of the Association for Computational Linguistics: NAACL 2024
Month: June
Year: 2024
Address: Mexico City, Mexico
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 69–92
URL: https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-naacl.6/
DOI: 10.18653/v1/2024.findings-naacl.6
Cite (ACL): Aleksandar Pavlović and Emanuel Sallinger. 2024. SpeedE: Euclidean Geometric Knowledge Graph Embedding Strikes Back. In Findings of the Association for Computational Linguistics: NAACL 2024, pages 69–92, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal): SpeedE: Euclidean Geometric Knowledge Graph Embedding Strikes Back (Pavlović & Sallinger, Findings 2024)
PDF: https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-naacl.6.pdf