Less Is MuRE: Revisiting Shallow Knowledge Graph Embeddings

Victor Charpenay, Steven Schockaert


Abstract
In recent years, the field of knowledge graph completion has focused on increasingly sophisticated models, which perform well on link prediction tasks, but are less scalable than earlier methods and are not suitable for learning entity embeddings. As a result, shallow models such as TransE and ComplEx remain the most popular choice in many settings. However, the strengths and limitations of such models remain poorly understood. In this paper, we present a unifying framework and systematically analyze a number of variants and extensions of existing shallow models, empirically showing that MuRE and its extension, ExpressivE, are highly competitive. Motivated by the strong empirical results of MuRE, we also theoretically analyze the expressivity of its associated scoring function, surprisingly finding that it can capture the same class of rule bases as state-of-the-art region-based embedding models.
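For readers unfamiliar with the model the abstract centers on: MuRE (Balažević et al., 2019) scores a triple (s, r, o) as φ(s, r, o) = −‖R e_s − (e_o + r)‖² + b_s + b_o, where e_s and e_o are entity embeddings, R is a diagonal relation-specific matrix, r is a relation-specific translation, and b_s, b_o are entity biases. The NumPy sketch below is an illustrative rendering of that published formula; the variable names are ours, not taken from this paper's code.

```python
import numpy as np

def mure_score(e_s, e_o, R_diag, r, b_s, b_o):
    """MuRE scoring function (Balazevic et al., 2019):
    phi(s, r, o) = -||R e_s - (e_o + r)||^2 + b_s + b_o,
    with the diagonal relation matrix R stored as a vector R_diag.
    """
    diff = R_diag * e_s - (e_o + r)   # relation-transformed subject vs. translated object
    return -np.dot(diff, diff) + b_s + b_o

# Toy usage with random embeddings of dimension d = 4 (values are arbitrary).
rng = np.random.default_rng(0)
d = 4
e_s, e_o, R_diag, r = (rng.normal(size=d) for _ in range(4))
b_s, b_o = rng.normal(size=2)
print(mure_score(e_s, e_o, R_diag, r, b_s, b_o))
```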
Anthology ID:
2025.emnlp-main.779
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rosé, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15428–15454
URL:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.779/
Cite (ACL):
Victor Charpenay and Steven Schockaert. 2025. Less Is MuRE: Revisiting Shallow Knowledge Graph Embeddings. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 15428–15454, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Less Is MuRE: Revisiting Shallow Knowledge Graph Embeddings (Charpenay & Schockaert, EMNLP 2025)
PDF:
https://preview.aclanthology.org/ingest-emnlp/2025.emnlp-main.779.pdf
Checklist:
2025.emnlp-main.779.checklist.pdf