Abstract
The primary aim of Knowledge Graph Embeddings (KGE) is to learn low-dimensional representations of entities and relations for predicting missing facts. While rotation-based methods such as RotatE and QuatE perform well in KGE, they face two challenges: limited model flexibility, since relation size must grow in proportion to entity dimension, and difficulty in generalizing the model to higher-dimensional rotations. To address these issues, we introduce OrthogonalE, a novel KGE model that employs matrices for entities and block-diagonal orthogonal matrices, learned with Riemannian optimization, for relations. This approach not only enhances the generality and flexibility of KGE models but also captures several relation patterns that rotation-based methods can identify. Experimental results indicate that OrthogonalE offers generality and flexibility, captures several relation patterns, and significantly outperforms state-of-the-art KGE models while substantially reducing the number of relation parameters.
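To make the abstract's construction concrete, here is a minimal PyTorch sketch, not the authors' code: entities are stored as d × s matrices and each relation as k small b × b blocks arranged block-diagonally. The paper optimizes the relation blocks directly on the orthogonal manifold with Riemannian optimization; this sketch substitutes a Cayley-transform parameterization so it runs with a standard optimizer. The distance-based score and all hyperparameter values (d, s, b) are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


def cayley_orthogonal(A_raw: torch.Tensor) -> torch.Tensor:
    """Map unconstrained (..., b, b) tensors to orthogonal matrices.

    Cayley transform: for skew-symmetric A, (I + A)^{-1}(I - A) is orthogonal.
    """
    A = A_raw - A_raw.transpose(-1, -2)                     # skew-symmetrize
    I = torch.eye(A.size(-1), device=A.device, dtype=A.dtype)
    return torch.linalg.solve(I + A, I - A)


class OrthogonalESketch(nn.Module):
    """Toy OrthogonalE-style scorer: matrix entities, block-diagonal
    orthogonal relations. Dimensions and score are assumptions."""

    def __init__(self, n_ent: int, n_rel: int, d: int = 32, s: int = 2, b: int = 2):
        super().__init__()
        assert d % b == 0
        self.k, self.b = d // b, b
        # Entities are d x s matrices rather than d-dimensional vectors.
        self.ent = nn.Parameter(0.1 * torch.randn(n_ent, d, s))
        # Each relation holds k unconstrained b x b blocks; the block size b
        # decouples the relation parameter count from the entity dimension d.
        self.rel = nn.Parameter(0.1 * torch.randn(n_rel, self.k, b, b))

    def score(self, h_idx, r_idx, t_idx):
        B = h_idx.size(0)
        h = self.ent[h_idx].view(B, self.k, self.b, -1)     # split rows into blocks
        t = self.ent[t_idx]
        Q = cayley_orthogonal(self.rel[r_idx])              # (B, k, b, b) orthogonal blocks
        rh = (Q @ h).reshape(B, -1, t.size(-1))             # block-diagonal rotation of h
        return -torch.linalg.norm(rh - t, dim=(-2, -1))     # higher = more plausible


# Usage: score one (head, relation, tail) triple.
model = OrthogonalESketch(n_ent=1000, n_rel=50)
h, r, t = torch.tensor([3]), torch.tensor([1]), torch.tensor([7])
print(model.score(h, r, t))
```

Because each b × b block acts only on its own slice of the entity matrix, the block-diagonal product above is computed batchwise with a reshape rather than by materializing the full d × d relation matrix.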
- Anthology ID:
- 2024.findings-emnlp.987
- Volume:
- Findings of the Association for Computational Linguistics: EMNLP 2024
- Month:
- November
- Year:
- 2024
- Address:
- Miami, Florida, USA
- Editors:
- Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
- Venue:
- Findings
- Publisher:
- Association for Computational Linguistics
- Pages:
- 16956–16972
- URL:
- https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-emnlp.987/
- DOI:
- 10.18653/v1/2024.findings-emnlp.987
- Cite (ACL):
- Yihua Zhu and Hidetoshi Shimodaira. 2024. Block-Diagonal Orthogonal Relation and Matrix Entity for Knowledge Graph Embedding. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 16956–16972, Miami, Florida, USA. Association for Computational Linguistics.
- Cite (Informal):
- Block-Diagonal Orthogonal Relation and Matrix Entity for Knowledge Graph Embedding (Zhu & Shimodaira, Findings 2024)
- PDF:
- https://preview.aclanthology.org/build-pipeline-with-new-library/2024.findings-emnlp.987.pdf