Knowledge Router: Learning Disentangled Representations for Knowledge Graphs

Shuai Zhang, Xi Rao, Yi Tay, Ce Zhang


Abstract
The design of expressive representations of entities and relations in a knowledge graph is an important endeavor. While many existing approaches have primarily focused on learning from relational patterns and structural information, the intrinsic complexity of KG entities has been largely overlooked. More concretely, we hypothesize that KG entities may be more complex than they appear, i.e., an entity may wear many hats, and relational triplets may form for more than a single reason. To this end, this paper proposes to learn disentangled representations of KG entities - a new method that disentangles the inner latent properties of KG entities. The disentangling process operates at the graph level, and a neighborhood mechanism is leveraged to disentangle the hidden properties of each entity. This disentangled representation learning approach is model-agnostic and compatible with canonical KG embedding approaches. We conduct extensive experiments on several benchmark datasets, equipping a variety of models (DistMult, SimplE, and QuatE) with our proposed disentangling mechanism. Experimental results demonstrate that our approach substantially improves performance on key metrics.
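To make the general idea concrete, here is a minimal sketch of what a routing-style disentangling mechanism over an entity's neighborhood can look like, combined with DistMult scoring. This is an illustrative assumption, not the paper's exact equations: the component count `K`, the soft-assignment routing loop, and the toy neighborhood dictionary are all hypothetical choices made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
K, d = 4, 8                    # K latent components per entity, each of dim d
n_entities, n_relations = 10, 3

# Entity embeddings are stored as K component vectors; relations stay flat.
E = rng.normal(size=(n_entities, K, d))
R = rng.normal(size=(n_relations, K * d))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def disentangle(e_id, neighbor_ids, iters=3):
    """Routing-style neighborhood aggregation (a generic sketch): each
    neighbor is softly assigned to the component it is most similar to,
    and each component is then updated from its assigned neighbors."""
    z = E[e_id].copy()                    # (K, d) component vectors
    h = E[neighbor_ids].mean(axis=1)      # (N, d) flattened neighbor summaries
    for _ in range(iters):
        p = softmax(h @ z.T, axis=1)      # (N, K) soft assignments
        z = E[e_id] + p.T @ h             # aggregate neighbors per component
        z = z / np.linalg.norm(z, axis=1, keepdims=True)
    return z.reshape(-1)                  # (K*d,) disentangled embedding

def distmult_score(h_id, r_id, t_id, neighbors):
    """DistMult scoring on top of the disentangled embeddings,
    illustrating how the mechanism plugs into a canonical scorer."""
    h = disentangle(h_id, neighbors[h_id])
    t = disentangle(t_id, neighbors[t_id])
    return float(np.sum(h * R[r_id] * t))

# Toy neighborhoods standing in for the KG's adjacency structure.
neighbors = {i: rng.choice(n_entities, size=3, replace=False)
             for i in range(n_entities)}
print(distmult_score(0, 1, 2, neighbors))
```

Because the disentangling step only changes how entity vectors are produced, the same sketch would compose with other bilinear scorers such as SimplE, in line with the model-agnostic claim above.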
Anthology ID:
2021.naacl-main.1
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1–10
URL:
https://aclanthology.org/2021.naacl-main.1
DOI:
10.18653/v1/2021.naacl-main.1
Cite (ACL):
Shuai Zhang, Xi Rao, Yi Tay, and Ce Zhang. 2021. Knowledge Router: Learning Disentangled Representations for Knowledge Graphs. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1–10, Online. Association for Computational Linguistics.
Cite (Informal):
Knowledge Router: Learning Disentangled Representations for Knowledge Graphs (Zhang et al., NAACL 2021)
PDF:
https://preview.aclanthology.org/add_acl24_videos/2021.naacl-main.1.pdf
Video:
https://preview.aclanthology.org/add_acl24_videos/2021.naacl-main.1.mp4